Dec 04 21:57:49.600236 master-0 systemd[1]: Starting Kubernetes Kubelet...
Dec 04 21:57:49.826275 master-0 kubenswrapper[4842]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 04 21:57:49.826275 master-0 kubenswrapper[4842]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 04 21:57:49.826275 master-0 kubenswrapper[4842]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 04 21:57:49.826275 master-0 kubenswrapper[4842]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 04 21:57:49.826275 master-0 kubenswrapper[4842]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 04 21:57:49.826275 master-0 kubenswrapper[4842]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 04 21:57:49.827978 master-0 kubenswrapper[4842]: I1204 21:57:49.826515 4842 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 04 21:57:49.831072 master-0 kubenswrapper[4842]: W1204 21:57:49.830946 4842 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 21:57:49.831072 master-0 kubenswrapper[4842]: W1204 21:57:49.831076 4842 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 21:57:49.831247 master-0 kubenswrapper[4842]: W1204 21:57:49.831092 4842 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 21:57:49.831247 master-0 kubenswrapper[4842]: W1204 21:57:49.831103 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 21:57:49.831247 master-0 kubenswrapper[4842]: W1204 21:57:49.831114 4842 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 21:57:49.831247 master-0 kubenswrapper[4842]: W1204 21:57:49.831136 4842 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 21:57:49.831247 master-0 kubenswrapper[4842]: W1204 21:57:49.831158 4842 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 21:57:49.831247 master-0 kubenswrapper[4842]: W1204 21:57:49.831172 4842 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831337 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831406 4842 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831417 4842 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831427 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831477 4842 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831486 4842 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831495 4842 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831538 4842 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831550 4842 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831562 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831574 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831586 4842 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 21:57:49.831593 master-0 kubenswrapper[4842]: W1204 21:57:49.831607 4842 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831620 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831631 4842 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831644 4842 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831667 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831678 4842 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831689 4842 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831699 4842 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831713 4842 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831729 4842 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831739 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831748 4842 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831758 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831767 4842 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831775 4842 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831785 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.831797 4842 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.832150 4842 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.832167 4842 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 21:57:49.832235 master-0 kubenswrapper[4842]: W1204 21:57:49.832176 4842 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832185 4842 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832193 4842 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832202 4842 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832210 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832219 4842 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832228 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832236 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832244 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832253 4842 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832261 4842 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832270 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832279 4842 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 21:57:49.833217 master-0 
kubenswrapper[4842]: W1204 21:57:49.832288 4842 feature_gate.go:330] unrecognized feature gate: Example Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832296 4842 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832304 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832312 4842 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832320 4842 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832332 4842 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832340 4842 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 21:57:49.833217 master-0 kubenswrapper[4842]: W1204 21:57:49.832348 4842 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832356 4842 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832365 4842 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832373 4842 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832383 4842 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832392 4842 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832400 4842 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832410 4842 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832420 4842 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832427 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832439 4842 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832450 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: W1204 21:57:49.832459 4842 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832726 4842 flags.go:64] FLAG: --address="0.0.0.0" Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832750 4842 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832769 4842 flags.go:64] FLAG: --anonymous-auth="true" Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832782 4842 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832796 4842 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832807 4842 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832820 4842 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832832 4842 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 04 21:57:49.834305 master-0 kubenswrapper[4842]: I1204 21:57:49.832841 4842 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832851 4842 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832862 4842 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832872 4842 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832881 4842 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832891 4842 flags.go:64] FLAG: --cgroup-root="" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832900 4842 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832910 4842 flags.go:64] FLAG: --client-ca-file="" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832918 4842 flags.go:64] FLAG: --cloud-config="" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832927 4842 flags.go:64] FLAG: --cloud-provider="" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832936 4842 flags.go:64] FLAG: --cluster-dns="[]" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832950 4842 flags.go:64] FLAG: --cluster-domain="" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832959 4842 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832969 4842 flags.go:64] FLAG: --config-dir="" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832977 4842 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.832987 4842 flags.go:64] FLAG: --container-log-max-files="5" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 
21:57:49.833000 4842 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833010 4842 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833020 4842 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833029 4842 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833038 4842 flags.go:64] FLAG: --contention-profiling="false" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833048 4842 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833058 4842 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833068 4842 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833077 4842 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 04 21:57:49.835376 master-0 kubenswrapper[4842]: I1204 21:57:49.833091 4842 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833101 4842 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833110 4842 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833120 4842 flags.go:64] FLAG: --enable-load-reader="false" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833129 4842 flags.go:64] FLAG: --enable-server="true" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833138 4842 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833151 4842 flags.go:64] FLAG: --event-burst="100" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833161 4842 flags.go:64] FLAG: --event-qps="50" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833171 4842 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833180 4842 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833189 4842 flags.go:64] FLAG: --eviction-hard="" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833201 4842 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833210 4842 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833219 4842 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833231 4842 flags.go:64] FLAG: --eviction-soft="" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833241 4842 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833250 4842 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833259 4842 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 04 21:57:49.836636 master-0 
kubenswrapper[4842]: I1204 21:57:49.833269 4842 flags.go:64] FLAG: --experimental-mounter-path="" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833279 4842 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833288 4842 flags.go:64] FLAG: --fail-swap-on="true" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833297 4842 flags.go:64] FLAG: --feature-gates="" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833308 4842 flags.go:64] FLAG: --file-check-frequency="20s" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833318 4842 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833328 4842 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 04 21:57:49.836636 master-0 kubenswrapper[4842]: I1204 21:57:49.833338 4842 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833347 4842 flags.go:64] FLAG: --healthz-port="10248" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833357 4842 flags.go:64] FLAG: --help="false" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833366 4842 flags.go:64] FLAG: --hostname-override="" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833375 4842 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833384 4842 flags.go:64] FLAG: --http-check-frequency="20s" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833393 4842 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833402 4842 flags.go:64] FLAG: --image-credential-provider-config="" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833411 4842 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833420 4842 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833430 4842 flags.go:64] FLAG: --image-service-endpoint="" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833439 4842 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833448 4842 flags.go:64] FLAG: --kube-api-burst="100" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833457 4842 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833467 4842 flags.go:64] FLAG: --kube-api-qps="50" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833476 4842 flags.go:64] FLAG: --kube-reserved="" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833485 4842 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833495 4842 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833530 4842 flags.go:64] FLAG: --kubelet-cgroups="" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833540 4842 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833550 4842 flags.go:64] FLAG: 
--lock-file="" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833560 4842 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833569 4842 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833579 4842 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833597 4842 flags.go:64] FLAG: --log-json-split-stream="false" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833607 4842 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 04 21:57:49.837968 master-0 kubenswrapper[4842]: I1204 21:57:49.833616 4842 flags.go:64] FLAG: --log-text-split-stream="false" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833625 4842 flags.go:64] FLAG: --logging-format="text" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833673 4842 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833684 4842 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833693 4842 flags.go:64] FLAG: --manifest-url="" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833703 4842 flags.go:64] FLAG: --manifest-url-header="" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833716 4842 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833726 4842 flags.go:64] FLAG: --max-open-files="1000000" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833738 4842 flags.go:64] FLAG: --max-pods="110" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833747 4842 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833757 4842 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833767 4842 flags.go:64] FLAG: --memory-manager-policy="None" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833776 4842 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833791 4842 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833800 4842 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833810 4842 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833837 4842 flags.go:64] FLAG: --node-status-max-images="50" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833846 4842 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833856 4842 flags.go:64] FLAG: --oom-score-adj="-999" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833866 4842 flags.go:64] FLAG: --pod-cidr="" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833875 4842 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833889 4842 flags.go:64] FLAG: --pod-manifest-path="" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833898 4842 flags.go:64] FLAG: --pod-max-pids="-1" Dec 04 21:57:49.839394 master-0 kubenswrapper[4842]: I1204 21:57:49.833907 4842 flags.go:64] FLAG: --pods-per-core="0" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.833917 4842 flags.go:64] FLAG: --port="10250" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.833927 4842 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.833936 4842 flags.go:64] FLAG: --provider-id="" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.833945 4842 flags.go:64] FLAG: --qos-reserved="" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.833955 4842 flags.go:64] FLAG: --read-only-port="10255" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.833964 4842 flags.go:64] FLAG: --register-node="true" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.833974 4842 flags.go:64] FLAG: --register-schedulable="true" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.833983 4842 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834000 4842 flags.go:64] FLAG: --registry-burst="10" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834010 4842 flags.go:64] FLAG: --registry-qps="5" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834020 4842 flags.go:64] FLAG: --reserved-cpus="" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834030 4842 flags.go:64] FLAG: --reserved-memory="" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834042 4842 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834051 4842 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834060 4842 flags.go:64] FLAG: --rotate-certificates="false" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834070 4842 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834079 4842 flags.go:64] FLAG: --runonce="false" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834088 4842 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834098 4842 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834107 4842 flags.go:64] FLAG: --seccomp-default="false" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834116 4842 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834125 4842 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834138 4842 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834147 4842 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834156 4842 flags.go:64] FLAG: --storage-driver-password="root" Dec 04 21:57:49.840631 master-0 kubenswrapper[4842]: I1204 21:57:49.834165 4842 flags.go:64] FLAG: --storage-driver-secure="false" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834174 4842 flags.go:64] FLAG: --storage-driver-table="stats" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834184 4842 flags.go:64] FLAG: --storage-driver-user="root" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834193 4842 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834203 4842 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834212 4842 flags.go:64] FLAG: --system-cgroups="" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834221 4842 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834238 4842 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834247 4842 flags.go:64] FLAG: --tls-cert-file="" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834255 4842 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834271 4842 flags.go:64] FLAG: --tls-min-version="" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834280 4842 flags.go:64] FLAG: --tls-private-key-file="" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834289 4842 flags.go:64] FLAG: --topology-manager-policy="none" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834298 4842 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834308 4842 flags.go:64] FLAG: --topology-manager-scope="container" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834317 4842 flags.go:64] FLAG: --v="2" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834330 4842 flags.go:64] FLAG: --version="false" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834343 4842 flags.go:64] FLAG: --vmodule="" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834355 4842 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: I1204 21:57:49.834364 4842 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: W1204 21:57:49.834633 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: W1204 21:57:49.834646 4842 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: W1204 21:57:49.834655 4842 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 21:57:49.842091 master-0 kubenswrapper[4842]: W1204 21:57:49.834664 4842 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834674 4842 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834683 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834692 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834701 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834710 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834720 4842 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834729 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834738 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834746 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834755 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834763 4842 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834771 4842 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834779 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834787 4842 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834795 4842 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834803 4842 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834810 4842 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834820 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 21:57:49.843280 master-0 kubenswrapper[4842]: W1204 21:57:49.834827 4842 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834835 4842 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834843 4842 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834851 4842 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834858 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834866 4842 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 
21:57:49.834873 4842 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834881 4842 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834889 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834897 4842 feature_gate.go:330] unrecognized feature gate: Example Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834905 4842 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834913 4842 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834921 4842 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834931 4842 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834941 4842 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834986 4842 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.834996 4842 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.835007 4842 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.835017 4842 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.835025 4842 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.835033 4842 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 21:57:49.844747 master-0 kubenswrapper[4842]: W1204 21:57:49.835040 4842 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835048 4842 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835056 4842 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835063 4842 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835072 4842 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835079 4842 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835087 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835096 4842 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835106 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835115 4842 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835123 4842 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835133 4842 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835144 4842 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835152 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835161 4842 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835168 4842 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835176 4842 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835184 4842 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835192 4842 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 21:57:49.845773 master-0 kubenswrapper[4842]: W1204 21:57:49.835199 4842 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835207 4842 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835214 4842 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835224 4842 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835232 4842 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835239 4842 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835247 4842 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835255 4842 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835263 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: W1204 21:57:49.835271 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 21:57:49.846859 master-0 kubenswrapper[4842]: I1204 21:57:49.835296 4842 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 21:57:49.847443 master-0 kubenswrapper[4842]: I1204 21:57:49.847183 4842 server.go:491] "Kubelet version" kubeletVersion="v1.31.13"
Dec 04 21:57:49.847443 master-0 kubenswrapper[4842]: I1204 21:57:49.847248 4842 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 04 21:57:49.847443 master-0 kubenswrapper[4842]: W1204 21:57:49.847416 4842 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 21:57:49.847443 master-0 kubenswrapper[4842]: W1204 21:57:49.847428 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 21:57:49.847443 master-0 kubenswrapper[4842]: W1204 21:57:49.847435 4842 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 21:57:49.847443 master-0 kubenswrapper[4842]: W1204 21:57:49.847444 4842 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 21:57:49.847443 master-0 kubenswrapper[4842]: W1204 21:57:49.847453 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 21:57:49.847443 master-0 kubenswrapper[4842]: W1204 21:57:49.847463 4842 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847472 4842 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847480 4842 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847488 4842 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847522 4842 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847530 4842 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847537 4842 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847543 4842 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847551 4842 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847557 4842 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847564 4842 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847570 4842 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847576 4842 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847583 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847589 4842 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847595 4842 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847602 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847608 4842 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847614 4842 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847619 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 21:57:49.848020 master-0 kubenswrapper[4842]: W1204 21:57:49.847624 4842 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847629 4842 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847634 4842 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847640 4842 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847645 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847651 4842 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847656 4842 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847662 4842 feature_gate.go:330] unrecognized feature gate: DNSNameResolver 
Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847667 4842 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847675 4842 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847680 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847687 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847696 4842 feature_gate.go:330] unrecognized feature gate: Example Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847709 4842 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847719 4842 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847726 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847733 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847739 4842 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847745 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847751 4842 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 21:57:49.849035 master-0 kubenswrapper[4842]: W1204 21:57:49.847759 4842 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847766 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847772 4842 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847778 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847783 4842 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847789 4842 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847796 4842 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847803 4842 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847809 4842 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847815 4842 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847822 4842 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847829 4842 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847835 4842 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847844 4842 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847850 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847856 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847861 4842 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847867 4842 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847872 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 21:57:49.850458 master-0 kubenswrapper[4842]: W1204 21:57:49.847878 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.847883 4842 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.847889 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.847894 4842 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.847900 4842 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.847905 4842 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.847912 4842 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.847918 4842 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: I1204 21:57:49.847928 4842 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.848125 4842 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.848134 4842 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 21:57:49.851944 master-0 
kubenswrapper[4842]: W1204 21:57:49.848140 4842 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.848146 4842 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.848153 4842 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.848159 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 21:57:49.851944 master-0 kubenswrapper[4842]: W1204 21:57:49.848165 4842 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848170 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848176 4842 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848181 4842 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848186 4842 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848192 4842 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848197 4842 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848202 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848208 4842 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848214 4842 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848220 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848225 4842 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848231 4842 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848236 4842 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848241 4842 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848247 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848253 4842 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848260 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848275 4842 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848283 4842 feature_gate.go:330] unrecognized feature 
gate: InsightsConfigAPI Dec 04 21:57:49.853076 master-0 kubenswrapper[4842]: W1204 21:57:49.848292 4842 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848298 4842 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848304 4842 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848311 4842 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848318 4842 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848327 4842 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848333 4842 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848341 4842 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848347 4842 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848352 4842 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848358 4842 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848363 4842 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848369 4842 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848374 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848380 4842 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848388 4842 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848394 4842 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848401 4842 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848408 4842 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 21:57:49.854485 master-0 kubenswrapper[4842]: W1204 21:57:49.848414 4842 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848420 4842 feature_gate.go:330] unrecognized feature gate: Example Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848427 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848432 4842 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848438 4842 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848444 4842 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848450 4842 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848456 4842 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848461 4842 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848467 4842 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848473 4842 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848478 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848484 4842 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848490 4842 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
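Editor's note on the feature_gate.go warnings above: the kubelet is applying an OpenShift-supplied gate map, and every gate it does not register (GCPLabelsTags, NewOLM, RouteAdvertisements, and the rest of the W-level lines) is warned about and left out of the effective set, which is why the feature gates: {map[...]} summary entries contain only upstream Kubernetes gates. The sketch below is a minimal illustration of that apply-and-warn loop under those assumptions; it is a simplified stand-in, not the kubelet's code or the real k8s.io/component-base/featuregate API, and the gate names and defaults in knownGates are illustrative only.

// Sketch only: simplified stand-in for the gate-application behaviour visible in the log.
package main

import "fmt"

// knownGates: illustrative subset of upstream gates the kubelet recognises here.
// OpenShift-only gates are absent, which is what triggers the warnings above.
var knownGates = map[string]bool{ // name -> assumed default
	"CloudDualStackNodeIPs":                  true,
	"DisableKubeletCloudCredentialProviders": true,
	"KMSv1":                                  false,
	"ValidatingAdmissionPolicy":              true,
}

// applyGates copies the defaults, applies recognised overrides, and warns on
// (and drops) anything it does not know, mirroring the W-level lines above.
func applyGates(overrides map[string]bool) map[string]bool {
	effective := make(map[string]bool, len(knownGates))
	for name, def := range knownGates {
		effective[name] = def
	}
	for name, val := range overrides {
		if _, ok := knownGates[name]; !ok {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		effective[name] = val
	}
	return effective
}

func main() {
	// One recognised override (KMSv1) and one unknown OpenShift gate.
	fmt.Println(applyGates(map[string]bool{"KMSv1": true, "GCPLabelsTags": true}))
}

The "Setting GA feature gate ...=true" and "Setting deprecated feature gate KMSv1=true" notices appear to be the same loop accepting overrides for gates that are already graduated or scheduled for removal, hence the "will be removed in a future release" wording.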
Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848497 4842 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848529 4842 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848535 4842 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848541 4842 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848548 4842 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848554 4842 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 21:57:49.855757 master-0 kubenswrapper[4842]: W1204 21:57:49.848561 4842 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: W1204 21:57:49.848566 4842 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: W1204 21:57:49.848573 4842 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: W1204 21:57:49.848578 4842 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: W1204 21:57:49.848585 4842 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: W1204 21:57:49.848592 4842 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: W1204 21:57:49.848599 4842 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: I1204 21:57:49.848610 4842 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: I1204 21:57:49.849150 4842 server.go:940] "Client rotation is on, will bootstrap in background" Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: I1204 21:57:49.851936 4842 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: I1204 21:57:49.852724 4842 server.go:997] "Starting client certificate rotation" Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: I1204 21:57:49.852750 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 04 21:57:49.856961 master-0 kubenswrapper[4842]: I1204 21:57:49.853034 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 21:57:49.860717 master-0 kubenswrapper[4842]: I1204 21:57:49.860660 4842 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and 
Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 21:57:49.863304 master-0 kubenswrapper[4842]: I1204 21:57:49.863235 4842 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 21:57:49.864763 master-0 kubenswrapper[4842]: E1204 21:57:49.864693 4842 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:49.875934 master-0 kubenswrapper[4842]: I1204 21:57:49.875857 4842 log.go:25] "Validated CRI v1 runtime API" Dec 04 21:57:49.879536 master-0 kubenswrapper[4842]: I1204 21:57:49.879450 4842 log.go:25] "Validated CRI v1 image API" Dec 04 21:57:49.881947 master-0 kubenswrapper[4842]: I1204 21:57:49.881898 4842 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 04 21:57:49.890405 master-0 kubenswrapper[4842]: I1204 21:57:49.889687 4842 fs.go:135] Filesystem UUIDs: map[4c52ad11-dbba-45ec-8a7c-4164b2d3de92:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 04 21:57:49.890586 master-0 kubenswrapper[4842]: I1204 21:57:49.890406 4842 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Dec 04 21:57:49.928089 master-0 kubenswrapper[4842]: I1204 21:57:49.927341 4842 manager.go:217] Machine: {Timestamp:2025-12-04 21:57:49.924421633 +0000 UTC m=+0.239233918 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:58e57637271046a9a49cd83dda54d0eb SystemUUID:58e57637-2710-46a9-a49c-d83dda54d0eb BootID:4d17516d-34b9-4c3d-aaa6-c745ecd06d22 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:65:18:02 Speed:-1 Mtu:9000} {Name:eth2 
MacAddress:fa:16:3e:40:5b:43 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:c6:7e:e9:15:bf:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 
Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 04 21:57:49.928089 master-0 kubenswrapper[4842]: I1204 21:57:49.927928 4842 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 04 21:57:49.928551 master-0 kubenswrapper[4842]: I1204 21:57:49.928180 4842 manager.go:233] Version: {KernelVersion:5.14.0-427.100.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511170715-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 04 21:57:49.929306 master-0 kubenswrapper[4842]: I1204 21:57:49.929248 4842 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 04 21:57:49.929701 master-0 kubenswrapper[4842]: I1204 21:57:49.929633 4842 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 04 21:57:49.930123 master-0 kubenswrapper[4842]: I1204 21:57:49.929690 4842 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 04 21:57:49.930233 master-0 kubenswrapper[4842]: I1204 21:57:49.930158 4842 topology_manager.go:138] "Creating topology manager with none policy" Dec 04 21:57:49.930233 master-0 kubenswrapper[4842]: I1204 21:57:49.930178 4842 container_manager_linux.go:303] "Creating device plugin manager" Dec 04 21:57:49.930659 master-0 kubenswrapper[4842]: I1204 21:57:49.930613 4842 manager.go:142] 
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 21:57:49.930735 master-0 kubenswrapper[4842]: I1204 21:57:49.930687 4842 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 21:57:49.931331 master-0 kubenswrapper[4842]: I1204 21:57:49.931285 4842 state_mem.go:36] "Initialized new in-memory state store" Dec 04 21:57:49.931488 master-0 kubenswrapper[4842]: I1204 21:57:49.931450 4842 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 04 21:57:49.933026 master-0 kubenswrapper[4842]: I1204 21:57:49.932967 4842 kubelet.go:418] "Attempting to sync node with API server" Dec 04 21:57:49.933026 master-0 kubenswrapper[4842]: I1204 21:57:49.933014 4842 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 04 21:57:49.933258 master-0 kubenswrapper[4842]: I1204 21:57:49.933065 4842 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 04 21:57:49.933258 master-0 kubenswrapper[4842]: I1204 21:57:49.933098 4842 kubelet.go:324] "Adding apiserver pod source" Dec 04 21:57:49.933258 master-0 kubenswrapper[4842]: I1204 21:57:49.933135 4842 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 04 21:57:49.936707 master-0 kubenswrapper[4842]: I1204 21:57:49.936616 4842 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 04 21:57:49.937588 master-0 kubenswrapper[4842]: W1204 21:57:49.937432 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:49.937703 master-0 kubenswrapper[4842]: E1204 21:57:49.937613 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:49.937703 master-0 kubenswrapper[4842]: W1204 21:57:49.937415 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:49.937703 master-0 kubenswrapper[4842]: E1204 21:57:49.937694 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:49.938430 master-0 kubenswrapper[4842]: I1204 21:57:49.938370 4842 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 04 21:57:49.938875 master-0 kubenswrapper[4842]: I1204 21:57:49.938782 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 04 21:57:49.938875 master-0 kubenswrapper[4842]: I1204 21:57:49.938863 4842 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/empty-dir" Dec 04 21:57:49.938875 master-0 kubenswrapper[4842]: I1204 21:57:49.938882 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.938898 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.938913 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.938928 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.938943 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.938956 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.938973 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.938990 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.939009 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 04 21:57:49.939142 master-0 kubenswrapper[4842]: I1204 21:57:49.939032 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 04 21:57:49.939878 master-0 kubenswrapper[4842]: I1204 21:57:49.939404 4842 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 04 21:57:49.940377 master-0 kubenswrapper[4842]: I1204 21:57:49.940312 4842 server.go:1280] "Started kubelet" Dec 04 21:57:49.940955 master-0 kubenswrapper[4842]: I1204 21:57:49.940846 4842 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 04 21:57:49.941087 master-0 kubenswrapper[4842]: I1204 21:57:49.940847 4842 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 04 21:57:49.941087 master-0 kubenswrapper[4842]: I1204 21:57:49.941037 4842 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 04 21:57:49.942361 master-0 kubenswrapper[4842]: I1204 21:57:49.942250 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:49.943119 master-0 kubenswrapper[4842]: I1204 21:57:49.943032 4842 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 04 21:57:49.943790 master-0 systemd[1]: Started Kubernetes Kubelet. 
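Editor's note on the Node Config entry above (container_manager_linux.go:272): it records SystemReserved cpu=500m, memory=1Gi, ephemeral-storage=1Gi, KubeReserved null, and a memory.available hard eviction threshold of 100Mi, while the Machine entry reports MemoryCapacity 50514153472 bytes. Assuming the standard node-allocatable formula (allocatable = capacity - kube-reserved - system-reserved - hard eviction), the sketch below just reproduces that arithmetic with the values taken from this log; it is an illustration, not kubelet code.

// Sketch only: node-allocatable arithmetic using the numbers logged above.
package main

import "fmt"

const (
	gi = 1 << 30
	mi = 1 << 20
)

func main() {
	capacity := int64(50514153472)  // MemoryCapacity from the Machine entry
	systemReserved := int64(1 * gi) // SystemReserved memory "1Gi"
	kubeReserved := int64(0)        // KubeReserved: null in the Node Config
	hardEviction := int64(100 * mi) // memory.available hard threshold "100Mi"

	allocatable := capacity - systemReserved - kubeReserved - hardEviction
	fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
		allocatable, float64(allocatable)/float64(gi))
}

Under those assumptions this works out to roughly 49.3 GB (about 45.95 GiB), which should approximate the allocatable memory the node reports once registration succeeds; at this point in the log registration is still failing with connection refused against api-int.sno.openstack.lab:6443.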
Dec 04 21:57:49.946595 master-0 kubenswrapper[4842]: I1204 21:57:49.946523 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 04 21:57:49.946595 master-0 kubenswrapper[4842]: I1204 21:57:49.946598 4842 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 04 21:57:49.946874 master-0 kubenswrapper[4842]: E1204 21:57:49.946740 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:57:49.947042 master-0 kubenswrapper[4842]: I1204 21:57:49.946925 4842 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 04 21:57:49.947042 master-0 kubenswrapper[4842]: I1204 21:57:49.946951 4842 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 04 21:57:49.947042 master-0 kubenswrapper[4842]: I1204 21:57:49.946959 4842 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 04 21:57:49.948555 master-0 kubenswrapper[4842]: I1204 21:57:49.947462 4842 server.go:449] "Adding debug handlers to kubelet server" Dec 04 21:57:49.948555 master-0 kubenswrapper[4842]: I1204 21:57:49.947556 4842 reconstruct.go:97] "Volume reconstruction finished" Dec 04 21:57:49.948555 master-0 kubenswrapper[4842]: I1204 21:57:49.947582 4842 reconciler.go:26] "Reconciler: start to sync state" Dec 04 21:57:49.948555 master-0 kubenswrapper[4842]: E1204 21:57:49.947287 4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187e21fbf109c948 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:49.940271432 +0000 UTC m=+0.255083647,LastTimestamp:2025-12-04 21:57:49.940271432 +0000 UTC m=+0.255083647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:57:49.951899 master-0 kubenswrapper[4842]: E1204 21:57:49.951796 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Dec 04 21:57:49.952420 master-0 kubenswrapper[4842]: I1204 21:57:49.952358 4842 factory.go:55] Registering systemd factory Dec 04 21:57:49.952420 master-0 kubenswrapper[4842]: I1204 21:57:49.952406 4842 factory.go:221] Registration of the systemd container factory successfully Dec 04 21:57:49.952918 master-0 kubenswrapper[4842]: I1204 21:57:49.952852 4842 factory.go:153] Registering CRI-O factory Dec 04 21:57:49.952918 master-0 kubenswrapper[4842]: I1204 21:57:49.952890 4842 factory.go:221] Registration of the crio container factory successfully Dec 04 21:57:49.952918 master-0 kubenswrapper[4842]: W1204 21:57:49.952848 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:49.953173 master-0 kubenswrapper[4842]: E1204 
21:57:49.952978 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:49.953173 master-0 kubenswrapper[4842]: I1204 21:57:49.953001 4842 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 04 21:57:49.953173 master-0 kubenswrapper[4842]: I1204 21:57:49.953077 4842 factory.go:103] Registering Raw factory Dec 04 21:57:49.953173 master-0 kubenswrapper[4842]: I1204 21:57:49.953109 4842 manager.go:1196] Started watching for new ooms in manager Dec 04 21:57:49.954320 master-0 kubenswrapper[4842]: I1204 21:57:49.954252 4842 manager.go:319] Starting recovery of all containers Dec 04 21:57:49.957577 master-0 kubenswrapper[4842]: E1204 21:57:49.957444 4842 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Dec 04 21:57:49.985403 master-0 kubenswrapper[4842]: I1204 21:57:49.984884 4842 manager.go:324] Recovery completed Dec 04 21:57:50.000738 master-0 kubenswrapper[4842]: I1204 21:57:50.000682 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.002793 master-0 kubenswrapper[4842]: I1204 21:57:50.002754 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.002860 master-0 kubenswrapper[4842]: I1204 21:57:50.002815 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.002860 master-0 kubenswrapper[4842]: I1204 21:57:50.002832 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.004368 master-0 kubenswrapper[4842]: I1204 21:57:50.004328 4842 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 04 21:57:50.004368 master-0 kubenswrapper[4842]: I1204 21:57:50.004348 4842 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 04 21:57:50.004368 master-0 kubenswrapper[4842]: I1204 21:57:50.004373 4842 state_mem.go:36] "Initialized new in-memory state store" Dec 04 21:57:50.007239 master-0 kubenswrapper[4842]: I1204 21:57:50.007202 4842 policy_none.go:49] "None policy: Start" Dec 04 21:57:50.008464 master-0 kubenswrapper[4842]: I1204 21:57:50.008412 4842 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 04 21:57:50.008554 master-0 kubenswrapper[4842]: I1204 21:57:50.008533 4842 state_mem.go:35] "Initializing new in-memory state store" Dec 04 21:57:50.047885 master-0 kubenswrapper[4842]: E1204 21:57:50.047766 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:57:50.085830 master-0 kubenswrapper[4842]: I1204 21:57:50.085638 4842 manager.go:334] "Starting Device Plugin manager" Dec 04 21:57:50.085830 master-0 kubenswrapper[4842]: I1204 21:57:50.085704 4842 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 04 21:57:50.085830 master-0 
kubenswrapper[4842]: I1204 21:57:50.085721 4842 server.go:79] "Starting device plugin registration server" Dec 04 21:57:50.086198 master-0 kubenswrapper[4842]: I1204 21:57:50.086167 4842 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 04 21:57:50.086258 master-0 kubenswrapper[4842]: I1204 21:57:50.086193 4842 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 04 21:57:50.086415 master-0 kubenswrapper[4842]: I1204 21:57:50.086355 4842 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 04 21:57:50.087012 master-0 kubenswrapper[4842]: I1204 21:57:50.086691 4842 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 04 21:57:50.087012 master-0 kubenswrapper[4842]: I1204 21:57:50.086718 4842 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 04 21:57:50.088654 master-0 kubenswrapper[4842]: E1204 21:57:50.088579 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 04 21:57:50.139815 master-0 kubenswrapper[4842]: I1204 21:57:50.139696 4842 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 04 21:57:50.143820 master-0 kubenswrapper[4842]: I1204 21:57:50.143755 4842 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 04 21:57:50.143974 master-0 kubenswrapper[4842]: I1204 21:57:50.143863 4842 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 04 21:57:50.143974 master-0 kubenswrapper[4842]: I1204 21:57:50.143914 4842 kubelet.go:2335] "Starting kubelet main sync loop" Dec 04 21:57:50.144130 master-0 kubenswrapper[4842]: E1204 21:57:50.144014 4842 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Dec 04 21:57:50.145433 master-0 kubenswrapper[4842]: W1204 21:57:50.145351 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:50.145623 master-0 kubenswrapper[4842]: E1204 21:57:50.145446 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:50.153559 master-0 kubenswrapper[4842]: E1204 21:57:50.153436 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Dec 04 21:57:50.186720 master-0 kubenswrapper[4842]: I1204 21:57:50.186567 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.188143 master-0 kubenswrapper[4842]: I1204 21:57:50.188104 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.188253 master-0 kubenswrapper[4842]: I1204 21:57:50.188154 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Dec 04 21:57:50.188253 master-0 kubenswrapper[4842]: I1204 21:57:50.188169 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.188253 master-0 kubenswrapper[4842]: I1204 21:57:50.188205 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:57:50.189698 master-0 kubenswrapper[4842]: E1204 21:57:50.189610 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 04 21:57:50.244355 master-0 kubenswrapper[4842]: I1204 21:57:50.244162 4842 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Dec 04 21:57:50.244355 master-0 kubenswrapper[4842]: I1204 21:57:50.244333 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.245873 master-0 kubenswrapper[4842]: I1204 21:57:50.245828 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.245952 master-0 kubenswrapper[4842]: I1204 21:57:50.245896 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.245952 master-0 kubenswrapper[4842]: I1204 21:57:50.245915 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.246123 master-0 kubenswrapper[4842]: I1204 21:57:50.246095 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.246634 master-0 kubenswrapper[4842]: I1204 21:57:50.246469 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 21:57:50.246699 master-0 kubenswrapper[4842]: I1204 21:57:50.246663 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.247357 master-0 kubenswrapper[4842]: I1204 21:57:50.247333 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.247453 master-0 kubenswrapper[4842]: I1204 21:57:50.247361 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.247453 master-0 kubenswrapper[4842]: I1204 21:57:50.247371 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.247566 master-0 kubenswrapper[4842]: I1204 21:57:50.247475 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.247610 master-0 kubenswrapper[4842]: I1204 21:57:50.247592 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 21:57:50.247652 master-0 kubenswrapper[4842]: I1204 21:57:50.247627 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.247822 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.247847 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.247868 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.247908 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.248103 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.248119 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.248126 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.248122 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.248178 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.248193 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.248238 master-0 kubenswrapper[4842]: I1204 21:57:50.248274 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.249229 master-0 kubenswrapper[4842]: I1204 21:57:50.248451 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.249229 master-0 kubenswrapper[4842]: I1204 21:57:50.248483 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Dec 04 21:57:50.249229 master-0 kubenswrapper[4842]: I1204 21:57:50.248532 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.249229 master-0 kubenswrapper[4842]: I1204 21:57:50.248686 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.249229 master-0 kubenswrapper[4842]: I1204 21:57:50.248796 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.249551 master-0 kubenswrapper[4842]: I1204 21:57:50.249361 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.249551 master-0 kubenswrapper[4842]: I1204 21:57:50.249380 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.249551 master-0 kubenswrapper[4842]: I1204 21:57:50.249389 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.249551 master-0 kubenswrapper[4842]: I1204 21:57:50.249462 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.249793 master-0 kubenswrapper[4842]: I1204 21:57:50.249649 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.249793 master-0 kubenswrapper[4842]: I1204 21:57:50.249692 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.250965 master-0 kubenswrapper[4842]: I1204 21:57:50.250913 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.250965 master-0 kubenswrapper[4842]: I1204 21:57:50.250962 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.251112 master-0 kubenswrapper[4842]: I1204 21:57:50.250969 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.251112 master-0 kubenswrapper[4842]: I1204 21:57:50.251023 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.251112 master-0 kubenswrapper[4842]: I1204 21:57:50.251049 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.251112 master-0 kubenswrapper[4842]: I1204 21:57:50.250987 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.251112 master-0 kubenswrapper[4842]: I1204 21:57:50.251049 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.251406 master-0 kubenswrapper[4842]: I1204 21:57:50.251146 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.251406 master-0 kubenswrapper[4842]: I1204 21:57:50.251169 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.251552 master-0 kubenswrapper[4842]: I1204 21:57:50.251466 4842 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 21:57:50.251552 master-0 kubenswrapper[4842]: I1204 21:57:50.251534 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.252616 master-0 kubenswrapper[4842]: I1204 21:57:50.252559 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.252705 master-0 kubenswrapper[4842]: I1204 21:57:50.252623 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.252705 master-0 kubenswrapper[4842]: I1204 21:57:50.252648 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.348379 master-0 kubenswrapper[4842]: I1204 21:57:50.348314 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.348516 master-0 kubenswrapper[4842]: I1204 21:57:50.348391 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.348516 master-0 kubenswrapper[4842]: I1204 21:57:50.348437 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.348662 master-0 kubenswrapper[4842]: I1204 21:57:50.348600 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 21:57:50.348782 master-0 kubenswrapper[4842]: I1204 21:57:50.348750 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 21:57:50.348832 master-0 kubenswrapper[4842]: I1204 21:57:50.348801 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 21:57:50.348871 master-0 kubenswrapper[4842]: I1204 21:57:50.348845 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: 
\"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.348903 master-0 kubenswrapper[4842]: I1204 21:57:50.348890 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.349344 master-0 kubenswrapper[4842]: I1204 21:57:50.349294 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 21:57:50.349394 master-0 kubenswrapper[4842]: I1204 21:57:50.349251 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 21:57:50.349447 master-0 kubenswrapper[4842]: I1204 21:57:50.349415 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 21:57:50.349485 master-0 kubenswrapper[4842]: I1204 21:57:50.349341 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 21:57:50.349545 master-0 kubenswrapper[4842]: I1204 21:57:50.349524 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.349593 master-0 kubenswrapper[4842]: I1204 21:57:50.349566 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.349631 master-0 kubenswrapper[4842]: I1204 21:57:50.349608 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.349662 master-0 kubenswrapper[4842]: I1204 21:57:50.349645 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 21:57:50.349695 master-0 kubenswrapper[4842]: I1204 21:57:50.349682 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.349772 master-0 kubenswrapper[4842]: I1204 21:57:50.349745 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 21:57:50.349822 master-0 kubenswrapper[4842]: I1204 21:57:50.349742 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 21:57:50.349822 master-0 kubenswrapper[4842]: I1204 21:57:50.349791 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.349882 master-0 kubenswrapper[4842]: I1204 21:57:50.349848 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.390475 master-0 kubenswrapper[4842]: I1204 21:57:50.390400 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.392205 master-0 kubenswrapper[4842]: I1204 21:57:50.392160 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.392261 master-0 kubenswrapper[4842]: I1204 21:57:50.392227 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.392261 master-0 kubenswrapper[4842]: I1204 21:57:50.392248 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.392366 master-0 kubenswrapper[4842]: I1204 21:57:50.392336 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:57:50.393719 master-0 kubenswrapper[4842]: E1204 21:57:50.393646 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 
192.168.32.10:6443: connect: connection refused" node="master-0" Dec 04 21:57:50.450757 master-0 kubenswrapper[4842]: I1204 21:57:50.450683 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.450864 master-0 kubenswrapper[4842]: I1204 21:57:50.450765 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.450864 master-0 kubenswrapper[4842]: I1204 21:57:50.450801 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.450864 master-0 kubenswrapper[4842]: I1204 21:57:50.450811 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.450864 master-0 kubenswrapper[4842]: I1204 21:57:50.450864 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.451013 master-0 kubenswrapper[4842]: I1204 21:57:50.450879 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.451107 master-0 kubenswrapper[4842]: I1204 21:57:50.451032 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451306 master-0 kubenswrapper[4842]: I1204 21:57:50.451228 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.451341 master-0 kubenswrapper[4842]: I1204 21:57:50.451300 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 21:57:50.451387 master-0 kubenswrapper[4842]: I1204 21:57:50.451361 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 21:57:50.451422 master-0 kubenswrapper[4842]: I1204 21:57:50.451383 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451422 master-0 kubenswrapper[4842]: I1204 21:57:50.451415 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451475 master-0 kubenswrapper[4842]: I1204 21:57:50.451438 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451475 master-0 kubenswrapper[4842]: I1204 21:57:50.451462 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.451566 master-0 kubenswrapper[4842]: I1204 21:57:50.451488 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 21:57:50.451566 master-0 kubenswrapper[4842]: I1204 21:57:50.451537 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451625 master-0 kubenswrapper[4842]: I1204 21:57:50.451564 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451625 master-0 kubenswrapper[4842]: I1204 21:57:50.451560 4842 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451625 master-0 kubenswrapper[4842]: I1204 21:57:50.451588 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451625 master-0 kubenswrapper[4842]: I1204 21:57:50.451612 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.451737 master-0 kubenswrapper[4842]: I1204 21:57:50.451629 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.451737 master-0 kubenswrapper[4842]: I1204 21:57:50.451638 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 21:57:50.451737 master-0 kubenswrapper[4842]: I1204 21:57:50.451684 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.451737 master-0 kubenswrapper[4842]: I1204 21:57:50.451728 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.451846 master-0 kubenswrapper[4842]: I1204 21:57:50.451740 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.451846 master-0 kubenswrapper[4842]: I1204 21:57:50.451783 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.555316 master-0 
kubenswrapper[4842]: E1204 21:57:50.555151 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Dec 04 21:57:50.590826 master-0 kubenswrapper[4842]: I1204 21:57:50.590656 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 21:57:50.612737 master-0 kubenswrapper[4842]: I1204 21:57:50.612626 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 21:57:50.630953 master-0 kubenswrapper[4842]: I1204 21:57:50.630855 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:57:50.648488 master-0 kubenswrapper[4842]: I1204 21:57:50.648225 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:57:50.657303 master-0 kubenswrapper[4842]: I1204 21:57:50.657215 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 21:57:50.794246 master-0 kubenswrapper[4842]: I1204 21:57:50.794132 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:50.795881 master-0 kubenswrapper[4842]: I1204 21:57:50.795825 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:50.795881 master-0 kubenswrapper[4842]: I1204 21:57:50.795876 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:50.795881 master-0 kubenswrapper[4842]: I1204 21:57:50.795886 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:50.796092 master-0 kubenswrapper[4842]: I1204 21:57:50.795956 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:57:50.797230 master-0 kubenswrapper[4842]: E1204 21:57:50.797127 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 04 21:57:50.944632 master-0 kubenswrapper[4842]: I1204 21:57:50.944346 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:51.166540 master-0 kubenswrapper[4842]: W1204 21:57:51.166385 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:51.166879 master-0 kubenswrapper[4842]: E1204 21:57:51.166561 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:51.296091 master-0 kubenswrapper[4842]: W1204 21:57:51.295991 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:51.296091 master-0 kubenswrapper[4842]: E1204 21:57:51.296084 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:51.305323 master-0 kubenswrapper[4842]: W1204 21:57:51.305250 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:51.305323 master-0 kubenswrapper[4842]: E1204 21:57:51.305308 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:51.357266 master-0 kubenswrapper[4842]: E1204 21:57:51.357199 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 04 21:57:51.364202 master-0 kubenswrapper[4842]: W1204 21:57:51.364099 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0396a9a2689b3e8c132c12640cbe83.slice/crio-4c79666e90f7715124e93544d73c2c9b1066ea79ae6d56c7b16c532bd0846566 WatchSource:0}: Error finding container 4c79666e90f7715124e93544d73c2c9b1066ea79ae6d56c7b16c532bd0846566: Status 404 returned error can't find the container with id 4c79666e90f7715124e93544d73c2c9b1066ea79ae6d56c7b16c532bd0846566 Dec 04 21:57:51.364901 master-0 kubenswrapper[4842]: W1204 21:57:51.364862 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b47694fcc32464ab24d09c23d6efb57.slice/crio-f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154 WatchSource:0}: Error finding container f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154: Status 404 returned error can't find the container with id f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154 Dec 04 21:57:51.370088 master-0 kubenswrapper[4842]: I1204 21:57:51.370034 4842 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 21:57:51.423550 master-0 kubenswrapper[4842]: W1204 21:57:51.423396 4842 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75143d9bc4a2dc15781dc51ccff632a.slice/crio-88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310 WatchSource:0}: Error finding container 88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310: Status 404 returned error can't find the container with id 88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310 Dec 04 21:57:51.462626 master-0 kubenswrapper[4842]: W1204 21:57:51.462418 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:51.462626 master-0 kubenswrapper[4842]: E1204 21:57:51.462613 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:51.598040 master-0 kubenswrapper[4842]: I1204 21:57:51.597830 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:51.599762 master-0 kubenswrapper[4842]: I1204 21:57:51.599699 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:51.599762 master-0 kubenswrapper[4842]: I1204 21:57:51.599754 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:51.599762 master-0 kubenswrapper[4842]: I1204 21:57:51.599765 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:51.600052 master-0 kubenswrapper[4842]: I1204 21:57:51.599828 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:57:51.601341 master-0 kubenswrapper[4842]: E1204 21:57:51.601234 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 04 21:57:51.627330 master-0 kubenswrapper[4842]: W1204 21:57:51.627059 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e09e2af7200e6f9be469dbfd9bb1127.slice/crio-4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1 WatchSource:0}: Error finding container 4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1: Status 404 returned error can't find the container with id 4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1 Dec 04 21:57:51.725719 master-0 kubenswrapper[4842]: W1204 21:57:51.725632 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3169f44496ed8a28c6d6a15511ab0eec.slice/crio-ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94 WatchSource:0}: Error finding container ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94: Status 404 returned error can't find the container with id ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94 Dec 04 21:57:51.944240 master-0 kubenswrapper[4842]: I1204 21:57:51.944028 4842 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:52.016729 master-0 kubenswrapper[4842]: I1204 21:57:52.016626 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 21:57:52.018755 master-0 kubenswrapper[4842]: E1204 21:57:52.018680 4842 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:52.152106 master-0 kubenswrapper[4842]: I1204 21:57:52.151939 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154"} Dec 04 21:57:52.153856 master-0 kubenswrapper[4842]: I1204 21:57:52.153805 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"4c79666e90f7715124e93544d73c2c9b1066ea79ae6d56c7b16c532bd0846566"} Dec 04 21:57:52.154986 master-0 kubenswrapper[4842]: I1204 21:57:52.154939 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94"} Dec 04 21:57:52.156419 master-0 kubenswrapper[4842]: I1204 21:57:52.156370 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1"} Dec 04 21:57:52.157834 master-0 kubenswrapper[4842]: I1204 21:57:52.157780 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310"} Dec 04 21:57:52.943885 master-0 kubenswrapper[4842]: I1204 21:57:52.943814 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:52.959111 master-0 kubenswrapper[4842]: E1204 21:57:52.959042 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 04 21:57:53.202376 master-0 kubenswrapper[4842]: I1204 21:57:53.202245 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:53.203536 master-0 kubenswrapper[4842]: I1204 21:57:53.203480 4842 kubelet_node_status.go:724] "Recording 
event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:53.203587 master-0 kubenswrapper[4842]: I1204 21:57:53.203554 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:53.203587 master-0 kubenswrapper[4842]: I1204 21:57:53.203573 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:53.203677 master-0 kubenswrapper[4842]: I1204 21:57:53.203660 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:57:53.204645 master-0 kubenswrapper[4842]: E1204 21:57:53.204606 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 04 21:57:53.349533 master-0 kubenswrapper[4842]: W1204 21:57:53.349433 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:53.349815 master-0 kubenswrapper[4842]: E1204 21:57:53.349557 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:53.921030 master-0 kubenswrapper[4842]: W1204 21:57:53.920961 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:53.921030 master-0 kubenswrapper[4842]: E1204 21:57:53.921033 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:53.943005 master-0 kubenswrapper[4842]: I1204 21:57:53.942969 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:54.088594 master-0 kubenswrapper[4842]: W1204 21:57:54.087976 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:54.088839 master-0 kubenswrapper[4842]: E1204 21:57:54.088606 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:54.303408 master-0 
kubenswrapper[4842]: W1204 21:57:54.303302 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:54.303408 master-0 kubenswrapper[4842]: E1204 21:57:54.303393 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:54.944176 master-0 kubenswrapper[4842]: I1204 21:57:54.944091 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:55.167655 master-0 kubenswrapper[4842]: I1204 21:57:55.167593 4842 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="7ab8b346978ad6f1cf331a2cd2e464eb81897737f06c530111b331aae07ed9d5" exitCode=0 Dec 04 21:57:55.167932 master-0 kubenswrapper[4842]: I1204 21:57:55.167726 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:55.167932 master-0 kubenswrapper[4842]: I1204 21:57:55.167703 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"7ab8b346978ad6f1cf331a2cd2e464eb81897737f06c530111b331aae07ed9d5"} Dec 04 21:57:55.169128 master-0 kubenswrapper[4842]: I1204 21:57:55.169092 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:55.169205 master-0 kubenswrapper[4842]: I1204 21:57:55.169147 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:55.169205 master-0 kubenswrapper[4842]: I1204 21:57:55.169162 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:55.169957 master-0 kubenswrapper[4842]: I1204 21:57:55.169924 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"b4b557e71fac173d7ebddbf04536e46989f934644030fceea9234231919b8e8f"} Dec 04 21:57:55.170029 master-0 kubenswrapper[4842]: I1204 21:57:55.169963 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"75e4e520a75639c893eb6ea15b07a3187aaf4dfc898564bd4832b04c7d30a431"} Dec 04 21:57:55.170114 master-0 kubenswrapper[4842]: I1204 21:57:55.170085 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:55.170982 master-0 kubenswrapper[4842]: I1204 21:57:55.170957 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:55.171031 master-0 kubenswrapper[4842]: I1204 21:57:55.170990 4842 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:55.171031 master-0 kubenswrapper[4842]: I1204 21:57:55.171006 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:55.571658 master-0 kubenswrapper[4842]: E1204 21:57:55.571387 4842 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187e21fbf109c948 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:49.940271432 +0000 UTC m=+0.255083647,LastTimestamp:2025-12-04 21:57:49.940271432 +0000 UTC m=+0.255083647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:57:55.944006 master-0 kubenswrapper[4842]: I1204 21:57:55.943846 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:56.160650 master-0 kubenswrapper[4842]: E1204 21:57:56.160581 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 04 21:57:56.173912 master-0 kubenswrapper[4842]: I1204 21:57:56.173866 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/0.log" Dec 04 21:57:56.174446 master-0 kubenswrapper[4842]: I1204 21:57:56.174414 4842 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="1869b63fd92f7983086b3f319eae480d99193930cb8dfedef3a0d5482fd43fa2" exitCode=1 Dec 04 21:57:56.174563 master-0 kubenswrapper[4842]: I1204 21:57:56.174485 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"1869b63fd92f7983086b3f319eae480d99193930cb8dfedef3a0d5482fd43fa2"} Dec 04 21:57:56.174563 master-0 kubenswrapper[4842]: I1204 21:57:56.174559 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:56.174651 master-0 kubenswrapper[4842]: I1204 21:57:56.174627 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:56.175454 master-0 kubenswrapper[4842]: I1204 21:57:56.175296 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:56.175454 master-0 kubenswrapper[4842]: I1204 21:57:56.175330 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:56.175454 master-0 kubenswrapper[4842]: I1204 
21:57:56.175341 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:56.175454 master-0 kubenswrapper[4842]: I1204 21:57:56.175404 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:56.175454 master-0 kubenswrapper[4842]: I1204 21:57:56.175421 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:56.175454 master-0 kubenswrapper[4842]: I1204 21:57:56.175431 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:56.176188 master-0 kubenswrapper[4842]: I1204 21:57:56.175983 4842 scope.go:117] "RemoveContainer" containerID="1869b63fd92f7983086b3f319eae480d99193930cb8dfedef3a0d5482fd43fa2" Dec 04 21:57:56.274302 master-0 kubenswrapper[4842]: I1204 21:57:56.274198 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 21:57:56.275853 master-0 kubenswrapper[4842]: E1204 21:57:56.275770 4842 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:56.405012 master-0 kubenswrapper[4842]: I1204 21:57:56.404926 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:57:56.407206 master-0 kubenswrapper[4842]: I1204 21:57:56.406663 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:57:56.407206 master-0 kubenswrapper[4842]: I1204 21:57:56.406708 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:57:56.407206 master-0 kubenswrapper[4842]: I1204 21:57:56.406719 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:57:56.407206 master-0 kubenswrapper[4842]: I1204 21:57:56.406773 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:57:56.408141 master-0 kubenswrapper[4842]: E1204 21:57:56.408056 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 04 21:57:56.929845 master-0 kubenswrapper[4842]: W1204 21:57:56.929727 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:56.930403 master-0 kubenswrapper[4842]: E1204 21:57:56.929858 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:56.943949 master-0 kubenswrapper[4842]: 
I1204 21:57:56.943903 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:57.588408 master-0 kubenswrapper[4842]: W1204 21:57:57.588299 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:57.588408 master-0 kubenswrapper[4842]: E1204 21:57:57.588413 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:57.944149 master-0 kubenswrapper[4842]: I1204 21:57:57.944006 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:58.945021 master-0 kubenswrapper[4842]: I1204 21:57:58.944910 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:59.620445 master-0 kubenswrapper[4842]: W1204 21:57:59.620308 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:57:59.620445 master-0 kubenswrapper[4842]: E1204 21:57:59.620426 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:57:59.943772 master-0 kubenswrapper[4842]: I1204 21:57:59.943704 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:58:00.088880 master-0 kubenswrapper[4842]: E1204 21:58:00.088809 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 04 21:58:00.188623 master-0 kubenswrapper[4842]: I1204 21:58:00.188391 4842 generic.go:334] "Generic (PLEG): container finished" podID="d75143d9bc4a2dc15781dc51ccff632a" containerID="2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8" exitCode=0 Dec 04 21:58:00.188623 master-0 kubenswrapper[4842]: I1204 21:58:00.188528 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" 
event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerDied","Data":"2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8"} Dec 04 21:58:00.189343 master-0 kubenswrapper[4842]: I1204 21:58:00.189277 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:00.190914 master-0 kubenswrapper[4842]: I1204 21:58:00.190861 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:00.190914 master-0 kubenswrapper[4842]: I1204 21:58:00.190898 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:00.190914 master-0 kubenswrapper[4842]: I1204 21:58:00.190909 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:00.191630 master-0 kubenswrapper[4842]: I1204 21:58:00.191021 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"183ef4cdec698bdfc64b154e7dcc6b79ea7ba6ed603e295f5859a3a5c552d57d"} Dec 04 21:58:00.196281 master-0 kubenswrapper[4842]: I1204 21:58:00.196165 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/1.log" Dec 04 21:58:00.196776 master-0 kubenswrapper[4842]: I1204 21:58:00.196725 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:00.197287 master-0 kubenswrapper[4842]: I1204 21:58:00.197227 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/0.log" Dec 04 21:58:00.198235 master-0 kubenswrapper[4842]: I1204 21:58:00.198158 4842 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="717ada19e94e366402ae8b80d88baf330e3c481008a69b23e54f05dc45212171" exitCode=1 Dec 04 21:58:00.198486 master-0 kubenswrapper[4842]: I1204 21:58:00.198255 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"717ada19e94e366402ae8b80d88baf330e3c481008a69b23e54f05dc45212171"} Dec 04 21:58:00.198486 master-0 kubenswrapper[4842]: I1204 21:58:00.198337 4842 scope.go:117] "RemoveContainer" containerID="1869b63fd92f7983086b3f319eae480d99193930cb8dfedef3a0d5482fd43fa2" Dec 04 21:58:00.198486 master-0 kubenswrapper[4842]: I1204 21:58:00.198347 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:00.198615 master-0 kubenswrapper[4842]: I1204 21:58:00.198546 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:00.198615 master-0 kubenswrapper[4842]: I1204 21:58:00.198574 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:00.198670 master-0 kubenswrapper[4842]: I1204 21:58:00.198618 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:00.200253 master-0 kubenswrapper[4842]: I1204 21:58:00.200218 
4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:00.200884 master-0 kubenswrapper[4842]: I1204 21:58:00.200843 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:00.200884 master-0 kubenswrapper[4842]: I1204 21:58:00.200877 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:00.201235 master-0 kubenswrapper[4842]: I1204 21:58:00.201188 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"d00575d56d81b17e8c0212c4cad634fe2f3afd13660a8de6afbe8a4381dd50d7"} Dec 04 21:58:00.202265 master-0 kubenswrapper[4842]: I1204 21:58:00.201855 4842 scope.go:117] "RemoveContainer" containerID="717ada19e94e366402ae8b80d88baf330e3c481008a69b23e54f05dc45212171" Dec 04 21:58:00.202265 master-0 kubenswrapper[4842]: I1204 21:58:00.201857 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:00.202265 master-0 kubenswrapper[4842]: E1204 21:58:00.202129 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="3169f44496ed8a28c6d6a15511ab0eec" Dec 04 21:58:00.203034 master-0 kubenswrapper[4842]: I1204 21:58:00.202983 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:00.203117 master-0 kubenswrapper[4842]: I1204 21:58:00.203044 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:00.203117 master-0 kubenswrapper[4842]: I1204 21:58:00.203066 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:00.235682 master-0 kubenswrapper[4842]: W1204 21:58:00.235575 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 04 21:58:00.235860 master-0 kubenswrapper[4842]: E1204 21:58:00.235704 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 04 21:58:01.206517 master-0 kubenswrapper[4842]: I1204 21:58:01.206149 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/1.log" Dec 04 21:58:01.207196 master-0 kubenswrapper[4842]: I1204 21:58:01.207167 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:01.211493 master-0 kubenswrapper[4842]: I1204 21:58:01.211422 4842 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:01.211493 master-0 kubenswrapper[4842]: I1204 21:58:01.211477 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:01.211493 master-0 kubenswrapper[4842]: I1204 21:58:01.211487 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:01.212894 master-0 kubenswrapper[4842]: I1204 21:58:01.212856 4842 scope.go:117] "RemoveContainer" containerID="717ada19e94e366402ae8b80d88baf330e3c481008a69b23e54f05dc45212171" Dec 04 21:58:01.213137 master-0 kubenswrapper[4842]: E1204 21:58:01.213097 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="3169f44496ed8a28c6d6a15511ab0eec" Dec 04 21:58:01.220299 master-0 kubenswrapper[4842]: I1204 21:58:01.218356 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10"} Dec 04 21:58:01.220299 master-0 kubenswrapper[4842]: I1204 21:58:01.219868 4842 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="183ef4cdec698bdfc64b154e7dcc6b79ea7ba6ed603e295f5859a3a5c552d57d" exitCode=1 Dec 04 21:58:01.220299 master-0 kubenswrapper[4842]: I1204 21:58:01.219981 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:01.220299 master-0 kubenswrapper[4842]: I1204 21:58:01.220166 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"183ef4cdec698bdfc64b154e7dcc6b79ea7ba6ed603e295f5859a3a5c552d57d"} Dec 04 21:58:01.221267 master-0 kubenswrapper[4842]: I1204 21:58:01.220849 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:01.221267 master-0 kubenswrapper[4842]: I1204 21:58:01.220897 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:01.221267 master-0 kubenswrapper[4842]: I1204 21:58:01.220909 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:02.029679 master-0 kubenswrapper[4842]: I1204 21:58:02.029598 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:02.573909 master-0 kubenswrapper[4842]: E1204 21:58:02.573845 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 04 21:58:02.809320 
master-0 kubenswrapper[4842]: I1204 21:58:02.809192 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:02.811783 master-0 kubenswrapper[4842]: I1204 21:58:02.811047 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:02.811783 master-0 kubenswrapper[4842]: I1204 21:58:02.811115 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:02.811783 master-0 kubenswrapper[4842]: I1204 21:58:02.811130 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:02.811783 master-0 kubenswrapper[4842]: I1204 21:58:02.811206 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:58:02.817123 master-0 kubenswrapper[4842]: E1204 21:58:02.817061 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 04 21:58:02.949998 master-0 kubenswrapper[4842]: I1204 21:58:02.949804 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:03.950787 master-0 kubenswrapper[4842]: I1204 21:58:03.950695 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:04.671331 master-0 kubenswrapper[4842]: I1204 21:58:04.671161 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 04 21:58:04.690867 master-0 kubenswrapper[4842]: I1204 21:58:04.690805 4842 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 04 21:58:04.949707 master-0 kubenswrapper[4842]: I1204 21:58:04.949529 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:05.240110 master-0 kubenswrapper[4842]: I1204 21:58:05.239839 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"70ec2f528f522213daf96bac275fda7cf7f15b026ed56e4b58dab19aaca3bd29"} Dec 04 21:58:05.241292 master-0 kubenswrapper[4842]: I1204 21:58:05.241253 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:05.242640 master-0 kubenswrapper[4842]: I1204 21:58:05.242572 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a"} Dec 04 21:58:05.242640 master-0 kubenswrapper[4842]: I1204 21:58:05.242583 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" 
Dec 04 21:58:05.242887 master-0 kubenswrapper[4842]: I1204 21:58:05.242663 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:05.242887 master-0 kubenswrapper[4842]: I1204 21:58:05.242665 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:05.242887 master-0 kubenswrapper[4842]: I1204 21:58:05.242874 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:05.243472 master-0 kubenswrapper[4842]: I1204 21:58:05.243415 4842 scope.go:117] "RemoveContainer" containerID="183ef4cdec698bdfc64b154e7dcc6b79ea7ba6ed603e295f5859a3a5c552d57d" Dec 04 21:58:05.244178 master-0 kubenswrapper[4842]: I1204 21:58:05.244121 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:05.244178 master-0 kubenswrapper[4842]: I1204 21:58:05.244159 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:05.244178 master-0 kubenswrapper[4842]: I1204 21:58:05.244172 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:05.582570 master-0 kubenswrapper[4842]: E1204 21:58:05.579938 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf109c948 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:49.940271432 +0000 UTC m=+0.255083647,LastTimestamp:2025-12-04 21:57:49.940271432 +0000 UTC m=+0.255083647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.588254 master-0 kubenswrapper[4842]: E1204 21:58:05.587981 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c3c51b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,LastTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.596974 master-0 kubenswrapper[4842]: E1204 21:58:05.596791 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c449f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,LastTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.605320 master-0 kubenswrapper[4842]: E1204 21:58:05.605157 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c48330 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002840368 +0000 UTC m=+0.317652563,LastTimestamp:2025-12-04 21:57:50.002840368 +0000 UTC m=+0.317652563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.612544 master-0 kubenswrapper[4842]: E1204 21:58:05.612333 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf9e86d61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.089080161 +0000 UTC m=+0.403892356,LastTimestamp:2025-12-04 21:57:50.089080161 +0000 UTC m=+0.403892356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.619766 master-0 kubenswrapper[4842]: E1204 21:58:05.619604 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c3c51b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c3c51b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,LastTimestamp:2025-12-04 21:57:50.188135536 +0000 UTC m=+0.502947731,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.626596 master-0 kubenswrapper[4842]: E1204 21:58:05.626377 4842 event.go:359] "Server rejected event (will not retry!)" err="events 
\"master-0.187e21fbf4c449f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c449f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,LastTimestamp:2025-12-04 21:57:50.188163517 +0000 UTC m=+0.502975712,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.633657 master-0 kubenswrapper[4842]: E1204 21:58:05.633410 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c48330\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c48330 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002840368 +0000 UTC m=+0.317652563,LastTimestamp:2025-12-04 21:57:50.188175357 +0000 UTC m=+0.502987552,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.640893 master-0 kubenswrapper[4842]: E1204 21:58:05.640695 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c3c51b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c3c51b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,LastTimestamp:2025-12-04 21:57:50.245873632 +0000 UTC m=+0.560685857,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.648220 master-0 kubenswrapper[4842]: E1204 21:58:05.648020 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c449f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c449f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,LastTimestamp:2025-12-04 21:57:50.245908952 +0000 UTC 
m=+0.560721177,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.655009 master-0 kubenswrapper[4842]: E1204 21:58:05.654791 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c48330\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c48330 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002840368 +0000 UTC m=+0.317652563,LastTimestamp:2025-12-04 21:57:50.245925723 +0000 UTC m=+0.560737948,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.662378 master-0 kubenswrapper[4842]: E1204 21:58:05.662199 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c3c51b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c3c51b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,LastTimestamp:2025-12-04 21:57:50.247353688 +0000 UTC m=+0.562165873,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.670305 master-0 kubenswrapper[4842]: E1204 21:58:05.670087 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c449f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c449f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,LastTimestamp:2025-12-04 21:57:50.247368409 +0000 UTC m=+0.562180594,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.677652 master-0 kubenswrapper[4842]: E1204 21:58:05.677437 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c48330\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c48330 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002840368 +0000 UTC m=+0.317652563,LastTimestamp:2025-12-04 21:57:50.247377429 +0000 UTC m=+0.562189614,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.684638 master-0 kubenswrapper[4842]: E1204 21:58:05.684474 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c3c51b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c3c51b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,LastTimestamp:2025-12-04 21:57:50.248112848 +0000 UTC m=+0.562925033,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.693076 master-0 kubenswrapper[4842]: E1204 21:58:05.692998 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c449f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c449f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,LastTimestamp:2025-12-04 21:57:50.248123998 +0000 UTC m=+0.562936184,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.699646 master-0 kubenswrapper[4842]: E1204 21:58:05.699424 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c48330\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c48330 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002840368 +0000 UTC m=+0.317652563,LastTimestamp:2025-12-04 21:57:50.248137579 +0000 UTC m=+0.562949764,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.704379 master-0 kubenswrapper[4842]: E1204 21:58:05.704253 4842 event.go:359] "Server rejected 
event (will not retry!)" err="events \"master-0.187e21fbf4c3c51b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c3c51b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,LastTimestamp:2025-12-04 21:57:50.248164039 +0000 UTC m=+0.562976234,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.711151 master-0 kubenswrapper[4842]: E1204 21:58:05.710981 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c449f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c449f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,LastTimestamp:2025-12-04 21:57:50.24818655 +0000 UTC m=+0.562998745,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.717033 master-0 kubenswrapper[4842]: E1204 21:58:05.716875 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c48330\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c48330 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002840368 +0000 UTC m=+0.317652563,LastTimestamp:2025-12-04 21:57:50.2481999 +0000 UTC m=+0.563012095,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.722660 master-0 kubenswrapper[4842]: E1204 21:58:05.722431 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c3c51b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c3c51b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,LastTimestamp:2025-12-04 
21:57:50.248472677 +0000 UTC m=+0.563284902,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.727553 master-0 kubenswrapper[4842]: E1204 21:58:05.727372 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c449f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c449f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,LastTimestamp:2025-12-04 21:57:50.248493118 +0000 UTC m=+0.563305343,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.734777 master-0 kubenswrapper[4842]: E1204 21:58:05.734591 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c48330\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c48330 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002840368 +0000 UTC m=+0.317652563,LastTimestamp:2025-12-04 21:57:50.248543199 +0000 UTC m=+0.563355414,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.742102 master-0 kubenswrapper[4842]: E1204 21:58:05.741894 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c3c51b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c3c51b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002791707 +0000 UTC m=+0.317603912,LastTimestamp:2025-12-04 21:57:50.249373359 +0000 UTC m=+0.564185534,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.749905 master-0 kubenswrapper[4842]: E1204 21:58:05.749717 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e21fbf4c449f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e21fbf4c449f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:50.002825718 +0000 UTC m=+0.317637923,LastTimestamp:2025-12-04 21:57:50.24938685 +0000 UTC m=+0.564199035,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.758411 master-0 kubenswrapper[4842]: E1204 21:58:05.758206 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e21fc46410b8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:51.369956235 +0000 UTC m=+1.684768420,LastTimestamp:2025-12-04 21:57:51.369956235 +0000 UTC m=+1.684768420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.765335 master-0 kubenswrapper[4842]: E1204 21:58:05.765150 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21fc464283dc kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:51.370052572 +0000 UTC m=+1.684864797,LastTimestamp:2025-12-04 21:57:51.370052572 +0000 UTC m=+1.684864797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.772681 master-0 kubenswrapper[4842]: E1204 21:58:05.772414 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21fc499e9a19 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:51.426419225 +0000 UTC m=+1.741231450,LastTimestamp:2025-12-04 21:57:51.426419225 +0000 UTC m=+1.741231450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.780219 master-0 kubenswrapper[4842]: E1204 21:58:05.780034 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e21fc55c82532 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:51.630468402 +0000 UTC m=+1.945280577,LastTimestamp:2025-12-04 21:57:51.630468402 +0000 UTC m=+1.945280577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.785384 master-0 kubenswrapper[4842]: E1204 21:58:05.785201 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fc5b9c725d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:51.728267869 +0000 UTC m=+2.043080064,LastTimestamp:2025-12-04 21:57:51.728267869 +0000 UTC m=+2.043080064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.794662 master-0 kubenswrapper[4842]: E1204 21:58:05.793804 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fce2791ef7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\" in 2.262s (2.262s including waiting). Image size: 459552216 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:53.990876919 +0000 UTC m=+4.305689114,LastTimestamp:2025-12-04 21:57:53.990876919 +0000 UTC m=+4.305689114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.800980 master-0 kubenswrapper[4842]: E1204 21:58:05.800792 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e21fce34e3147 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\" in 2.634s (2.634s including waiting). Image size: 532719167 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:54.004840775 +0000 UTC m=+4.319652960,LastTimestamp:2025-12-04 21:57:54.004840775 +0000 UTC m=+4.319652960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.808749 master-0 kubenswrapper[4842]: E1204 21:58:05.808484 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fcf059e8ae openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:54.22371243 +0000 UTC m=+4.538524615,LastTimestamp:2025-12-04 21:57:54.22371243 +0000 UTC m=+4.538524615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.818753 master-0 kubenswrapper[4842]: E1204 21:58:05.818543 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e21fcf05b45fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:54.223801852 +0000 
UTC m=+4.538614037,LastTimestamp:2025-12-04 21:57:54.223801852 +0000 UTC m=+4.538614037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.825157 master-0 kubenswrapper[4842]: E1204 21:58:05.824946 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fcf1373e01 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:54.238217729 +0000 UTC m=+4.553029914,LastTimestamp:2025-12-04 21:57:54.238217729 +0000 UTC m=+4.553029914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.833769 master-0 kubenswrapper[4842]: E1204 21:58:05.833446 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e21fcf14e2043 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:54.239717443 +0000 UTC m=+4.554529628,LastTimestamp:2025-12-04 21:57:54.239717443 +0000 UTC m=+4.554529628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.843702 master-0 kubenswrapper[4842]: E1204 21:58:05.843418 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e21fcf171744e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:54.242032718 +0000 UTC m=+4.556844913,LastTimestamp:2025-12-04 21:57:54.242032718 +0000 UTC m=+4.556844913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.854267 master-0 kubenswrapper[4842]: E1204 21:58:05.854040 4842 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e21fcfe33f35d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:54.456105821 +0000 UTC m=+4.770918006,LastTimestamp:2025-12-04 21:57:54.456105821 +0000 UTC m=+4.770918006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.860032 master-0 kubenswrapper[4842]: E1204 21:58:05.859783 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e21fcff063dca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:54.469887434 +0000 UTC m=+4.784699619,LastTimestamp:2025-12-04 21:57:54.469887434 +0000 UTC m=+4.784699619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.868383 master-0 kubenswrapper[4842]: E1204 21:58:05.868142 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd28fa61b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.173753273 +0000 UTC m=+5.488565458,LastTimestamp:2025-12-04 21:57:55.173753273 +0000 UTC m=+5.488565458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.876068 master-0 kubenswrapper[4842]: E1204 21:58:05.875869 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd34156f93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.360075667 +0000 UTC m=+5.674887852,LastTimestamp:2025-12-04 21:57:55.360075667 +0000 UTC m=+5.674887852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.881861 master-0 kubenswrapper[4842]: E1204 21:58:05.881664 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd34be7db3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.371154867 +0000 UTC m=+5.685967062,LastTimestamp:2025-12-04 21:57:55.371154867 +0000 UTC m=+5.685967062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.888839 master-0 kubenswrapper[4842]: E1204 21:58:05.888643 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e21fd28fa61b9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd28fa61b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.173753273 +0000 UTC m=+5.488565458,LastTimestamp:2025-12-04 21:57:59.500464643 +0000 UTC m=+9.815276838,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.896538 master-0 kubenswrapper[4842]: E1204 21:58:05.896298 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21fe30b5387e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" in 8.228s (8.228s including waiting). Image size: 938303566 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.598405758 +0000 UTC m=+9.913217983,LastTimestamp:2025-12-04 21:57:59.598405758 +0000 UTC m=+9.913217983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.923409 master-0 kubenswrapper[4842]: E1204 21:58:05.923160 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e21fe3248fd26 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" in 7.994s (7.994s including waiting). Image size: 938303566 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.62486711 +0000 UTC m=+9.939679295,LastTimestamp:2025-12-04 21:57:59.62486711 +0000 UTC m=+9.939679295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.939962 master-0 kubenswrapper[4842]: E1204 21:58:05.939746 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21fe339c5b68 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" in 8.22s (8.22s including waiting). 
Image size: 938303566 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.647107944 +0000 UTC m=+9.961920139,LastTimestamp:2025-12-04 21:57:59.647107944 +0000 UTC m=+9.961920139,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.948448 master-0 kubenswrapper[4842]: I1204 21:58:05.948410 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:05.948995 master-0 kubenswrapper[4842]: E1204 21:58:05.948826 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e21fd34156f93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd34156f93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.360075667 +0000 UTC m=+5.674887852,LastTimestamp:2025-12-04 21:57:59.758185101 +0000 UTC m=+10.072997296,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.955018 master-0 kubenswrapper[4842]: E1204 21:58:05.954883 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e21fd34be7db3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd34be7db3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.371154867 +0000 UTC m=+5.685967062,LastTimestamp:2025-12-04 21:57:59.772989078 +0000 UTC m=+10.087801273,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.960045 master-0 kubenswrapper[4842]: E1204 21:58:05.959885 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21fe3c4aaed1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.792750289 +0000 UTC m=+10.107562484,LastTimestamp:2025-12-04 21:57:59.792750289 +0000 UTC m=+10.107562484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.971882 master-0 kubenswrapper[4842]: E1204 21:58:05.971752 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e21fe3d089eaa kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.805197994 +0000 UTC m=+10.120010189,LastTimestamp:2025-12-04 21:57:59.805197994 +0000 UTC m=+10.120010189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.976469 master-0 kubenswrapper[4842]: E1204 21:58:05.976301 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21fe3d18e767 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.806265191 +0000 UTC m=+10.121077386,LastTimestamp:2025-12-04 21:57:59.806265191 +0000 UTC m=+10.121077386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.982116 master-0 kubenswrapper[4842]: E1204 21:58:05.981976 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21fe3d2f727d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.807742589 +0000 UTC m=+10.122554784,LastTimestamp:2025-12-04 21:57:59.807742589 +0000 UTC m=+10.122554784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.987939 master-0 kubenswrapper[4842]: E1204 21:58:05.987784 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e21fe3dbd1aba kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.817026234 +0000 UTC m=+10.131838439,LastTimestamp:2025-12-04 21:57:59.817026234 +0000 UTC m=+10.131838439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.994124 master-0 kubenswrapper[4842]: E1204 21:58:05.993974 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21fe471b285b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.974185051 +0000 UTC m=+10.288997256,LastTimestamp:2025-12-04 21:57:59.974185051 +0000 UTC m=+10.288997256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:05.999283 master-0 kubenswrapper[4842]: E1204 21:58:05.999169 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21fe4866d8c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.995922633 +0000 UTC m=+10.310734828,LastTimestamp:2025-12-04 21:57:59.995922633 +0000 UTC m=+10.310734828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.005082 master-0 kubenswrapper[4842]: E1204 21:58:06.004965 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21fe545c3bd3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:00.196553683 +0000 UTC m=+10.511365908,LastTimestamp:2025-12-04 21:58:00.196553683 +0000 UTC m=+10.511365908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.012666 master-0 kubenswrapper[4842]: E1204 21:58:06.012539 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fe54b083eb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:00.202077163 +0000 UTC m=+10.516889388,LastTimestamp:2025-12-04 21:58:00.202077163 +0000 UTC m=+10.516889388,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.019859 master-0 kubenswrapper[4842]: E1204 21:58:06.019746 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21fe607ce45e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:00.400020574 +0000 UTC m=+10.714832799,LastTimestamp:2025-12-04 21:58:00.400020574 +0000 UTC m=+10.714832799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.029485 master-0 kubenswrapper[4842]: E1204 21:58:06.029296 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21fe6128a5f6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:00.41127679 +0000 UTC m=+10.726088985,LastTimestamp:2025-12-04 21:58:00.41127679 +0000 UTC m=+10.726088985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.035763 master-0 kubenswrapper[4842]: E1204 21:58:06.035616 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21fe61406d6c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:00.41283518 +0000 UTC m=+10.727647395,LastTimestamp:2025-12-04 21:58:00.41283518 +0000 UTC m=+10.727647395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.042197 master-0 kubenswrapper[4842]: E1204 21:58:06.042075 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e21fe54b083eb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fe54b083eb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:00.202077163 +0000 UTC m=+10.516889388,LastTimestamp:2025-12-04 21:58:01.213050722 +0000 UTC m=+11.527862907,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.053091 master-0 kubenswrapper[4842]: E1204 21:58:06.052842 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21ff685875dc kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\" in 5.019s (5.019s including waiting). Image size: 499705918 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:04.826818012 +0000 UTC m=+15.141630197,LastTimestamp:2025-12-04 21:58:04.826818012 +0000 UTC m=+15.141630197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.058058 master-0 kubenswrapper[4842]: E1204 21:58:06.057881 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21ff6969991e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\" in 4.431s (4.431s including waiting). 
Image size: 509437356 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:04.844718366 +0000 UTC m=+15.159530551,LastTimestamp:2025-12-04 21:58:04.844718366 +0000 UTC m=+15.159530551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.063631 master-0 kubenswrapper[4842]: E1204 21:58:06.063432 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21ff746a9a06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:05.02933351 +0000 UTC m=+15.344145735,LastTimestamp:2025-12-04 21:58:05.02933351 +0000 UTC m=+15.344145735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.071688 master-0 kubenswrapper[4842]: E1204 21:58:06.071374 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21ff747565d7 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:05.030041047 +0000 UTC m=+15.344853252,LastTimestamp:2025-12-04 21:58:05.030041047 +0000 UTC m=+15.344853252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.077373 master-0 kubenswrapper[4842]: E1204 21:58:06.077208 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21ff753f3575 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:05.043266933 +0000 UTC m=+15.358079148,LastTimestamp:2025-12-04 21:58:05.043266933 +0000 UTC m=+15.358079148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.081864 master-0 kubenswrapper[4842]: E1204 21:58:06.081731 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e21ff754a14cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:05.043979471 +0000 UTC m=+15.358791686,LastTimestamp:2025-12-04 21:58:05.043979471 +0000 UTC m=+15.358791686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.086714 master-0 kubenswrapper[4842]: E1204 21:58:06.086546 4842 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21ff816518c0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:05.247076544 +0000 UTC m=+15.561888769,LastTimestamp:2025-12-04 21:58:05.247076544 +0000 UTC m=+15.561888769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.090909 master-0 kubenswrapper[4842]: E1204 21:58:06.090814 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.187e21fe3c4aaed1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21fe3c4aaed1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.792750289 +0000 UTC m=+10.107562484,LastTimestamp:2025-12-04 21:58:05.472877072 +0000 UTC m=+15.787689277,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.095356 master-0 
kubenswrapper[4842]: E1204 21:58:06.095191 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.187e21fe3d18e767\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e21fe3d18e767 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:59.806265191 +0000 UTC m=+10.121077386,LastTimestamp:2025-12-04 21:58:05.482653911 +0000 UTC m=+15.797466106,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:06.250156 master-0 kubenswrapper[4842]: I1204 21:58:06.250087 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:06.251119 master-0 kubenswrapper[4842]: I1204 21:58:06.250419 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b"} Dec 04 21:58:06.251119 master-0 kubenswrapper[4842]: I1204 21:58:06.250573 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:06.251570 master-0 kubenswrapper[4842]: I1204 21:58:06.251478 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:06.251656 master-0 kubenswrapper[4842]: I1204 21:58:06.251626 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:06.251721 master-0 kubenswrapper[4842]: I1204 21:58:06.251704 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:06.251980 master-0 kubenswrapper[4842]: I1204 21:58:06.251724 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:06.252072 master-0 kubenswrapper[4842]: I1204 21:58:06.252018 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:06.252140 master-0 kubenswrapper[4842]: I1204 21:58:06.252074 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:06.474778 master-0 kubenswrapper[4842]: W1204 21:58:06.474637 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Dec 04 21:58:06.474778 master-0 kubenswrapper[4842]: E1204 21:58:06.474730 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Dec 04 21:58:06.647147 master-0 kubenswrapper[4842]: I1204 21:58:06.647009 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:58:06.654154 master-0 kubenswrapper[4842]: I1204 21:58:06.654097 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:58:06.710261 master-0 kubenswrapper[4842]: I1204 21:58:06.710154 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:58:06.717727 master-0 kubenswrapper[4842]: I1204 21:58:06.717610 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:58:06.909411 master-0 kubenswrapper[4842]: W1204 21:58:06.909266 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Dec 04 21:58:06.909411 master-0 kubenswrapper[4842]: E1204 21:58:06.909353 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 04 21:58:06.950283 master-0 kubenswrapper[4842]: I1204 21:58:06.950177 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:07.253404 master-0 kubenswrapper[4842]: I1204 21:58:07.253211 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:07.253404 master-0 kubenswrapper[4842]: I1204 21:58:07.253259 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:58:07.253404 master-0 kubenswrapper[4842]: I1204 21:58:07.253212 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:07.254073 master-0 kubenswrapper[4842]: I1204 21:58:07.253579 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:58:07.254861 master-0 kubenswrapper[4842]: I1204 21:58:07.254814 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:07.254861 master-0 kubenswrapper[4842]: I1204 21:58:07.254860 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:07.254945 master-0 kubenswrapper[4842]: I1204 21:58:07.254875 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:07.255111 master-0 kubenswrapper[4842]: I1204 21:58:07.255029 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:07.255402 master-0 kubenswrapper[4842]: I1204 21:58:07.255319 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Dec 04 21:58:07.255402 master-0 kubenswrapper[4842]: I1204 21:58:07.255368 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:07.260604 master-0 kubenswrapper[4842]: I1204 21:58:07.260570 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 21:58:07.951146 master-0 kubenswrapper[4842]: I1204 21:58:07.951058 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:08.209126 master-0 kubenswrapper[4842]: W1204 21:58:08.208961 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:08.209126 master-0 kubenswrapper[4842]: E1204 21:58:08.209077 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Dec 04 21:58:08.255263 master-0 kubenswrapper[4842]: I1204 21:58:08.255202 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:08.256030 master-0 kubenswrapper[4842]: I1204 21:58:08.255339 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:08.256422 master-0 kubenswrapper[4842]: I1204 21:58:08.256390 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:08.256470 master-0 kubenswrapper[4842]: I1204 21:58:08.256424 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:08.256470 master-0 kubenswrapper[4842]: I1204 21:58:08.256434 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:08.257329 master-0 kubenswrapper[4842]: I1204 21:58:08.257281 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:08.257388 master-0 kubenswrapper[4842]: I1204 21:58:08.257336 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:08.257388 master-0 kubenswrapper[4842]: I1204 21:58:08.257350 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:08.951766 master-0 kubenswrapper[4842]: I1204 21:58:08.951679 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:09.258472 master-0 kubenswrapper[4842]: I1204 21:58:09.258267 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:09.259803 master-0 kubenswrapper[4842]: I1204 21:58:09.259741 4842 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:09.259879 master-0 kubenswrapper[4842]: I1204 21:58:09.259826 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:09.259879 master-0 kubenswrapper[4842]: I1204 21:58:09.259851 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:09.381888 master-0 kubenswrapper[4842]: I1204 21:58:09.381794 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:58:09.382165 master-0 kubenswrapper[4842]: I1204 21:58:09.382075 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:09.383630 master-0 kubenswrapper[4842]: I1204 21:58:09.383551 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:09.383630 master-0 kubenswrapper[4842]: I1204 21:58:09.383627 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:09.383895 master-0 kubenswrapper[4842]: I1204 21:58:09.383647 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:09.389330 master-0 kubenswrapper[4842]: I1204 21:58:09.389278 4842 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:58:09.581859 master-0 kubenswrapper[4842]: E1204 21:58:09.581762 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 04 21:58:09.755414 master-0 kubenswrapper[4842]: W1204 21:58:09.755330 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Dec 04 21:58:09.755414 master-0 kubenswrapper[4842]: E1204 21:58:09.755414 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Dec 04 21:58:09.818084 master-0 kubenswrapper[4842]: I1204 21:58:09.817985 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:09.819881 master-0 kubenswrapper[4842]: I1204 21:58:09.819818 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:09.819968 master-0 kubenswrapper[4842]: I1204 21:58:09.819911 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:09.819968 master-0 kubenswrapper[4842]: I1204 21:58:09.819942 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:09.820047 master-0 kubenswrapper[4842]: I1204 
21:58:09.820033 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:58:09.828806 master-0 kubenswrapper[4842]: E1204 21:58:09.828738 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 04 21:58:09.951828 master-0 kubenswrapper[4842]: I1204 21:58:09.951634 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:10.090041 master-0 kubenswrapper[4842]: E1204 21:58:10.089875 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 04 21:58:10.260543 master-0 kubenswrapper[4842]: I1204 21:58:10.260351 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:10.260543 master-0 kubenswrapper[4842]: I1204 21:58:10.260419 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:58:10.262818 master-0 kubenswrapper[4842]: I1204 21:58:10.261743 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:10.262818 master-0 kubenswrapper[4842]: I1204 21:58:10.261815 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:10.262818 master-0 kubenswrapper[4842]: I1204 21:58:10.261829 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:10.950032 master-0 kubenswrapper[4842]: I1204 21:58:10.949839 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:11.262929 master-0 kubenswrapper[4842]: I1204 21:58:11.262764 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:11.264136 master-0 kubenswrapper[4842]: I1204 21:58:11.264075 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:11.264204 master-0 kubenswrapper[4842]: I1204 21:58:11.264143 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:11.264204 master-0 kubenswrapper[4842]: I1204 21:58:11.264163 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:11.954208 master-0 kubenswrapper[4842]: I1204 21:58:11.953995 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:12.005033 master-0 kubenswrapper[4842]: I1204 21:58:12.004960 4842 csr.go:261] certificate signing request csr-kqfrh is approved, waiting to be issued Dec 04 21:58:12.950698 master-0 kubenswrapper[4842]: I1204 21:58:12.950584 4842 csi_plugin.go:884] Failed to contact API server when waiting 
for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:13.145187 master-0 kubenswrapper[4842]: I1204 21:58:13.145070 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:13.146854 master-0 kubenswrapper[4842]: I1204 21:58:13.146760 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:13.146854 master-0 kubenswrapper[4842]: I1204 21:58:13.146852 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:13.147092 master-0 kubenswrapper[4842]: I1204 21:58:13.146877 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:13.147684 master-0 kubenswrapper[4842]: I1204 21:58:13.147638 4842 scope.go:117] "RemoveContainer" containerID="717ada19e94e366402ae8b80d88baf330e3c481008a69b23e54f05dc45212171" Dec 04 21:58:13.158549 master-0 kubenswrapper[4842]: E1204 21:58:13.158349 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e21fd28fa61b9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd28fa61b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.173753273 +0000 UTC m=+5.488565458,LastTimestamp:2025-12-04 21:58:13.151679271 +0000 UTC m=+23.466491496,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:13.435782 master-0 kubenswrapper[4842]: E1204 21:58:13.435541 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e21fd34156f93\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd34156f93 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.360075667 +0000 UTC m=+5.674887852,LastTimestamp:2025-12-04 21:58:13.426229136 +0000 UTC m=+23.741041351,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 
21:58:13.451280 master-0 kubenswrapper[4842]: E1204 21:58:13.451110 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e21fd34be7db3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fd34be7db3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:57:55.371154867 +0000 UTC m=+5.685967062,LastTimestamp:2025-12-04 21:58:13.443255138 +0000 UTC m=+23.758067363,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:13.950814 master-0 kubenswrapper[4842]: I1204 21:58:13.950733 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:14.276732 master-0 kubenswrapper[4842]: I1204 21:58:14.276638 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 04 21:58:14.277407 master-0 kubenswrapper[4842]: I1204 21:58:14.277347 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/1.log" Dec 04 21:58:14.278571 master-0 kubenswrapper[4842]: I1204 21:58:14.278452 4842 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d" exitCode=1 Dec 04 21:58:14.278713 master-0 kubenswrapper[4842]: I1204 21:58:14.278552 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d"} Dec 04 21:58:14.278713 master-0 kubenswrapper[4842]: I1204 21:58:14.278670 4842 scope.go:117] "RemoveContainer" containerID="717ada19e94e366402ae8b80d88baf330e3c481008a69b23e54f05dc45212171" Dec 04 21:58:14.278957 master-0 kubenswrapper[4842]: I1204 21:58:14.278895 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:14.281461 master-0 kubenswrapper[4842]: I1204 21:58:14.281287 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:14.281461 master-0 kubenswrapper[4842]: I1204 21:58:14.281358 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:14.281461 master-0 kubenswrapper[4842]: I1204 21:58:14.281384 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 
21:58:14.282199 master-0 kubenswrapper[4842]: I1204 21:58:14.282157 4842 scope.go:117] "RemoveContainer" containerID="b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d" Dec 04 21:58:14.282479 master-0 kubenswrapper[4842]: E1204 21:58:14.282435 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="3169f44496ed8a28c6d6a15511ab0eec" Dec 04 21:58:14.291292 master-0 kubenswrapper[4842]: E1204 21:58:14.291123 4842 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e21fe54b083eb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e21fe54b083eb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 21:58:00.202077163 +0000 UTC m=+10.516889388,LastTimestamp:2025-12-04 21:58:14.282381747 +0000 UTC m=+24.597193962,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 21:58:14.948673 master-0 kubenswrapper[4842]: I1204 21:58:14.948590 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:15.283957 master-0 kubenswrapper[4842]: I1204 21:58:15.283877 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 04 21:58:15.952146 master-0 kubenswrapper[4842]: I1204 21:58:15.952041 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:16.589982 master-0 kubenswrapper[4842]: E1204 21:58:16.589871 4842 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 04 21:58:16.829984 master-0 kubenswrapper[4842]: I1204 21:58:16.829910 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:16.831278 master-0 kubenswrapper[4842]: I1204 21:58:16.831244 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Dec 04 21:58:16.831355 master-0 kubenswrapper[4842]: I1204 21:58:16.831284 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:16.831355 master-0 kubenswrapper[4842]: I1204 21:58:16.831301 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:16.831428 master-0 kubenswrapper[4842]: I1204 21:58:16.831359 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:58:16.836209 master-0 kubenswrapper[4842]: E1204 21:58:16.836166 4842 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 04 21:58:16.951073 master-0 kubenswrapper[4842]: I1204 21:58:16.950881 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:17.951568 master-0 kubenswrapper[4842]: I1204 21:58:17.951437 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:18.950424 master-0 kubenswrapper[4842]: I1204 21:58:18.950288 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:20.963420 master-0 kubenswrapper[4842]: E1204 21:58:20.963208 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 04 21:58:20.971720 master-0 kubenswrapper[4842]: W1204 21:58:20.971654 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Dec 04 21:58:20.971720 master-0 kubenswrapper[4842]: E1204 21:58:20.971729 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 04 21:58:20.972627 master-0 kubenswrapper[4842]: I1204 21:58:20.972537 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:20.972870 master-0 kubenswrapper[4842]: I1204 21:58:20.972798 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:58:20.973105 master-0 kubenswrapper[4842]: I1204 21:58:20.973050 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:20.974794 master-0 kubenswrapper[4842]: I1204 21:58:20.974739 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Dec 04 21:58:20.974918 master-0 kubenswrapper[4842]: I1204 21:58:20.974801 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:20.975213 master-0 kubenswrapper[4842]: I1204 21:58:20.975163 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:20.977725 master-0 kubenswrapper[4842]: I1204 21:58:20.977658 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 21:58:21.945193 master-0 kubenswrapper[4842]: I1204 21:58:21.945055 4842 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 04 21:58:21.953627 master-0 kubenswrapper[4842]: W1204 21:58:21.953555 4842 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Dec 04 21:58:21.953627 master-0 kubenswrapper[4842]: E1204 21:58:21.953619 4842 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 04 21:58:21.970112 master-0 kubenswrapper[4842]: I1204 21:58:21.970058 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:21.971424 master-0 kubenswrapper[4842]: I1204 21:58:21.971345 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:21.971424 master-0 kubenswrapper[4842]: I1204 21:58:21.971411 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:21.971424 master-0 kubenswrapper[4842]: I1204 21:58:21.971425 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:22.251058 master-0 kubenswrapper[4842]: I1204 21:58:22.250069 4842 csr.go:257] certificate signing request csr-kqfrh is issued Dec 04 21:58:22.853708 master-0 kubenswrapper[4842]: I1204 21:58:22.853622 4842 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 04 21:58:22.956309 master-0 kubenswrapper[4842]: I1204 21:58:22.956189 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:22.975183 master-0 kubenswrapper[4842]: I1204 21:58:22.975101 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.037062 master-0 kubenswrapper[4842]: I1204 21:58:23.036944 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.252055 master-0 kubenswrapper[4842]: I1204 21:58:23.251893 4842 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-05 21:50:08 +0000 UTC, rotation deadline is 2025-12-05 15:27:48.785794859 +0000 UTC Dec 04 21:58:23.252055 master-0 kubenswrapper[4842]: I1204 21:58:23.251973 4842 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h29m25.533830277s for next certificate rotation Dec 04 21:58:23.297407 master-0 kubenswrapper[4842]: I1204 21:58:23.297319 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.297407 master-0 kubenswrapper[4842]: E1204 21:58:23.297395 4842 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 04 21:58:23.325089 master-0 kubenswrapper[4842]: I1204 21:58:23.325040 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.340903 master-0 kubenswrapper[4842]: I1204 21:58:23.340841 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.403389 master-0 kubenswrapper[4842]: I1204 21:58:23.403321 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.596078 master-0 kubenswrapper[4842]: E1204 21:58:23.595978 4842 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Dec 04 21:58:23.664314 master-0 kubenswrapper[4842]: I1204 21:58:23.664246 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.664314 master-0 kubenswrapper[4842]: E1204 21:58:23.664288 4842 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 04 21:58:23.767475 master-0 kubenswrapper[4842]: I1204 21:58:23.767375 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.782023 master-0 kubenswrapper[4842]: I1204 21:58:23.781950 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.836462 master-0 kubenswrapper[4842]: I1204 21:58:23.836371 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:23.838421 master-0 kubenswrapper[4842]: I1204 21:58:23.838198 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:23.838421 master-0 kubenswrapper[4842]: I1204 21:58:23.838273 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:23.838421 master-0 kubenswrapper[4842]: I1204 21:58:23.838292 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:23.838421 master-0 kubenswrapper[4842]: I1204 21:58:23.838390 4842 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 21:58:23.839433 master-0 kubenswrapper[4842]: I1204 21:58:23.839342 4842 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 04 21:58:23.847739 master-0 kubenswrapper[4842]: I1204 21:58:23.847559 4842 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 04 21:58:23.847739 master-0 kubenswrapper[4842]: E1204 21:58:23.847632 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 04 21:58:23.860287 master-0 kubenswrapper[4842]: E1204 21:58:23.860184 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:23.960561 master-0 
kubenswrapper[4842]: E1204 21:58:23.960465 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:23.964794 master-0 kubenswrapper[4842]: I1204 21:58:23.964737 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Dec 04 21:58:23.977563 master-0 kubenswrapper[4842]: I1204 21:58:23.977494 4842 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 04 21:58:24.060904 master-0 kubenswrapper[4842]: E1204 21:58:24.060787 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.161731 master-0 kubenswrapper[4842]: E1204 21:58:24.161607 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.262530 master-0 kubenswrapper[4842]: E1204 21:58:24.262407 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.363670 master-0 kubenswrapper[4842]: E1204 21:58:24.363566 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.464552 master-0 kubenswrapper[4842]: E1204 21:58:24.464379 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.565008 master-0 kubenswrapper[4842]: E1204 21:58:24.564902 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.665884 master-0 kubenswrapper[4842]: E1204 21:58:24.665789 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.766397 master-0 kubenswrapper[4842]: E1204 21:58:24.766327 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.867533 master-0 kubenswrapper[4842]: E1204 21:58:24.867399 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:24.968229 master-0 kubenswrapper[4842]: E1204 21:58:24.968153 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.068820 master-0 kubenswrapper[4842]: E1204 21:58:25.068663 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.168947 master-0 kubenswrapper[4842]: E1204 21:58:25.168865 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.269454 master-0 kubenswrapper[4842]: E1204 21:58:25.269367 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.370492 master-0 kubenswrapper[4842]: E1204 21:58:25.370314 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.471475 master-0 kubenswrapper[4842]: E1204 21:58:25.471336 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.572194 master-0 kubenswrapper[4842]: E1204 21:58:25.572082 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.673220 master-0 
kubenswrapper[4842]: E1204 21:58:25.673061 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.773537 master-0 kubenswrapper[4842]: E1204 21:58:25.773457 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.874728 master-0 kubenswrapper[4842]: E1204 21:58:25.874621 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:25.975720 master-0 kubenswrapper[4842]: E1204 21:58:25.975485 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.076553 master-0 kubenswrapper[4842]: E1204 21:58:26.076397 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.176766 master-0 kubenswrapper[4842]: E1204 21:58:26.176639 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.277699 master-0 kubenswrapper[4842]: E1204 21:58:26.277560 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.378714 master-0 kubenswrapper[4842]: E1204 21:58:26.378608 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.479840 master-0 kubenswrapper[4842]: E1204 21:58:26.479723 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.580991 master-0 kubenswrapper[4842]: E1204 21:58:26.580819 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.682002 master-0 kubenswrapper[4842]: E1204 21:58:26.681925 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.782136 master-0 kubenswrapper[4842]: E1204 21:58:26.782084 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.882701 master-0 kubenswrapper[4842]: E1204 21:58:26.882570 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:26.982960 master-0 kubenswrapper[4842]: E1204 21:58:26.982871 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.083637 master-0 kubenswrapper[4842]: E1204 21:58:27.083479 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.184488 master-0 kubenswrapper[4842]: E1204 21:58:27.184254 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.285470 master-0 kubenswrapper[4842]: E1204 21:58:27.285360 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.385873 master-0 kubenswrapper[4842]: E1204 21:58:27.385760 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.486529 master-0 kubenswrapper[4842]: E1204 21:58:27.486292 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.587141 master-0 
kubenswrapper[4842]: E1204 21:58:27.587027 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.688307 master-0 kubenswrapper[4842]: E1204 21:58:27.688194 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.789358 master-0 kubenswrapper[4842]: E1204 21:58:27.789253 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.889863 master-0 kubenswrapper[4842]: E1204 21:58:27.889736 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:27.990632 master-0 kubenswrapper[4842]: E1204 21:58:27.990496 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.091461 master-0 kubenswrapper[4842]: E1204 21:58:28.091278 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.145117 master-0 kubenswrapper[4842]: I1204 21:58:28.145015 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:28.146780 master-0 kubenswrapper[4842]: I1204 21:58:28.146730 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:28.146903 master-0 kubenswrapper[4842]: I1204 21:58:28.146787 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:28.146903 master-0 kubenswrapper[4842]: I1204 21:58:28.146806 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:28.147341 master-0 kubenswrapper[4842]: I1204 21:58:28.147298 4842 scope.go:117] "RemoveContainer" containerID="b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d" Dec 04 21:58:28.147653 master-0 kubenswrapper[4842]: E1204 21:58:28.147601 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="3169f44496ed8a28c6d6a15511ab0eec" Dec 04 21:58:28.191865 master-0 kubenswrapper[4842]: E1204 21:58:28.191772 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.292994 master-0 kubenswrapper[4842]: E1204 21:58:28.292897 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.393584 master-0 kubenswrapper[4842]: E1204 21:58:28.393323 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.494241 master-0 kubenswrapper[4842]: E1204 21:58:28.494121 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.594453 master-0 kubenswrapper[4842]: E1204 21:58:28.594292 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.695593 master-0 kubenswrapper[4842]: E1204 21:58:28.695283 4842 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.796373 master-0 kubenswrapper[4842]: E1204 21:58:28.796233 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.896689 master-0 kubenswrapper[4842]: E1204 21:58:28.896601 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:28.997217 master-0 kubenswrapper[4842]: E1204 21:58:28.997009 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.097334 master-0 kubenswrapper[4842]: E1204 21:58:29.097185 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.198491 master-0 kubenswrapper[4842]: E1204 21:58:29.198376 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.299412 master-0 kubenswrapper[4842]: E1204 21:58:29.299299 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.400199 master-0 kubenswrapper[4842]: E1204 21:58:29.400100 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.500637 master-0 kubenswrapper[4842]: E1204 21:58:29.500473 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.601474 master-0 kubenswrapper[4842]: E1204 21:58:29.601284 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.702401 master-0 kubenswrapper[4842]: E1204 21:58:29.702294 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.803050 master-0 kubenswrapper[4842]: E1204 21:58:29.802919 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:29.903382 master-0 kubenswrapper[4842]: E1204 21:58:29.903170 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.003617 master-0 kubenswrapper[4842]: E1204 21:58:30.003497 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.104560 master-0 kubenswrapper[4842]: E1204 21:58:30.104451 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.205551 master-0 kubenswrapper[4842]: E1204 21:58:30.205293 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.305887 master-0 kubenswrapper[4842]: E1204 21:58:30.305771 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.381027 master-0 kubenswrapper[4842]: I1204 21:58:30.380928 4842 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 21:58:30.406797 master-0 kubenswrapper[4842]: E1204 21:58:30.406698 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.507344 master-0 kubenswrapper[4842]: E1204 21:58:30.507146 4842 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.608330 master-0 kubenswrapper[4842]: E1204 21:58:30.608225 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.709237 master-0 kubenswrapper[4842]: E1204 21:58:30.709109 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.809743 master-0 kubenswrapper[4842]: E1204 21:58:30.809627 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.910739 master-0 kubenswrapper[4842]: E1204 21:58:30.910649 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:30.964173 master-0 kubenswrapper[4842]: E1204 21:58:30.964089 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 04 21:58:31.010931 master-0 kubenswrapper[4842]: E1204 21:58:31.010838 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.111893 master-0 kubenswrapper[4842]: E1204 21:58:31.111693 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.212033 master-0 kubenswrapper[4842]: E1204 21:58:31.211873 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.313102 master-0 kubenswrapper[4842]: E1204 21:58:31.312951 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.414157 master-0 kubenswrapper[4842]: E1204 21:58:31.413971 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.515195 master-0 kubenswrapper[4842]: E1204 21:58:31.515076 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.615543 master-0 kubenswrapper[4842]: E1204 21:58:31.615421 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.716750 master-0 kubenswrapper[4842]: E1204 21:58:31.716492 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.817621 master-0 kubenswrapper[4842]: E1204 21:58:31.817427 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:31.917844 master-0 kubenswrapper[4842]: E1204 21:58:31.917694 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.018243 master-0 kubenswrapper[4842]: E1204 21:58:32.018157 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.118702 master-0 kubenswrapper[4842]: E1204 21:58:32.118577 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.219655 master-0 kubenswrapper[4842]: E1204 21:58:32.219481 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.264609 master-0 kubenswrapper[4842]: 
I1204 21:58:32.264483 4842 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 21:58:32.320404 master-0 kubenswrapper[4842]: E1204 21:58:32.320210 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.421368 master-0 kubenswrapper[4842]: E1204 21:58:32.421298 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.522953 master-0 kubenswrapper[4842]: E1204 21:58:32.522860 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.623248 master-0 kubenswrapper[4842]: E1204 21:58:32.623021 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.723648 master-0 kubenswrapper[4842]: E1204 21:58:32.723570 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.824793 master-0 kubenswrapper[4842]: E1204 21:58:32.824688 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:32.925798 master-0 kubenswrapper[4842]: E1204 21:58:32.925618 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.026734 master-0 kubenswrapper[4842]: E1204 21:58:33.026642 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.127603 master-0 kubenswrapper[4842]: E1204 21:58:33.127481 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.228693 master-0 kubenswrapper[4842]: E1204 21:58:33.228495 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.328806 master-0 kubenswrapper[4842]: E1204 21:58:33.328658 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.429687 master-0 kubenswrapper[4842]: E1204 21:58:33.429604 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.530666 master-0 kubenswrapper[4842]: E1204 21:58:33.530564 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.630274 master-0 kubenswrapper[4842]: I1204 21:58:33.630185 4842 csr.go:261] certificate signing request csr-gbpqx is approved, waiting to be issued Dec 04 21:58:33.631402 master-0 kubenswrapper[4842]: E1204 21:58:33.631324 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.638776 master-0 kubenswrapper[4842]: I1204 21:58:33.638704 4842 csr.go:257] certificate signing request csr-gbpqx is issued Dec 04 21:58:33.732288 master-0 kubenswrapper[4842]: E1204 21:58:33.732206 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.833291 master-0 kubenswrapper[4842]: E1204 21:58:33.833166 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:33.934411 master-0 kubenswrapper[4842]: E1204 21:58:33.934303 4842 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.035385 master-0 kubenswrapper[4842]: E1204 21:58:34.035286 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.136259 master-0 kubenswrapper[4842]: E1204 21:58:34.136042 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.167765 master-0 kubenswrapper[4842]: E1204 21:58:34.167687 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 04 21:58:34.236785 master-0 kubenswrapper[4842]: E1204 21:58:34.236669 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.337635 master-0 kubenswrapper[4842]: E1204 21:58:34.337526 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.438641 master-0 kubenswrapper[4842]: E1204 21:58:34.438402 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.539023 master-0 kubenswrapper[4842]: E1204 21:58:34.538821 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.639244 master-0 kubenswrapper[4842]: E1204 21:58:34.639093 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.641462 master-0 kubenswrapper[4842]: I1204 21:58:34.641390 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-05 21:50:08 +0000 UTC, rotation deadline is 2025-12-05 18:10:31.630260134 +0000 UTC Dec 04 21:58:34.641462 master-0 kubenswrapper[4842]: I1204 21:58:34.641433 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h11m56.988832543s for next certificate rotation Dec 04 21:58:34.740640 master-0 kubenswrapper[4842]: E1204 21:58:34.740335 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.841793 master-0 kubenswrapper[4842]: E1204 21:58:34.841604 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:34.942257 master-0 kubenswrapper[4842]: E1204 21:58:34.942106 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.042712 master-0 kubenswrapper[4842]: E1204 21:58:35.042545 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.143817 master-0 kubenswrapper[4842]: E1204 21:58:35.143696 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.244918 master-0 kubenswrapper[4842]: E1204 21:58:35.244776 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.346120 master-0 kubenswrapper[4842]: E1204 21:58:35.345939 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.446227 master-0 kubenswrapper[4842]: E1204 21:58:35.446129 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 
21:58:35.547199 master-0 kubenswrapper[4842]: E1204 21:58:35.547100 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.642206 master-0 kubenswrapper[4842]: I1204 21:58:35.642003 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-05 21:50:08 +0000 UTC, rotation deadline is 2025-12-05 14:52:59.843369544 +0000 UTC Dec 04 21:58:35.642206 master-0 kubenswrapper[4842]: I1204 21:58:35.642073 4842 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 16h54m24.201302766s for next certificate rotation Dec 04 21:58:35.647318 master-0 kubenswrapper[4842]: E1204 21:58:35.647247 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.748396 master-0 kubenswrapper[4842]: E1204 21:58:35.748290 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.848654 master-0 kubenswrapper[4842]: E1204 21:58:35.848566 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:35.949176 master-0 kubenswrapper[4842]: E1204 21:58:35.948956 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.049696 master-0 kubenswrapper[4842]: E1204 21:58:36.049589 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.149905 master-0 kubenswrapper[4842]: E1204 21:58:36.149800 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.250949 master-0 kubenswrapper[4842]: E1204 21:58:36.250715 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.351453 master-0 kubenswrapper[4842]: E1204 21:58:36.351355 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.451671 master-0 kubenswrapper[4842]: E1204 21:58:36.451551 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.552547 master-0 kubenswrapper[4842]: E1204 21:58:36.552427 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.653641 master-0 kubenswrapper[4842]: E1204 21:58:36.653584 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.754849 master-0 kubenswrapper[4842]: E1204 21:58:36.754744 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.855856 master-0 kubenswrapper[4842]: E1204 21:58:36.855673 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:36.956143 master-0 kubenswrapper[4842]: E1204 21:58:36.956016 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.056891 master-0 kubenswrapper[4842]: E1204 21:58:37.056808 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.158096 master-0 kubenswrapper[4842]: E1204 21:58:37.157917 4842 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.258865 master-0 kubenswrapper[4842]: E1204 21:58:37.258763 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.359425 master-0 kubenswrapper[4842]: E1204 21:58:37.359317 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.460164 master-0 kubenswrapper[4842]: E1204 21:58:37.459947 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.560808 master-0 kubenswrapper[4842]: E1204 21:58:37.560716 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.661007 master-0 kubenswrapper[4842]: E1204 21:58:37.660911 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.761329 master-0 kubenswrapper[4842]: E1204 21:58:37.761104 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.862281 master-0 kubenswrapper[4842]: E1204 21:58:37.862177 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:37.962673 master-0 kubenswrapper[4842]: E1204 21:58:37.962573 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.063659 master-0 kubenswrapper[4842]: E1204 21:58:38.063557 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.164825 master-0 kubenswrapper[4842]: E1204 21:58:38.164687 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.265493 master-0 kubenswrapper[4842]: E1204 21:58:38.265378 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.366689 master-0 kubenswrapper[4842]: E1204 21:58:38.366369 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.467454 master-0 kubenswrapper[4842]: E1204 21:58:38.467343 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.568229 master-0 kubenswrapper[4842]: E1204 21:58:38.568143 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.669396 master-0 kubenswrapper[4842]: E1204 21:58:38.669072 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.770368 master-0 kubenswrapper[4842]: E1204 21:58:38.770236 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.871451 master-0 kubenswrapper[4842]: E1204 21:58:38.871361 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:38.972192 master-0 kubenswrapper[4842]: E1204 21:58:38.971986 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.072690 master-0 kubenswrapper[4842]: E1204 21:58:39.072587 4842 kubelet_node_status.go:503] "Error 
getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.144365 master-0 kubenswrapper[4842]: I1204 21:58:39.144252 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:39.145949 master-0 kubenswrapper[4842]: I1204 21:58:39.145874 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:39.145949 master-0 kubenswrapper[4842]: I1204 21:58:39.145944 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:39.146210 master-0 kubenswrapper[4842]: I1204 21:58:39.145969 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:39.146675 master-0 kubenswrapper[4842]: I1204 21:58:39.146622 4842 scope.go:117] "RemoveContainer" containerID="b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d" Dec 04 21:58:39.172930 master-0 kubenswrapper[4842]: E1204 21:58:39.172870 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.273156 master-0 kubenswrapper[4842]: E1204 21:58:39.273084 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.374139 master-0 kubenswrapper[4842]: E1204 21:58:39.374038 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.474740 master-0 kubenswrapper[4842]: E1204 21:58:39.474659 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.575729 master-0 kubenswrapper[4842]: E1204 21:58:39.575574 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.676171 master-0 kubenswrapper[4842]: E1204 21:58:39.676079 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.776369 master-0 kubenswrapper[4842]: E1204 21:58:39.776266 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.877365 master-0 kubenswrapper[4842]: E1204 21:58:39.877146 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:39.978368 master-0 kubenswrapper[4842]: E1204 21:58:39.978265 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.022856 master-0 kubenswrapper[4842]: I1204 21:58:40.022782 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 04 21:58:40.023428 master-0 kubenswrapper[4842]: I1204 21:58:40.023364 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"587901d613877303166a73aefe83b729a828ee57d294468839ecb48ee62967aa"} Dec 04 21:58:40.023648 master-0 kubenswrapper[4842]: I1204 21:58:40.023614 4842 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 21:58:40.024784 master-0 kubenswrapper[4842]: I1204 21:58:40.024738 
4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 21:58:40.024846 master-0 kubenswrapper[4842]: I1204 21:58:40.024788 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 21:58:40.024846 master-0 kubenswrapper[4842]: I1204 21:58:40.024806 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 21:58:40.078987 master-0 kubenswrapper[4842]: E1204 21:58:40.078872 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.180306 master-0 kubenswrapper[4842]: E1204 21:58:40.180107 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.281575 master-0 kubenswrapper[4842]: E1204 21:58:40.280918 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.381425 master-0 kubenswrapper[4842]: E1204 21:58:40.381336 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.482654 master-0 kubenswrapper[4842]: E1204 21:58:40.482382 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.583372 master-0 kubenswrapper[4842]: E1204 21:58:40.583235 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.684483 master-0 kubenswrapper[4842]: E1204 21:58:40.684364 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.785319 master-0 kubenswrapper[4842]: E1204 21:58:40.785219 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.886431 master-0 kubenswrapper[4842]: E1204 21:58:40.886328 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:40.964936 master-0 kubenswrapper[4842]: E1204 21:58:40.964822 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 04 21:58:40.987304 master-0 kubenswrapper[4842]: E1204 21:58:40.987218 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.088150 master-0 kubenswrapper[4842]: E1204 21:58:41.087930 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.189218 master-0 kubenswrapper[4842]: E1204 21:58:41.189105 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.290296 master-0 kubenswrapper[4842]: E1204 21:58:41.290187 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.390758 master-0 kubenswrapper[4842]: E1204 21:58:41.390428 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.491712 master-0 kubenswrapper[4842]: E1204 21:58:41.491605 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.592352 master-0 
kubenswrapper[4842]: E1204 21:58:41.592287 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.693603 master-0 kubenswrapper[4842]: E1204 21:58:41.693366 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.793970 master-0 kubenswrapper[4842]: E1204 21:58:41.793879 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.894269 master-0 kubenswrapper[4842]: E1204 21:58:41.894180 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:41.995204 master-0 kubenswrapper[4842]: E1204 21:58:41.995025 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.096145 master-0 kubenswrapper[4842]: E1204 21:58:42.096036 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.197235 master-0 kubenswrapper[4842]: E1204 21:58:42.197141 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.297973 master-0 kubenswrapper[4842]: E1204 21:58:42.297874 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.399044 master-0 kubenswrapper[4842]: E1204 21:58:42.398938 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.500248 master-0 kubenswrapper[4842]: E1204 21:58:42.500151 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.601281 master-0 kubenswrapper[4842]: E1204 21:58:42.601064 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.701553 master-0 kubenswrapper[4842]: E1204 21:58:42.701441 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.802264 master-0 kubenswrapper[4842]: E1204 21:58:42.802163 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:42.902854 master-0 kubenswrapper[4842]: E1204 21:58:42.902609 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.003444 master-0 kubenswrapper[4842]: E1204 21:58:43.003353 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.104469 master-0 kubenswrapper[4842]: E1204 21:58:43.104378 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.205727 master-0 kubenswrapper[4842]: E1204 21:58:43.205532 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.306735 master-0 kubenswrapper[4842]: E1204 21:58:43.306592 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.407731 master-0 kubenswrapper[4842]: E1204 21:58:43.407595 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.508892 master-0 
kubenswrapper[4842]: E1204 21:58:43.508709 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.609555 master-0 kubenswrapper[4842]: E1204 21:58:43.609391 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.710695 master-0 kubenswrapper[4842]: E1204 21:58:43.710565 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.811458 master-0 kubenswrapper[4842]: E1204 21:58:43.811356 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:43.912358 master-0 kubenswrapper[4842]: E1204 21:58:43.912272 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.012908 master-0 kubenswrapper[4842]: E1204 21:58:44.012800 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.114188 master-0 kubenswrapper[4842]: E1204 21:58:44.114006 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.215064 master-0 kubenswrapper[4842]: E1204 21:58:44.214993 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.315674 master-0 kubenswrapper[4842]: E1204 21:58:44.315573 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.415922 master-0 kubenswrapper[4842]: E1204 21:58:44.415709 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.516780 master-0 kubenswrapper[4842]: E1204 21:58:44.516681 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.560668 master-0 kubenswrapper[4842]: E1204 21:58:44.560570 4842 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 04 21:58:44.617868 master-0 kubenswrapper[4842]: E1204 21:58:44.617790 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.718889 master-0 kubenswrapper[4842]: E1204 21:58:44.718722 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.819612 master-0 kubenswrapper[4842]: E1204 21:58:44.819531 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:44.920808 master-0 kubenswrapper[4842]: E1204 21:58:44.920699 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.021526 master-0 kubenswrapper[4842]: E1204 21:58:45.021398 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.122565 master-0 kubenswrapper[4842]: E1204 21:58:45.122433 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.223708 master-0 kubenswrapper[4842]: E1204 21:58:45.223574 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 
04 21:58:45.324710 master-0 kubenswrapper[4842]: E1204 21:58:45.324375 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.425703 master-0 kubenswrapper[4842]: E1204 21:58:45.425582 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.526786 master-0 kubenswrapper[4842]: E1204 21:58:45.526612 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.627696 master-0 kubenswrapper[4842]: E1204 21:58:45.627321 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.728474 master-0 kubenswrapper[4842]: E1204 21:58:45.728321 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.829392 master-0 kubenswrapper[4842]: E1204 21:58:45.829213 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:45.929666 master-0 kubenswrapper[4842]: E1204 21:58:45.929357 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.029698 master-0 kubenswrapper[4842]: E1204 21:58:46.029584 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.129812 master-0 kubenswrapper[4842]: E1204 21:58:46.129748 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.230980 master-0 kubenswrapper[4842]: E1204 21:58:46.230748 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.331472 master-0 kubenswrapper[4842]: E1204 21:58:46.331362 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.432295 master-0 kubenswrapper[4842]: E1204 21:58:46.432177 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.533075 master-0 kubenswrapper[4842]: E1204 21:58:46.532986 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.633457 master-0 kubenswrapper[4842]: E1204 21:58:46.633365 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.734167 master-0 kubenswrapper[4842]: E1204 21:58:46.734056 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.835179 master-0 kubenswrapper[4842]: E1204 21:58:46.834989 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:46.935531 master-0 kubenswrapper[4842]: E1204 21:58:46.935420 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.035927 master-0 kubenswrapper[4842]: E1204 21:58:47.035821 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.136597 master-0 kubenswrapper[4842]: E1204 21:58:47.136403 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 
04 21:58:47.236805 master-0 kubenswrapper[4842]: E1204 21:58:47.236710 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.337711 master-0 kubenswrapper[4842]: E1204 21:58:47.337594 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.438209 master-0 kubenswrapper[4842]: E1204 21:58:47.438044 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.539014 master-0 kubenswrapper[4842]: E1204 21:58:47.538909 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.639915 master-0 kubenswrapper[4842]: E1204 21:58:47.639780 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.741065 master-0 kubenswrapper[4842]: E1204 21:58:47.740842 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.841876 master-0 kubenswrapper[4842]: E1204 21:58:47.841760 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:47.942027 master-0 kubenswrapper[4842]: E1204 21:58:47.941911 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.042123 master-0 kubenswrapper[4842]: E1204 21:58:48.042051 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.142764 master-0 kubenswrapper[4842]: E1204 21:58:48.142663 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.242983 master-0 kubenswrapper[4842]: E1204 21:58:48.242866 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.343953 master-0 kubenswrapper[4842]: E1204 21:58:48.343718 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.444788 master-0 kubenswrapper[4842]: E1204 21:58:48.444668 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.545678 master-0 kubenswrapper[4842]: E1204 21:58:48.545572 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.646665 master-0 kubenswrapper[4842]: E1204 21:58:48.646450 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.747461 master-0 kubenswrapper[4842]: E1204 21:58:48.747342 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.848010 master-0 kubenswrapper[4842]: E1204 21:58:48.847898 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:48.948327 master-0 kubenswrapper[4842]: E1204 21:58:48.948122 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.048673 master-0 kubenswrapper[4842]: E1204 21:58:49.048593 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 
04 21:58:49.149795 master-0 kubenswrapper[4842]: E1204 21:58:49.149668 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.250629 master-0 kubenswrapper[4842]: E1204 21:58:49.250445 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.351621 master-0 kubenswrapper[4842]: E1204 21:58:49.351147 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.452070 master-0 kubenswrapper[4842]: E1204 21:58:49.451937 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.552973 master-0 kubenswrapper[4842]: E1204 21:58:49.552837 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.654004 master-0 kubenswrapper[4842]: E1204 21:58:49.653904 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.754847 master-0 kubenswrapper[4842]: E1204 21:58:49.754754 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.855275 master-0 kubenswrapper[4842]: E1204 21:58:49.855093 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:49.955525 master-0 kubenswrapper[4842]: E1204 21:58:49.955423 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.056555 master-0 kubenswrapper[4842]: E1204 21:58:50.056418 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.157050 master-0 kubenswrapper[4842]: E1204 21:58:50.156851 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.258001 master-0 kubenswrapper[4842]: E1204 21:58:50.257873 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.358926 master-0 kubenswrapper[4842]: E1204 21:58:50.358853 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.459644 master-0 kubenswrapper[4842]: E1204 21:58:50.459421 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.560349 master-0 kubenswrapper[4842]: E1204 21:58:50.560240 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.661186 master-0 kubenswrapper[4842]: E1204 21:58:50.661066 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.762299 master-0 kubenswrapper[4842]: E1204 21:58:50.762148 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.862572 master-0 kubenswrapper[4842]: E1204 21:58:50.862474 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:50.963461 master-0 kubenswrapper[4842]: E1204 21:58:50.963369 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 
04 21:58:50.965699 master-0 kubenswrapper[4842]: E1204 21:58:50.965647 4842 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 04 21:58:51.064426 master-0 kubenswrapper[4842]: E1204 21:58:51.064365 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.164822 master-0 kubenswrapper[4842]: E1204 21:58:51.164736 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.265025 master-0 kubenswrapper[4842]: E1204 21:58:51.264902 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.365657 master-0 kubenswrapper[4842]: E1204 21:58:51.365363 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.466749 master-0 kubenswrapper[4842]: E1204 21:58:51.466627 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.567545 master-0 kubenswrapper[4842]: E1204 21:58:51.567413 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.668807 master-0 kubenswrapper[4842]: E1204 21:58:51.668637 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.769434 master-0 kubenswrapper[4842]: E1204 21:58:51.769351 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.870011 master-0 kubenswrapper[4842]: E1204 21:58:51.869906 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:51.970833 master-0 kubenswrapper[4842]: E1204 21:58:51.970623 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.071044 master-0 kubenswrapper[4842]: E1204 21:58:52.070943 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.171426 master-0 kubenswrapper[4842]: E1204 21:58:52.171360 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.271854 master-0 kubenswrapper[4842]: E1204 21:58:52.271745 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.371994 master-0 kubenswrapper[4842]: E1204 21:58:52.371909 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.473152 master-0 kubenswrapper[4842]: E1204 21:58:52.473056 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.574150 master-0 kubenswrapper[4842]: E1204 21:58:52.573902 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.674824 master-0 kubenswrapper[4842]: E1204 21:58:52.674699 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.775715 master-0 kubenswrapper[4842]: E1204 21:58:52.775619 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Dec 04 21:58:52.876716 master-0 kubenswrapper[4842]: E1204 21:58:52.876527 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:52.977396 master-0 kubenswrapper[4842]: E1204 21:58:52.977307 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:53.078323 master-0 kubenswrapper[4842]: E1204 21:58:53.078240 4842 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 04 21:58:53.111453 master-0 kubenswrapper[4842]: I1204 21:58:53.111355 4842 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 21:58:53.981292 master-0 kubenswrapper[4842]: I1204 21:58:53.981188 4842 apiserver.go:52] "Watching apiserver" Dec 04 21:58:53.986108 master-0 kubenswrapper[4842]: I1204 21:58:53.986051 4842 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 21:58:53.986392 master-0 kubenswrapper[4842]: I1204 21:58:53.986327 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-79767b7ff9-8lq7w","assisted-installer/assisted-installer-controller-mxfnl","openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj"] Dec 04 21:58:53.986965 master-0 kubenswrapper[4842]: I1204 21:58:53.986849 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:53.986965 master-0 kubenswrapper[4842]: I1204 21:58:53.986936 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:53.987318 master-0 kubenswrapper[4842]: I1204 21:58:53.986855 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:53.989999 master-0 kubenswrapper[4842]: I1204 21:58:53.989962 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 21:58:53.991178 master-0 kubenswrapper[4842]: I1204 21:58:53.991110 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 21:58:53.991540 master-0 kubenswrapper[4842]: I1204 21:58:53.991468 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Dec 04 21:58:53.991763 master-0 kubenswrapper[4842]: I1204 21:58:53.991712 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Dec 04 21:58:53.992027 master-0 kubenswrapper[4842]: I1204 21:58:53.991946 4842 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Dec 04 21:58:53.992817 master-0 kubenswrapper[4842]: I1204 21:58:53.992787 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 21:58:53.992917 master-0 kubenswrapper[4842]: I1204 21:58:53.992796 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 21:58:53.993033 master-0 kubenswrapper[4842]: I1204 21:58:53.992956 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 21:58:53.993110 master-0 kubenswrapper[4842]: I1204 21:58:53.993030 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Dec 04 21:58:53.993693 master-0 kubenswrapper[4842]: I1204 21:58:53.993664 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 21:58:54.047884 master-0 kubenswrapper[4842]: I1204 21:58:54.047733 4842 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 04 21:58:54.102488 master-0 kubenswrapper[4842]: I1204 21:58:54.102400 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-sno-bootstrap-files\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.102808 master-0 kubenswrapper[4842]: I1204 21:58:54.102494 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.102808 master-0 kubenswrapper[4842]: I1204 21:58:54.102580 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lclkg\" (UniqueName: \"kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " 
pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.102808 master-0 kubenswrapper[4842]: I1204 21:58:54.102621 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.102808 master-0 kubenswrapper[4842]: I1204 21:58:54.102657 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfft8\" (UniqueName: \"kubernetes.io/projected/9160fec1-743a-470e-b48f-95a7ddf1c0b2-kube-api-access-lfft8\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.102932 master-0 kubenswrapper[4842]: I1204 21:58:54.102785 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.102932 master-0 kubenswrapper[4842]: I1204 21:58:54.102884 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.102932 master-0 kubenswrapper[4842]: I1204 21:58:54.102924 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.103044 master-0 kubenswrapper[4842]: I1204 21:58:54.102970 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-ca-bundle\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.103044 master-0 kubenswrapper[4842]: I1204 21:58:54.103009 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-var-run-resolv-conf\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.103101 master-0 kubenswrapper[4842]: I1204 21:58:54.103048 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod 
\"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.103101 master-0 kubenswrapper[4842]: I1204 21:58:54.103085 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-resolv-conf\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.103169 master-0 kubenswrapper[4842]: I1204 21:58:54.103123 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.204368 master-0 kubenswrapper[4842]: I1204 21:58:54.204272 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.204713 master-0 kubenswrapper[4842]: I1204 21:58:54.204534 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfft8\" (UniqueName: \"kubernetes.io/projected/9160fec1-743a-470e-b48f-95a7ddf1c0b2-kube-api-access-lfft8\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.204713 master-0 kubenswrapper[4842]: I1204 21:58:54.204578 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.204912 master-0 kubenswrapper[4842]: I1204 21:58:54.204822 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.204992 master-0 kubenswrapper[4842]: I1204 21:58:54.204928 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.204992 master-0 kubenswrapper[4842]: I1204 21:58:54.204967 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs\") pod 
\"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.205069 master-0 kubenswrapper[4842]: E1204 21:58:54.204855 4842 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 21:58:54.205108 master-0 kubenswrapper[4842]: E1204 21:58:54.205089 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 21:58:54.705055759 +0000 UTC m=+65.019868134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 21:58:54.205222 master-0 kubenswrapper[4842]: I1204 21:58:54.205159 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205255 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-ca-bundle\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205310 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205410 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-var-run-resolv-conf\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205471 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-resolv-conf\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205531 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205639 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-resolv-conf\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205710 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-var-run-resolv-conf\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205715 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-sno-bootstrap-files\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205772 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.205827 master-0 kubenswrapper[4842]: I1204 21:58:54.205822 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-sno-bootstrap-files\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.206363 master-0 kubenswrapper[4842]: I1204 21:58:54.205846 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.206363 master-0 kubenswrapper[4842]: I1204 21:58:54.205932 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lclkg\" (UniqueName: \"kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.206363 master-0 kubenswrapper[4842]: I1204 21:58:54.205448 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-ca-bundle\") pod \"assisted-installer-controller-mxfnl\" (UID: 
\"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.206453 master-0 kubenswrapper[4842]: I1204 21:58:54.206386 4842 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 21:58:54.207968 master-0 kubenswrapper[4842]: I1204 21:58:54.207917 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.213625 master-0 kubenswrapper[4842]: I1204 21:58:54.213578 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.242151 master-0 kubenswrapper[4842]: I1204 21:58:54.241987 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.246049 master-0 kubenswrapper[4842]: I1204 21:58:54.246004 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfft8\" (UniqueName: \"kubernetes.io/projected/9160fec1-743a-470e-b48f-95a7ddf1c0b2-kube-api-access-lfft8\") pod \"assisted-installer-controller-mxfnl\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.251477 master-0 kubenswrapper[4842]: I1204 21:58:54.251409 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lclkg\" (UniqueName: \"kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.316465 master-0 kubenswrapper[4842]: I1204 21:58:54.316390 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 21:58:54.335779 master-0 kubenswrapper[4842]: I1204 21:58:54.335388 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:58:54.339166 master-0 kubenswrapper[4842]: W1204 21:58:54.339114 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod871cb002_67f4_43aa_a41d_7a5b2f340059.slice/crio-10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc WatchSource:0}: Error finding container 10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc: Status 404 returned error can't find the container with id 10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc Dec 04 21:58:54.350860 master-0 kubenswrapper[4842]: W1204 21:58:54.350809 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9160fec1_743a_470e_b48f_95a7ddf1c0b2.slice/crio-102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64 WatchSource:0}: Error finding container 102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64: Status 404 returned error can't find the container with id 102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64 Dec 04 21:58:54.709445 master-0 kubenswrapper[4842]: I1204 21:58:54.709352 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:54.709800 master-0 kubenswrapper[4842]: E1204 21:58:54.709673 4842 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 21:58:54.709849 master-0 kubenswrapper[4842]: E1204 21:58:54.709800 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 21:58:55.709769994 +0000 UTC m=+66.024582179 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 21:58:55.063151 master-0 kubenswrapper[4842]: I1204 21:58:55.063074 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-mxfnl" event={"ID":"9160fec1-743a-470e-b48f-95a7ddf1c0b2","Type":"ContainerStarted","Data":"102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64"} Dec 04 21:58:55.065106 master-0 kubenswrapper[4842]: I1204 21:58:55.065040 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" event={"ID":"871cb002-67f4-43aa-a41d-7a5b2f340059","Type":"ContainerStarted","Data":"10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc"} Dec 04 21:58:55.717683 master-0 kubenswrapper[4842]: I1204 21:58:55.717579 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:55.717968 master-0 kubenswrapper[4842]: E1204 21:58:55.717792 4842 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 21:58:55.717968 master-0 kubenswrapper[4842]: E1204 21:58:55.717909 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 21:58:57.71788843 +0000 UTC m=+68.032700615 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 21:58:57.733734 master-0 kubenswrapper[4842]: I1204 21:58:57.733556 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:58:57.734440 master-0 kubenswrapper[4842]: E1204 21:58:57.733809 4842 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 21:58:57.734440 master-0 kubenswrapper[4842]: E1204 21:58:57.733926 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 21:59:01.733900966 +0000 UTC m=+72.048713151 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 21:58:57.998110 master-0 kubenswrapper[4842]: I1204 21:58:57.997972 4842 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 21:59:01.082850 master-0 kubenswrapper[4842]: I1204 21:59:01.082750 4842 generic.go:334] "Generic (PLEG): container finished" podID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerID="719d3f66cbdb2170aefa60d42b234f7eb81fd7d5f45e585cd2b86f0e36930c80" exitCode=0 Dec 04 21:59:01.083870 master-0 kubenswrapper[4842]: I1204 21:59:01.082883 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-mxfnl" event={"ID":"9160fec1-743a-470e-b48f-95a7ddf1c0b2","Type":"ContainerDied","Data":"719d3f66cbdb2170aefa60d42b234f7eb81fd7d5f45e585cd2b86f0e36930c80"} Dec 04 21:59:01.084817 master-0 kubenswrapper[4842]: I1204 21:59:01.084753 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" event={"ID":"871cb002-67f4-43aa-a41d-7a5b2f340059","Type":"ContainerStarted","Data":"9d7fd4b64c7f9d10b43359385a6360e49aa71c5085c781ef53642cd82a85d004"} Dec 04 21:59:01.764092 master-0 kubenswrapper[4842]: I1204 21:59:01.764007 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:59:01.764402 master-0 kubenswrapper[4842]: E1204 21:59:01.764279 4842 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 21:59:01.764458 master-0 kubenswrapper[4842]: E1204 21:59:01.764400 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 21:59:09.764374005 +0000 UTC m=+80.079186190 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 21:59:02.106374 master-0 kubenswrapper[4842]: I1204 21:59:02.106345 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:59:02.172405 master-0 kubenswrapper[4842]: I1204 21:59:02.172320 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" podStartSLOduration=32.457453355 podStartE2EDuration="38.172273042s" podCreationTimestamp="2025-12-04 21:58:24 +0000 UTC" firstStartedPulling="2025-12-04 21:58:54.341624854 +0000 UTC m=+64.656437039" lastFinishedPulling="2025-12-04 21:59:00.056444511 +0000 UTC m=+70.371256726" observedRunningTime="2025-12-04 21:59:01.208477417 +0000 UTC m=+71.523289672" watchObservedRunningTime="2025-12-04 21:59:02.172273042 +0000 UTC m=+72.487085227" Dec 04 21:59:02.268702 master-0 kubenswrapper[4842]: I1204 21:59:02.268599 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-var-run-resolv-conf\") pod \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " Dec 04 21:59:02.268702 master-0 kubenswrapper[4842]: I1204 21:59:02.268685 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-sno-bootstrap-files\") pod \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " Dec 04 21:59:02.268702 master-0 kubenswrapper[4842]: I1204 21:59:02.268713 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-ca-bundle\") pod \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " Dec 04 21:59:02.269190 master-0 kubenswrapper[4842]: I1204 21:59:02.268753 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfft8\" (UniqueName: \"kubernetes.io/projected/9160fec1-743a-470e-b48f-95a7ddf1c0b2-kube-api-access-lfft8\") pod \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " Dec 04 21:59:02.269190 master-0 kubenswrapper[4842]: I1204 21:59:02.268783 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-resolv-conf\") pod \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\" (UID: \"9160fec1-743a-470e-b48f-95a7ddf1c0b2\") " Dec 04 21:59:02.269190 master-0 kubenswrapper[4842]: I1204 21:59:02.268743 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "9160fec1-743a-470e-b48f-95a7ddf1c0b2" (UID: "9160fec1-743a-470e-b48f-95a7ddf1c0b2"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 21:59:02.269190 master-0 kubenswrapper[4842]: I1204 21:59:02.268886 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "9160fec1-743a-470e-b48f-95a7ddf1c0b2" (UID: "9160fec1-743a-470e-b48f-95a7ddf1c0b2"). InnerVolumeSpecName "host-resolv-conf". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 21:59:02.269190 master-0 kubenswrapper[4842]: I1204 21:59:02.268866 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "9160fec1-743a-470e-b48f-95a7ddf1c0b2" (UID: "9160fec1-743a-470e-b48f-95a7ddf1c0b2"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 21:59:02.269190 master-0 kubenswrapper[4842]: I1204 21:59:02.268845 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "9160fec1-743a-470e-b48f-95a7ddf1c0b2" (UID: "9160fec1-743a-470e-b48f-95a7ddf1c0b2"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 21:59:02.275646 master-0 kubenswrapper[4842]: I1204 21:59:02.275458 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9160fec1-743a-470e-b48f-95a7ddf1c0b2-kube-api-access-lfft8" (OuterVolumeSpecName: "kube-api-access-lfft8") pod "9160fec1-743a-470e-b48f-95a7ddf1c0b2" (UID: "9160fec1-743a-470e-b48f-95a7ddf1c0b2"). InnerVolumeSpecName "kube-api-access-lfft8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 21:59:02.370032 master-0 kubenswrapper[4842]: I1204 21:59:02.369839 4842 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Dec 04 21:59:02.370032 master-0 kubenswrapper[4842]: I1204 21:59:02.369907 4842 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 21:59:02.370032 master-0 kubenswrapper[4842]: I1204 21:59:02.369930 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfft8\" (UniqueName: \"kubernetes.io/projected/9160fec1-743a-470e-b48f-95a7ddf1c0b2-kube-api-access-lfft8\") on node \"master-0\" DevicePath \"\"" Dec 04 21:59:02.370032 master-0 kubenswrapper[4842]: I1204 21:59:02.369944 4842 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Dec 04 21:59:02.370032 master-0 kubenswrapper[4842]: I1204 21:59:02.369956 4842 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/9160fec1-743a-470e-b48f-95a7ddf1c0b2-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Dec 04 21:59:03.090421 master-0 kubenswrapper[4842]: I1204 21:59:03.090359 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-mxfnl" event={"ID":"9160fec1-743a-470e-b48f-95a7ddf1c0b2","Type":"ContainerDied","Data":"102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64"} Dec 04 21:59:03.090421 master-0 kubenswrapper[4842]: I1204 21:59:03.090405 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64" Dec 04 21:59:03.091162 master-0 kubenswrapper[4842]: 
I1204 21:59:03.091120 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 21:59:04.996643 master-0 kubenswrapper[4842]: I1204 21:59:04.996558 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-jjvqz"] Dec 04 21:59:04.996643 master-0 kubenswrapper[4842]: E1204 21:59:04.996661 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 21:59:04.996643 master-0 kubenswrapper[4842]: I1204 21:59:04.996674 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 21:59:04.998133 master-0 kubenswrapper[4842]: I1204 21:59:04.996721 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 21:59:04.998133 master-0 kubenswrapper[4842]: I1204 21:59:04.996928 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-jjvqz" Dec 04 21:59:05.090541 master-0 kubenswrapper[4842]: I1204 21:59:05.090443 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkjf\" (UniqueName: \"kubernetes.io/projected/5c492425-adf8-424f-ac19-f465071857f9-kube-api-access-qgkjf\") pod \"mtu-prober-jjvqz\" (UID: \"5c492425-adf8-424f-ac19-f465071857f9\") " pod="openshift-network-operator/mtu-prober-jjvqz" Dec 04 21:59:05.190861 master-0 kubenswrapper[4842]: I1204 21:59:05.190750 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkjf\" (UniqueName: \"kubernetes.io/projected/5c492425-adf8-424f-ac19-f465071857f9-kube-api-access-qgkjf\") pod \"mtu-prober-jjvqz\" (UID: \"5c492425-adf8-424f-ac19-f465071857f9\") " pod="openshift-network-operator/mtu-prober-jjvqz" Dec 04 21:59:05.216785 master-0 kubenswrapper[4842]: I1204 21:59:05.216690 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkjf\" (UniqueName: \"kubernetes.io/projected/5c492425-adf8-424f-ac19-f465071857f9-kube-api-access-qgkjf\") pod \"mtu-prober-jjvqz\" (UID: \"5c492425-adf8-424f-ac19-f465071857f9\") " pod="openshift-network-operator/mtu-prober-jjvqz" Dec 04 21:59:05.320309 master-0 kubenswrapper[4842]: I1204 21:59:05.320217 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-jjvqz" Dec 04 21:59:06.102381 master-0 kubenswrapper[4842]: I1204 21:59:06.101792 4842 generic.go:334] "Generic (PLEG): container finished" podID="5c492425-adf8-424f-ac19-f465071857f9" containerID="d2ec9d7da1c0e81ac2a2563a5da4eba0b637698001afaf92060cbb9b07bcf2c4" exitCode=0 Dec 04 21:59:06.102381 master-0 kubenswrapper[4842]: I1204 21:59:06.101867 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-jjvqz" event={"ID":"5c492425-adf8-424f-ac19-f465071857f9","Type":"ContainerDied","Data":"d2ec9d7da1c0e81ac2a2563a5da4eba0b637698001afaf92060cbb9b07bcf2c4"} Dec 04 21:59:06.102381 master-0 kubenswrapper[4842]: I1204 21:59:06.101925 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-jjvqz" event={"ID":"5c492425-adf8-424f-ac19-f465071857f9","Type":"ContainerStarted","Data":"a6c99a1def9360d6a4883701478a0eaa20d1a5711ce6f5867cfe014cc60feead"} Dec 04 21:59:07.130043 master-0 kubenswrapper[4842]: I1204 21:59:07.129997 4842 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-jjvqz" Dec 04 21:59:07.207609 master-0 kubenswrapper[4842]: I1204 21:59:07.207566 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgkjf\" (UniqueName: \"kubernetes.io/projected/5c492425-adf8-424f-ac19-f465071857f9-kube-api-access-qgkjf\") pod \"5c492425-adf8-424f-ac19-f465071857f9\" (UID: \"5c492425-adf8-424f-ac19-f465071857f9\") " Dec 04 21:59:07.213000 master-0 kubenswrapper[4842]: I1204 21:59:07.212923 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c492425-adf8-424f-ac19-f465071857f9-kube-api-access-qgkjf" (OuterVolumeSpecName: "kube-api-access-qgkjf") pod "5c492425-adf8-424f-ac19-f465071857f9" (UID: "5c492425-adf8-424f-ac19-f465071857f9"). InnerVolumeSpecName "kube-api-access-qgkjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 21:59:07.308435 master-0 kubenswrapper[4842]: I1204 21:59:07.308335 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgkjf\" (UniqueName: \"kubernetes.io/projected/5c492425-adf8-424f-ac19-f465071857f9-kube-api-access-qgkjf\") on node \"master-0\" DevicePath \"\"" Dec 04 21:59:08.110544 master-0 kubenswrapper[4842]: I1204 21:59:08.110447 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-jjvqz" event={"ID":"5c492425-adf8-424f-ac19-f465071857f9","Type":"ContainerDied","Data":"a6c99a1def9360d6a4883701478a0eaa20d1a5711ce6f5867cfe014cc60feead"} Dec 04 21:59:08.110911 master-0 kubenswrapper[4842]: I1204 21:59:08.110580 4842 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6c99a1def9360d6a4883701478a0eaa20d1a5711ce6f5867cfe014cc60feead" Dec 04 21:59:08.110911 master-0 kubenswrapper[4842]: I1204 21:59:08.110589 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-jjvqz" Dec 04 21:59:09.829539 master-0 kubenswrapper[4842]: I1204 21:59:09.829436 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:59:09.830819 master-0 kubenswrapper[4842]: E1204 21:59:09.829689 4842 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 21:59:09.830819 master-0 kubenswrapper[4842]: E1204 21:59:09.829782 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 21:59:25.829750278 +0000 UTC m=+96.144562493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 21:59:10.017141 master-0 kubenswrapper[4842]: I1204 21:59:10.017024 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-jjvqz"] Dec 04 21:59:10.023833 master-0 kubenswrapper[4842]: I1204 21:59:10.023744 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-jjvqz"] Dec 04 21:59:10.150330 master-0 kubenswrapper[4842]: I1204 21:59:10.150157 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c492425-adf8-424f-ac19-f465071857f9" path="/var/lib/kubelet/pods/5c492425-adf8-424f-ac19-f465071857f9/volumes" Dec 04 21:59:13.159561 master-0 kubenswrapper[4842]: W1204 21:59:13.159435 4842 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 04 21:59:13.160403 master-0 kubenswrapper[4842]: I1204 21:59:13.159703 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 04 21:59:16.110137 master-0 kubenswrapper[4842]: I1204 21:59:16.110052 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-dgpw9"] Dec 04 21:59:16.111195 master-0 kubenswrapper[4842]: E1204 21:59:16.110188 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c492425-adf8-424f-ac19-f465071857f9" containerName="prober" Dec 04 21:59:16.111195 master-0 kubenswrapper[4842]: I1204 
21:59:16.110211 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c492425-adf8-424f-ac19-f465071857f9" containerName="prober" Dec 04 21:59:16.111195 master-0 kubenswrapper[4842]: I1204 21:59:16.110253 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c492425-adf8-424f-ac19-f465071857f9" containerName="prober" Dec 04 21:59:16.111195 master-0 kubenswrapper[4842]: I1204 21:59:16.110641 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.115123 master-0 kubenswrapper[4842]: I1204 21:59:16.114425 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 21:59:16.115123 master-0 kubenswrapper[4842]: I1204 21:59:16.114461 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 21:59:16.115411 master-0 kubenswrapper[4842]: I1204 21:59:16.114888 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 21:59:16.115871 master-0 kubenswrapper[4842]: I1204 21:59:16.115833 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 21:59:16.277342 master-0 kubenswrapper[4842]: I1204 21:59:16.277257 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277342 master-0 kubenswrapper[4842]: I1204 21:59:16.277321 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277640 master-0 kubenswrapper[4842]: I1204 21:59:16.277456 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277640 master-0 kubenswrapper[4842]: I1204 21:59:16.277605 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277702 master-0 kubenswrapper[4842]: I1204 21:59:16.277656 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277702 master-0 kubenswrapper[4842]: I1204 21:59:16.277691 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") pod 
\"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277771 master-0 kubenswrapper[4842]: I1204 21:59:16.277730 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277918 master-0 kubenswrapper[4842]: I1204 21:59:16.277855 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277955 master-0 kubenswrapper[4842]: I1204 21:59:16.277936 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.277989 master-0 kubenswrapper[4842]: I1204 21:59:16.277966 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.278025 master-0 kubenswrapper[4842]: I1204 21:59:16.277990 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.278025 master-0 kubenswrapper[4842]: I1204 21:59:16.278020 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.278082 master-0 kubenswrapper[4842]: I1204 21:59:16.278044 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.278122 master-0 kubenswrapper[4842]: I1204 21:59:16.278107 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.278168 master-0 kubenswrapper[4842]: I1204 21:59:16.278122 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.278168 master-0 kubenswrapper[4842]: I1204 21:59:16.278139 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.278168 master-0 kubenswrapper[4842]: I1204 21:59:16.278153 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcgg9\" (UniqueName: \"kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.292949 master-0 kubenswrapper[4842]: I1204 21:59:16.292865 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=3.292838996 podStartE2EDuration="3.292838996s" podCreationTimestamp="2025-12-04 21:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 21:59:16.146071016 +0000 UTC m=+86.460883221" watchObservedRunningTime="2025-12-04 21:59:16.292838996 +0000 UTC m=+86.607651181" Dec 04 21:59:16.293151 master-0 kubenswrapper[4842]: I1204 21:59:16.293086 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5tpnf"] Dec 04 21:59:16.293646 master-0 kubenswrapper[4842]: I1204 21:59:16.293621 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.295730 master-0 kubenswrapper[4842]: I1204 21:59:16.295704 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 04 21:59:16.296776 master-0 kubenswrapper[4842]: I1204 21:59:16.296749 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 21:59:16.379453 master-0 kubenswrapper[4842]: I1204 21:59:16.379210 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.379453 master-0 kubenswrapper[4842]: I1204 21:59:16.379269 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.379453 master-0 kubenswrapper[4842]: I1204 21:59:16.379297 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.379453 master-0 kubenswrapper[4842]: I1204 21:59:16.379322 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.379453 master-0 kubenswrapper[4842]: I1204 21:59:16.379339 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379489 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379467 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379535 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod 
\"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379659 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379700 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379654 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379842 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379885 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379908 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379922 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379943 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379963 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgg9\" (UniqueName: \"kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9\") pod 
\"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.379972 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.380006 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.380138 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.380134 master-0 kubenswrapper[4842]: I1204 21:59:16.380178 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380219 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380261 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380310 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j8fr\" (UniqueName: \"kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380361 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380398 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") 
pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380433 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380468 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380534 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380575 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380648 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380707 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380732 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380742 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380767 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 
master-0 kubenswrapper[4842]: I1204 21:59:16.380798 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380837 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380866 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.381663 master-0 kubenswrapper[4842]: I1204 21:59:16.380920 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.383160 master-0 kubenswrapper[4842]: I1204 21:59:16.381541 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.383160 master-0 kubenswrapper[4842]: I1204 21:59:16.381984 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.403780 master-0 kubenswrapper[4842]: I1204 21:59:16.403694 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgg9\" (UniqueName: \"kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.435649 master-0 kubenswrapper[4842]: I1204 21:59:16.435541 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dgpw9" Dec 04 21:59:16.481733 master-0 kubenswrapper[4842]: I1204 21:59:16.481672 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.481733 master-0 kubenswrapper[4842]: I1204 21:59:16.481727 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482007 master-0 kubenswrapper[4842]: I1204 21:59:16.481833 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482007 master-0 kubenswrapper[4842]: I1204 21:59:16.481864 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482130 master-0 kubenswrapper[4842]: I1204 21:59:16.482018 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482206 master-0 kubenswrapper[4842]: I1204 21:59:16.482142 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482393 master-0 kubenswrapper[4842]: I1204 21:59:16.482338 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j8fr\" (UniqueName: \"kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482581 master-0 kubenswrapper[4842]: I1204 21:59:16.482400 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482581 master-0 kubenswrapper[4842]: I1204 21:59:16.482437 4842 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482581 master-0 kubenswrapper[4842]: I1204 21:59:16.482472 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.482581 master-0 kubenswrapper[4842]: I1204 21:59:16.482561 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.483196 master-0 kubenswrapper[4842]: I1204 21:59:16.483105 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.483645 master-0 kubenswrapper[4842]: I1204 21:59:16.483587 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.483739 master-0 kubenswrapper[4842]: I1204 21:59:16.483610 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.484026 master-0 kubenswrapper[4842]: I1204 21:59:16.483958 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.514110 master-0 kubenswrapper[4842]: I1204 21:59:16.514000 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j8fr\" (UniqueName: \"kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.603948 master-0 kubenswrapper[4842]: I1204 21:59:16.603870 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 21:59:16.616574 master-0 kubenswrapper[4842]: W1204 21:59:16.616511 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76fd9f44_4365_4271_8772_025655c50334.slice/crio-d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc WatchSource:0}: Error finding container d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc: Status 404 returned error can't find the container with id d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc Dec 04 21:59:17.085047 master-0 kubenswrapper[4842]: I1204 21:59:17.084983 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9pfhj"] Dec 04 21:59:17.085552 master-0 kubenswrapper[4842]: I1204 21:59:17.085502 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:17.085660 master-0 kubenswrapper[4842]: E1204 21:59:17.085623 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:17.136198 master-0 kubenswrapper[4842]: I1204 21:59:17.136139 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dgpw9" event={"ID":"6c8c45e0-2342-499b-aa6b-339b6a722a87","Type":"ContainerStarted","Data":"3aa682501427a1a26306dd6e6ffbe29276935fb92a5916c957736c383157a162"} Dec 04 21:59:17.138111 master-0 kubenswrapper[4842]: I1204 21:59:17.138010 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerStarted","Data":"d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc"} Dec 04 21:59:17.190048 master-0 kubenswrapper[4842]: I1204 21:59:17.189954 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbpk\" (UniqueName: \"kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:17.190442 master-0 kubenswrapper[4842]: I1204 21:59:17.190121 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:17.291199 master-0 kubenswrapper[4842]: I1204 21:59:17.291123 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:17.291554 master-0 kubenswrapper[4842]: E1204 21:59:17.291334 4842 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 21:59:17.291554 master-0 kubenswrapper[4842]: I1204 21:59:17.291358 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbpk\" (UniqueName: \"kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:17.291554 master-0 kubenswrapper[4842]: E1204 21:59:17.291443 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 21:59:17.791413544 +0000 UTC m=+88.106225729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 21:59:17.314721 master-0 kubenswrapper[4842]: I1204 21:59:17.314666 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbpk\" (UniqueName: \"kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:17.794534 master-0 kubenswrapper[4842]: I1204 21:59:17.794450 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:17.794820 master-0 kubenswrapper[4842]: E1204 21:59:17.794716 4842 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 21:59:17.794870 master-0 kubenswrapper[4842]: E1204 21:59:17.794856 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 21:59:18.794817409 +0000 UTC m=+89.109629624 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 04 21:59:18.801291 master-0 kubenswrapper[4842]: I1204 21:59:18.801194 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj"
Dec 04 21:59:18.802172 master-0 kubenswrapper[4842]: E1204 21:59:18.801540 4842 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 04 21:59:18.802172 master-0 kubenswrapper[4842]: E1204 21:59:18.801666 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 21:59:20.801641307 +0000 UTC m=+91.116453492 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : object "openshift-multus"/"metrics-daemon-secret" not registered
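The durationBeforeRetry values around this point (500ms, 1s and 2s above, then 4s and 8s below, and 32s for the cluster-version-operator cert) show the kubelet doubling its retry delay while the secret backing the metrics-certs volume is still missing. A minimal sketch of that retry pattern, assuming a plain capped doubling delay rather than the kubelet's actual nestedpendingoperations logic:

```go
// Illustrative only: a capped doubling backoff of the kind suggested by the
// durationBeforeRetry values in the surrounding entries. Not the kubelet's
// nestedpendingoperations implementation; the cap is an assumption.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // first retry delay seen in the log
	maxDelay := 2 * time.Minute     // assumed upper bound for the sketch

	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: next retry in %s\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

The backoff only spaces out the attempts; the mount keeps being retried until the referenced secret finally resolves.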
Dec 04 21:59:19.145394 master-0 kubenswrapper[4842]: I1204 21:59:19.145267 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj"
Dec 04 21:59:19.145629 master-0 kubenswrapper[4842]: E1204 21:59:19.145452 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa"
Dec 04 21:59:20.150086 master-0 kubenswrapper[4842]: I1204 21:59:20.149963 4842 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="20fdbd8f60e4052a44e37c80c735da9d3ff66c7350cb568fd169c055622f648f" exitCode=0
Dec 04 21:59:20.151297 master-0 kubenswrapper[4842]: I1204 21:59:20.151120 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"20fdbd8f60e4052a44e37c80c735da9d3ff66c7350cb568fd169c055622f648f"}
Dec 04 21:59:20.819313 master-0 kubenswrapper[4842]: I1204 21:59:20.819246 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj"
Dec 04 21:59:20.819799 master-0 kubenswrapper[4842]: E1204 21:59:20.819447 4842 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 04 21:59:20.819799 master-0 kubenswrapper[4842]: E1204 21:59:20.819593 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 21:59:24.819564757 +0000 UTC m=+95.134376982 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 04 21:59:21.144831 master-0 kubenswrapper[4842]: I1204 21:59:21.144673 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj"
Dec 04 21:59:21.145147 master-0 kubenswrapper[4842]: E1204 21:59:21.144847 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa"
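The repeated "network is not ready ... no CNI configuration file in /etc/kubernetes/cni/net.d/" errors mean the container runtime keeps reporting NetworkReady=false until a CNI configuration appears in that directory, which normally happens once the OVN-Kubernetes node pod seen later in this log is up and has written one. A rough sketch of that kind of readiness test, assuming it reduces to "at least one CNI config file exists"; this is not the runtime's actual code:

```go
// A rough sketch (not the kubelet's or CRI-O's real logic): treat the network
// plugin as ready once at least one CNI configuration file exists in the
// directory named in the error message above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cniConfigPresent(dir string) bool {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false // missing or unreadable directory counts as not ready
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true
		}
	}
	return false
}

func main() {
	fmt.Println("CNI config present:", cniConfigPresent("/etc/kubernetes/cni/net.d"))
}
```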
Dec 04 21:59:23.144474 master-0 kubenswrapper[4842]: I1204 21:59:23.144371 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj"
Dec 04 21:59:23.145193 master-0 kubenswrapper[4842]: E1204 21:59:23.144603 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa"
Dec 04 21:59:24.853218 master-0 kubenswrapper[4842]: I1204 21:59:24.853160 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj"
Dec 04 21:59:24.854081 master-0 kubenswrapper[4842]: E1204 21:59:24.853338 4842 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 04 21:59:24.854081 master-0 kubenswrapper[4842]: E1204 21:59:24.853403 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 21:59:32.853384714 +0000 UTC m=+103.168196899 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 04 21:59:25.145436 master-0 kubenswrapper[4842]: I1204 21:59:25.145266 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj"
Dec 04 21:59:25.145689 master-0 kubenswrapper[4842]: E1204 21:59:25.145470 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa"
Dec 04 21:59:25.862556 master-0 kubenswrapper[4842]: I1204 21:59:25.862454 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj"
Dec 04 21:59:25.863380 master-0 kubenswrapper[4842]: E1204 21:59:25.862652 4842 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
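Both failure modes here, "not registered" for openshift-multus/metrics-daemon-secret and "not found" for cluster-version-operator-serving-cert, come down to a secret the kubelet cannot resolve yet, so the mount is parked and retried; during early bootstrap this is usually just the owning operator not having created the secret yet. One way to watch for it from outside the node is to ask the API server for the secret directly; the client-go sketch below is an illustration only, assuming a kubeconfig at the default location:

```go
// Illustration only (assumes client-go is available and a kubeconfig exists at
// the default path): query the API server for the secret the kubelet is waiting on.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Namespace and name taken from the log entry above.
	_, err = cs.CoreV1().Secrets("openshift-cluster-version").Get(
		context.TODO(), "cluster-version-operator-serving-cert", metav1.GetOptions{})
	fmt.Println("get secret:", err) // non-nil until the operator creates it
}
```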
Dec 04 21:59:25.863380 master-0 kubenswrapper[4842]: E1204 21:59:25.862738 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 21:59:57.862710549 +0000 UTC m=+128.177522734 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found
Dec 04 21:59:27.144377 master-0 kubenswrapper[4842]: I1204 21:59:27.144275 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj"
Dec 04 21:59:27.145058 master-0 kubenswrapper[4842]: E1204 21:59:27.144571 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa"
Dec 04 21:59:28.497538 master-0 kubenswrapper[4842]: I1204 21:59:28.497447 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs"]
Dec 04 21:59:28.498339 master-0 kubenswrapper[4842]: I1204 21:59:28.498299 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs"
Dec 04 21:59:28.502581 master-0 kubenswrapper[4842]: I1204 21:59:28.502059 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 04 21:59:28.502581 master-0 kubenswrapper[4842]: I1204 21:59:28.502105 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 04 21:59:28.502934 master-0 kubenswrapper[4842]: I1204 21:59:28.502693 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 04 21:59:28.503682 master-0 kubenswrapper[4842]: I1204 21:59:28.503433 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 04 21:59:28.503892 master-0 kubenswrapper[4842]: I1204 21:59:28.503858 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 04 21:59:28.586664 master-0 kubenswrapper[4842]: I1204 21:59:28.586580 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wh6b\" (UniqueName: \"kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs"
Dec 04 21:59:28.586964 master-0 kubenswrapper[4842]: I1204 21:59:28.586687 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs"
Dec 04 21:59:28.586964 master-0 kubenswrapper[4842]: I1204 21:59:28.586723 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName:
\"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.586964 master-0 kubenswrapper[4842]: I1204 21:59:28.586753 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.688894 master-0 kubenswrapper[4842]: I1204 21:59:28.688805 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wh6b\" (UniqueName: \"kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.689111 master-0 kubenswrapper[4842]: I1204 21:59:28.688982 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.689111 master-0 kubenswrapper[4842]: I1204 21:59:28.689049 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.689193 master-0 kubenswrapper[4842]: I1204 21:59:28.689104 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.690315 master-0 kubenswrapper[4842]: I1204 21:59:28.690269 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.690722 master-0 kubenswrapper[4842]: I1204 21:59:28.690665 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.695044 master-0 kubenswrapper[4842]: I1204 21:59:28.695000 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.704889 master-0 kubenswrapper[4842]: I1204 21:59:28.704409 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g6f8c"] Dec 04 21:59:28.705695 master-0 kubenswrapper[4842]: I1204 21:59:28.705659 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.708811 master-0 kubenswrapper[4842]: I1204 21:59:28.708411 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 21:59:28.708811 master-0 kubenswrapper[4842]: I1204 21:59:28.708558 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 21:59:28.717162 master-0 kubenswrapper[4842]: I1204 21:59:28.717088 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wh6b\" (UniqueName: \"kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.815107 master-0 kubenswrapper[4842]: I1204 21:59:28.815049 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 21:59:28.891909 master-0 kubenswrapper[4842]: I1204 21:59:28.891850 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-netns\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.891909 master-0 kubenswrapper[4842]: I1204 21:59:28.891912 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-systemd\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892181 master-0 kubenswrapper[4842]: I1204 21:59:28.892068 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-var-lib-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892181 master-0 kubenswrapper[4842]: I1204 21:59:28.892135 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-ovn\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892181 master-0 kubenswrapper[4842]: I1204 21:59:28.892175 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovn-node-metrics-cert\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892272 master-0 kubenswrapper[4842]: I1204 21:59:28.892207 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xxsc\" (UniqueName: \"kubernetes.io/projected/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-kube-api-access-8xxsc\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892304 master-0 kubenswrapper[4842]: I1204 21:59:28.892276 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-ovn-kubernetes\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892361 master-0 kubenswrapper[4842]: I1204 21:59:28.892328 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-netd\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892400 master-0 kubenswrapper[4842]: I1204 21:59:28.892376 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-env-overrides\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892463 master-0 kubenswrapper[4842]: I1204 21:59:28.892433 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-bin\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892535 master-0 kubenswrapper[4842]: I1204 21:59:28.892482 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-config\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892603 master-0 kubenswrapper[4842]: I1204 21:59:28.892578 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-etc-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892766 master-0 kubenswrapper[4842]: I1204 21:59:28.892629 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-log-socket\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 
04 21:59:28.892766 master-0 kubenswrapper[4842]: I1204 21:59:28.892675 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892766 master-0 kubenswrapper[4842]: I1204 21:59:28.892727 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-kubelet\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892766 master-0 kubenswrapper[4842]: I1204 21:59:28.892758 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-slash\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892873 master-0 kubenswrapper[4842]: I1204 21:59:28.892799 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-script-lib\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892873 master-0 kubenswrapper[4842]: I1204 21:59:28.892836 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-node-log\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892930 master-0 kubenswrapper[4842]: I1204 21:59:28.892905 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-systemd-units\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.892992 master-0 kubenswrapper[4842]: I1204 21:59:28.892958 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.993552 master-0 kubenswrapper[4842]: I1204 21:59:28.993487 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-script-lib\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.993552 master-0 kubenswrapper[4842]: I1204 21:59:28.993548 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-node-log\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.993870 master-0 kubenswrapper[4842]: I1204 21:59:28.993588 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-systemd-units\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.993870 master-0 kubenswrapper[4842]: I1204 21:59:28.993792 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.993960 master-0 kubenswrapper[4842]: I1204 21:59:28.993880 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-netns\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.993960 master-0 kubenswrapper[4842]: I1204 21:59:28.993926 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-systemd-units\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994039 master-0 kubenswrapper[4842]: I1204 21:59:28.994011 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-systemd\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994075 master-0 kubenswrapper[4842]: I1204 21:59:28.994015 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994075 master-0 kubenswrapper[4842]: I1204 21:59:28.994058 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-node-log\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994138 master-0 kubenswrapper[4842]: I1204 21:59:28.994041 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xxsc\" (UniqueName: \"kubernetes.io/projected/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-kube-api-access-8xxsc\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994138 master-0 kubenswrapper[4842]: I1204 21:59:28.994102 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-netns\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994197 master-0 kubenswrapper[4842]: I1204 21:59:28.994144 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-var-lib-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994197 master-0 kubenswrapper[4842]: I1204 21:59:28.994161 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-systemd\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994197 master-0 kubenswrapper[4842]: I1204 21:59:28.994171 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-ovn\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994281 master-0 kubenswrapper[4842]: I1204 21:59:28.994199 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-var-lib-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994281 master-0 kubenswrapper[4842]: I1204 21:59:28.994202 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovn-node-metrics-cert\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994281 master-0 kubenswrapper[4842]: I1204 21:59:28.994241 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-ovn-kubernetes\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994281 master-0 kubenswrapper[4842]: I1204 21:59:28.994272 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-netd\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994387 master-0 kubenswrapper[4842]: I1204 21:59:28.994295 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-env-overrides\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994387 master-0 kubenswrapper[4842]: I1204 21:59:28.994319 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-bin\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994387 master-0 kubenswrapper[4842]: I1204 21:59:28.994341 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-config\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994387 master-0 kubenswrapper[4842]: I1204 21:59:28.994369 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-etc-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994530 master-0 kubenswrapper[4842]: I1204 21:59:28.994393 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-log-socket\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994530 master-0 kubenswrapper[4842]: I1204 21:59:28.994417 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994530 master-0 kubenswrapper[4842]: I1204 21:59:28.994447 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-kubelet\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994530 master-0 kubenswrapper[4842]: I1204 21:59:28.994473 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-slash\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994530 master-0 kubenswrapper[4842]: I1204 21:59:28.994482 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-ovn\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994779 master-0 kubenswrapper[4842]: I1204 21:59:28.994548 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-slash\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994779 master-0 kubenswrapper[4842]: I1204 21:59:28.994594 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-ovn-kubernetes\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994779 master-0 kubenswrapper[4842]: I1204 21:59:28.994601 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-log-socket\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994779 master-0 kubenswrapper[4842]: I1204 21:59:28.994617 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-etc-openvswitch\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994779 master-0 kubenswrapper[4842]: I1204 21:59:28.994642 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-netd\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994779 master-0 kubenswrapper[4842]: I1204 21:59:28.994657 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994779 master-0 kubenswrapper[4842]: I1204 21:59:28.994689 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-kubelet\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.994779 master-0 kubenswrapper[4842]: I1204 21:59:28.994721 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-bin\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.995040 master-0 kubenswrapper[4842]: I1204 21:59:28.994897 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-script-lib\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.995398 master-0 kubenswrapper[4842]: I1204 21:59:28.995357 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-env-overrides\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.995485 master-0 kubenswrapper[4842]: I1204 21:59:28.995453 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-config\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:28.998124 master-0 kubenswrapper[4842]: I1204 21:59:28.998086 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovn-node-metrics-cert\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:29.014668 master-0 kubenswrapper[4842]: I1204 21:59:29.014611 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xxsc\" (UniqueName: \"kubernetes.io/projected/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-kube-api-access-8xxsc\") pod \"ovnkube-node-g6f8c\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:29.022492 master-0 kubenswrapper[4842]: I1204 21:59:29.022445 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 21:59:29.144826 master-0 kubenswrapper[4842]: I1204 21:59:29.144679 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:29.145070 master-0 kubenswrapper[4842]: E1204 21:59:29.144854 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:30.157335 master-0 kubenswrapper[4842]: I1204 21:59:30.157098 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 04 21:59:31.144762 master-0 kubenswrapper[4842]: I1204 21:59:31.144690 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:31.145091 master-0 kubenswrapper[4842]: E1204 21:59:31.144872 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:31.556651 master-0 kubenswrapper[4842]: W1204 21:59:31.556611 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57ba9e92_a587_4ccb_84d2_2ac60f420ec0.slice/crio-2790dad3d78d4a3d56be2a118c350f47bfd8d53723c6fb4f92908cfb1c9d89a4 WatchSource:0}: Error finding container 2790dad3d78d4a3d56be2a118c350f47bfd8d53723c6fb4f92908cfb1c9d89a4: Status 404 returned error can't find the container with id 2790dad3d78d4a3d56be2a118c350f47bfd8d53723c6fb4f92908cfb1c9d89a4 Dec 04 21:59:31.687135 master-0 kubenswrapper[4842]: I1204 21:59:31.685883 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-6jkkl"] Dec 04 21:59:31.687135 master-0 kubenswrapper[4842]: I1204 21:59:31.686386 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:31.687135 master-0 kubenswrapper[4842]: E1204 21:59:31.686460 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:31.700422 master-0 kubenswrapper[4842]: I1204 21:59:31.700334 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=1.7003057579999998 podStartE2EDuration="1.700305758s" podCreationTimestamp="2025-12-04 21:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 21:59:31.699488735 +0000 UTC m=+102.014300940" watchObservedRunningTime="2025-12-04 21:59:31.700305758 +0000 UTC m=+102.015117943" Dec 04 21:59:31.819227 master-0 kubenswrapper[4842]: I1204 21:59:31.818576 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:31.920286 master-0 kubenswrapper[4842]: I1204 21:59:31.919832 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:31.945943 master-0 kubenswrapper[4842]: E1204 21:59:31.945619 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 21:59:31.945943 master-0 kubenswrapper[4842]: E1204 21:59:31.945658 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 21:59:31.945943 master-0 
kubenswrapper[4842]: E1204 21:59:31.945676 4842 projected.go:194] Error preparing data for projected volume kube-api-access-gfhgj for pod openshift-network-diagnostics/network-check-target-6jkkl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:31.945943 master-0 kubenswrapper[4842]: E1204 21:59:31.945742 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj podName:510a595a-21bf-48fc-85cd-707bc8f5536f nodeName:}" failed. No retries permitted until 2025-12-04 21:59:32.445719297 +0000 UTC m=+102.760531502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gfhgj" (UniqueName: "kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj") pod "network-check-target-6jkkl" (UID: "510a595a-21bf-48fc-85cd-707bc8f5536f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:32.159815 master-0 kubenswrapper[4842]: I1204 21:59:32.159664 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 04 21:59:32.180893 master-0 kubenswrapper[4842]: I1204 21:59:32.180841 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"2790dad3d78d4a3d56be2a118c350f47bfd8d53723c6fb4f92908cfb1c9d89a4"} Dec 04 21:59:32.182884 master-0 kubenswrapper[4842]: I1204 21:59:32.182751 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerStarted","Data":"9b118c3eb1526e32a59593fb41286a1e5da44aab9049917049f670cf866c2e43"} Dec 04 21:59:32.182884 master-0 kubenswrapper[4842]: I1204 21:59:32.182834 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerStarted","Data":"c382efd8856765d7e3a7c1f5148c4c397e023bc0d7fa9282b9bc8277f0af2687"} Dec 04 21:59:32.185655 master-0 kubenswrapper[4842]: I1204 21:59:32.185593 4842 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="3903951768e93b52af44e2ee6090549f67bc30f2eeffd34acda2b5e56323b0df" exitCode=0 Dec 04 21:59:32.185919 master-0 kubenswrapper[4842]: I1204 21:59:32.185792 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"3903951768e93b52af44e2ee6090549f67bc30f2eeffd34acda2b5e56323b0df"} Dec 04 21:59:32.189082 master-0 kubenswrapper[4842]: I1204 21:59:32.189006 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dgpw9" event={"ID":"6c8c45e0-2342-499b-aa6b-339b6a722a87","Type":"ContainerStarted","Data":"412b36a625c3b7b5d3033bd3f5f3ec14a8a2f1b82af2acf7233fc8da02c22531"} Dec 04 21:59:32.207823 master-0 kubenswrapper[4842]: I1204 21:59:32.207710 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=0.207477342 
podStartE2EDuration="207.477342ms" podCreationTimestamp="2025-12-04 21:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 21:59:32.207375359 +0000 UTC m=+102.522187554" watchObservedRunningTime="2025-12-04 21:59:32.207477342 +0000 UTC m=+102.522289537" Dec 04 21:59:32.255674 master-0 kubenswrapper[4842]: I1204 21:59:32.255543 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dgpw9" podStartSLOduration=1.080629925 podStartE2EDuration="16.255492427s" podCreationTimestamp="2025-12-04 21:59:16 +0000 UTC" firstStartedPulling="2025-12-04 21:59:16.453856263 +0000 UTC m=+86.768668488" lastFinishedPulling="2025-12-04 21:59:31.628718805 +0000 UTC m=+101.943530990" observedRunningTime="2025-12-04 21:59:32.254702546 +0000 UTC m=+102.569514771" watchObservedRunningTime="2025-12-04 21:59:32.255492427 +0000 UTC m=+102.570304632" Dec 04 21:59:32.525942 master-0 kubenswrapper[4842]: I1204 21:59:32.525883 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:32.526245 master-0 kubenswrapper[4842]: E1204 21:59:32.526192 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 21:59:32.526245 master-0 kubenswrapper[4842]: E1204 21:59:32.526232 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 21:59:32.526344 master-0 kubenswrapper[4842]: E1204 21:59:32.526251 4842 projected.go:194] Error preparing data for projected volume kube-api-access-gfhgj for pod openshift-network-diagnostics/network-check-target-6jkkl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:32.526344 master-0 kubenswrapper[4842]: E1204 21:59:32.526336 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj podName:510a595a-21bf-48fc-85cd-707bc8f5536f nodeName:}" failed. No retries permitted until 2025-12-04 21:59:33.526310312 +0000 UTC m=+103.841122627 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gfhgj" (UniqueName: "kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj") pod "network-check-target-6jkkl" (UID: "510a595a-21bf-48fc-85cd-707bc8f5536f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:32.929602 master-0 kubenswrapper[4842]: I1204 21:59:32.929471 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:32.930465 master-0 kubenswrapper[4842]: E1204 21:59:32.929721 4842 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 21:59:32.930465 master-0 kubenswrapper[4842]: E1204 21:59:32.929831 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 21:59:48.929802221 +0000 UTC m=+119.244614426 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 21:59:33.144909 master-0 kubenswrapper[4842]: I1204 21:59:33.144846 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:33.145340 master-0 kubenswrapper[4842]: I1204 21:59:33.144905 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:33.145340 master-0 kubenswrapper[4842]: E1204 21:59:33.145019 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:33.145340 master-0 kubenswrapper[4842]: E1204 21:59:33.145170 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:33.535164 master-0 kubenswrapper[4842]: I1204 21:59:33.535061 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:33.535423 master-0 kubenswrapper[4842]: E1204 21:59:33.535346 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 21:59:33.535423 master-0 kubenswrapper[4842]: E1204 21:59:33.535386 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 21:59:33.535423 master-0 kubenswrapper[4842]: E1204 21:59:33.535401 4842 projected.go:194] Error preparing data for projected volume kube-api-access-gfhgj for pod openshift-network-diagnostics/network-check-target-6jkkl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:33.535578 master-0 kubenswrapper[4842]: E1204 21:59:33.535481 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj podName:510a595a-21bf-48fc-85cd-707bc8f5536f nodeName:}" failed. No retries permitted until 2025-12-04 21:59:35.535458503 +0000 UTC m=+105.850270838 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gfhgj" (UniqueName: "kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj") pod "network-check-target-6jkkl" (UID: "510a595a-21bf-48fc-85cd-707bc8f5536f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:34.161607 master-0 kubenswrapper[4842]: I1204 21:59:34.161482 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 04 21:59:34.291619 master-0 kubenswrapper[4842]: I1204 21:59:34.291545 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-nk92d"] Dec 04 21:59:34.292079 master-0 kubenswrapper[4842]: I1204 21:59:34.292048 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.296037 master-0 kubenswrapper[4842]: I1204 21:59:34.296002 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 21:59:34.296258 master-0 kubenswrapper[4842]: I1204 21:59:34.296238 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 21:59:34.296970 master-0 kubenswrapper[4842]: I1204 21:59:34.296938 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 21:59:34.297160 master-0 kubenswrapper[4842]: I1204 21:59:34.297105 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 21:59:34.297160 master-0 kubenswrapper[4842]: I1204 21:59:34.297137 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 21:59:34.312853 master-0 kubenswrapper[4842]: I1204 21:59:34.312318 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=0.312290891 podStartE2EDuration="312.290891ms" podCreationTimestamp="2025-12-04 21:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 21:59:34.308419548 +0000 UTC m=+104.623231743" watchObservedRunningTime="2025-12-04 21:59:34.312290891 +0000 UTC m=+104.627103066" Dec 04 21:59:34.444212 master-0 kubenswrapper[4842]: I1204 21:59:34.444167 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgt75\" (UniqueName: \"kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.444305 master-0 kubenswrapper[4842]: I1204 21:59:34.444242 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.444305 master-0 kubenswrapper[4842]: I1204 21:59:34.444276 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.444381 master-0 kubenswrapper[4842]: I1204 21:59:34.444321 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.545366 master-0 kubenswrapper[4842]: I1204 21:59:34.545297 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.545663 master-0 kubenswrapper[4842]: I1204 21:59:34.545384 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgt75\" (UniqueName: \"kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.545663 master-0 kubenswrapper[4842]: I1204 21:59:34.545426 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.545663 master-0 kubenswrapper[4842]: I1204 21:59:34.545451 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.546529 master-0 kubenswrapper[4842]: I1204 21:59:34.546484 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.547522 master-0 kubenswrapper[4842]: I1204 21:59:34.547413 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.550353 master-0 kubenswrapper[4842]: I1204 21:59:34.550304 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.568019 master-0 kubenswrapper[4842]: I1204 21:59:34.567981 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgt75\" (UniqueName: \"kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:34.612780 master-0 kubenswrapper[4842]: I1204 21:59:34.612708 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 21:59:35.145181 master-0 kubenswrapper[4842]: I1204 21:59:35.145027 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:35.145181 master-0 kubenswrapper[4842]: I1204 21:59:35.145088 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:35.145473 master-0 kubenswrapper[4842]: E1204 21:59:35.145197 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:35.145758 master-0 kubenswrapper[4842]: E1204 21:59:35.145693 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:35.198537 master-0 kubenswrapper[4842]: I1204 21:59:35.198465 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerStarted","Data":"d20bd8f7df5ba066210630b0736a496814188135f419c1b214059e7501c8fbf9"} Dec 04 21:59:35.201762 master-0 kubenswrapper[4842]: I1204 21:59:35.201730 4842 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="eade6c63cfbfd85793c4e11745edd4d5a786bcef37074f29af89908e936863d7" exitCode=0 Dec 04 21:59:35.201830 master-0 kubenswrapper[4842]: I1204 21:59:35.201769 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"eade6c63cfbfd85793c4e11745edd4d5a786bcef37074f29af89908e936863d7"} Dec 04 21:59:35.555027 master-0 kubenswrapper[4842]: I1204 21:59:35.554947 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:35.555263 master-0 kubenswrapper[4842]: E1204 21:59:35.555197 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 21:59:35.555263 master-0 kubenswrapper[4842]: E1204 21:59:35.555256 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 21:59:35.555318 master-0 kubenswrapper[4842]: E1204 21:59:35.555278 4842 projected.go:194] Error preparing data for projected volume kube-api-access-gfhgj for pod 
openshift-network-diagnostics/network-check-target-6jkkl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:35.555424 master-0 kubenswrapper[4842]: E1204 21:59:35.555387 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj podName:510a595a-21bf-48fc-85cd-707bc8f5536f nodeName:}" failed. No retries permitted until 2025-12-04 21:59:39.555343695 +0000 UTC m=+109.870156030 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gfhgj" (UniqueName: "kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj") pod "network-check-target-6jkkl" (UID: "510a595a-21bf-48fc-85cd-707bc8f5536f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:37.144314 master-0 kubenswrapper[4842]: I1204 21:59:37.144230 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:37.144934 master-0 kubenswrapper[4842]: I1204 21:59:37.144245 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:37.144934 master-0 kubenswrapper[4842]: E1204 21:59:37.144396 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:37.144934 master-0 kubenswrapper[4842]: E1204 21:59:37.144570 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:37.212125 master-0 kubenswrapper[4842]: I1204 21:59:37.212041 4842 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="189119b91f6e6ef0f62e51f0cc69d03fbbc0144ce142853e62f56609d2029b1d" exitCode=0 Dec 04 21:59:37.212125 master-0 kubenswrapper[4842]: I1204 21:59:37.212107 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"189119b91f6e6ef0f62e51f0cc69d03fbbc0144ce142853e62f56609d2029b1d"} Dec 04 21:59:39.144423 master-0 kubenswrapper[4842]: I1204 21:59:39.144355 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:39.145404 master-0 kubenswrapper[4842]: I1204 21:59:39.144362 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:39.145404 master-0 kubenswrapper[4842]: E1204 21:59:39.144626 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:39.145404 master-0 kubenswrapper[4842]: E1204 21:59:39.144525 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:39.593360 master-0 kubenswrapper[4842]: I1204 21:59:39.593276 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:39.593723 master-0 kubenswrapper[4842]: E1204 21:59:39.593547 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 21:59:39.593723 master-0 kubenswrapper[4842]: E1204 21:59:39.593584 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 21:59:39.593723 master-0 kubenswrapper[4842]: E1204 21:59:39.593597 4842 projected.go:194] Error preparing data for projected volume kube-api-access-gfhgj for pod openshift-network-diagnostics/network-check-target-6jkkl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:39.593723 master-0 kubenswrapper[4842]: E1204 21:59:39.593662 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj podName:510a595a-21bf-48fc-85cd-707bc8f5536f nodeName:}" failed. No retries permitted until 2025-12-04 21:59:47.593640301 +0000 UTC m=+117.908452486 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gfhgj" (UniqueName: "kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj") pod "network-check-target-6jkkl" (UID: "510a595a-21bf-48fc-85cd-707bc8f5536f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:41.144867 master-0 kubenswrapper[4842]: I1204 21:59:41.144746 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:41.145868 master-0 kubenswrapper[4842]: I1204 21:59:41.144777 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:41.145868 master-0 kubenswrapper[4842]: E1204 21:59:41.144956 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:41.145868 master-0 kubenswrapper[4842]: E1204 21:59:41.145080 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:43.144485 master-0 kubenswrapper[4842]: I1204 21:59:43.144416 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:43.145107 master-0 kubenswrapper[4842]: I1204 21:59:43.144458 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:43.145107 master-0 kubenswrapper[4842]: E1204 21:59:43.144638 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:43.145107 master-0 kubenswrapper[4842]: E1204 21:59:43.144871 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:45.144269 master-0 kubenswrapper[4842]: I1204 21:59:45.144220 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:45.144269 master-0 kubenswrapper[4842]: I1204 21:59:45.144251 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:45.144987 master-0 kubenswrapper[4842]: E1204 21:59:45.144363 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:45.144987 master-0 kubenswrapper[4842]: E1204 21:59:45.144435 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:46.723575 master-0 kubenswrapper[4842]: I1204 21:59:46.723359 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Dec 04 21:59:47.144994 master-0 kubenswrapper[4842]: I1204 21:59:47.144901 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:47.145228 master-0 kubenswrapper[4842]: E1204 21:59:47.145073 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:47.145228 master-0 kubenswrapper[4842]: I1204 21:59:47.145142 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:47.145228 master-0 kubenswrapper[4842]: E1204 21:59:47.145184 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:47.667765 master-0 kubenswrapper[4842]: I1204 21:59:47.667695 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:47.668066 master-0 kubenswrapper[4842]: E1204 21:59:47.668018 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 21:59:47.668114 master-0 kubenswrapper[4842]: E1204 21:59:47.668068 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 21:59:47.668114 master-0 kubenswrapper[4842]: E1204 21:59:47.668089 4842 projected.go:194] Error preparing data for projected volume kube-api-access-gfhgj for pod openshift-network-diagnostics/network-check-target-6jkkl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:47.668189 master-0 kubenswrapper[4842]: E1204 21:59:47.668177 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj podName:510a595a-21bf-48fc-85cd-707bc8f5536f nodeName:}" failed. No retries permitted until 2025-12-04 22:00:03.668148498 +0000 UTC m=+133.982960693 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gfhgj" (UniqueName: "kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj") pod "network-check-target-6jkkl" (UID: "510a595a-21bf-48fc-85cd-707bc8f5536f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 21:59:48.980259 master-0 kubenswrapper[4842]: I1204 21:59:48.980199 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:48.981011 master-0 kubenswrapper[4842]: E1204 21:59:48.980396 4842 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 21:59:48.981011 master-0 kubenswrapper[4842]: E1204 21:59:48.980493 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 22:00:20.980470182 +0000 UTC m=+151.295282367 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 21:59:49.144794 master-0 kubenswrapper[4842]: I1204 21:59:49.144695 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:49.145098 master-0 kubenswrapper[4842]: I1204 21:59:49.144743 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:49.145098 master-0 kubenswrapper[4842]: E1204 21:59:49.144887 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:49.145098 master-0 kubenswrapper[4842]: E1204 21:59:49.145017 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:49.975103 master-0 kubenswrapper[4842]: E1204 21:59:49.975037 4842 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 04 21:59:50.992005 master-0 kubenswrapper[4842]: E1204 21:59:50.991872 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 21:59:51.144670 master-0 kubenswrapper[4842]: I1204 21:59:51.144535 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:51.144670 master-0 kubenswrapper[4842]: I1204 21:59:51.144594 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:51.145236 master-0 kubenswrapper[4842]: E1204 21:59:51.144773 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:51.145236 master-0 kubenswrapper[4842]: E1204 21:59:51.144891 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:53.145032 master-0 kubenswrapper[4842]: I1204 21:59:53.144939 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:53.145704 master-0 kubenswrapper[4842]: E1204 21:59:53.145138 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:53.145704 master-0 kubenswrapper[4842]: I1204 21:59:53.145238 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:53.145704 master-0 kubenswrapper[4842]: E1204 21:59:53.145299 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:55.145079 master-0 kubenswrapper[4842]: I1204 21:59:55.144965 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:55.145775 master-0 kubenswrapper[4842]: I1204 21:59:55.145129 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:55.145775 master-0 kubenswrapper[4842]: E1204 21:59:55.145204 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:55.145775 master-0 kubenswrapper[4842]: E1204 21:59:55.145369 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:55.993816 master-0 kubenswrapper[4842]: E1204 21:59:55.993751 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 21:59:57.144873 master-0 kubenswrapper[4842]: I1204 21:59:57.144687 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:57.144873 master-0 kubenswrapper[4842]: I1204 21:59:57.144723 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:57.156096 master-0 kubenswrapper[4842]: E1204 21:59:57.144889 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:57.156096 master-0 kubenswrapper[4842]: E1204 21:59:57.145176 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:57.271207 master-0 kubenswrapper[4842]: I1204 21:59:57.271042 4842 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="3574d633e7308db3b6dd662bd037451e5d0ed5c34c61a73c66397c77d3caf66e" exitCode=0 Dec 04 21:59:57.271207 master-0 kubenswrapper[4842]: I1204 21:59:57.271111 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"3574d633e7308db3b6dd662bd037451e5d0ed5c34c61a73c66397c77d3caf66e"} Dec 04 21:59:57.273229 master-0 kubenswrapper[4842]: I1204 21:59:57.273200 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414" exitCode=0 Dec 04 21:59:57.273333 master-0 kubenswrapper[4842]: I1204 21:59:57.273280 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414"} Dec 04 21:59:57.275966 master-0 kubenswrapper[4842]: I1204 21:59:57.275925 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerStarted","Data":"8235211a3898b8961786603441645f7da3fef63f8a04f95fcc274a44a7765453"} Dec 04 21:59:57.290597 master-0 kubenswrapper[4842]: I1204 21:59:57.290527 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerStarted","Data":"a679264390b031ae4f297359e8c908ad01e2a92651d2cb70742a5a02fd398618"} Dec 04 21:59:57.290690 master-0 kubenswrapper[4842]: I1204 21:59:57.290604 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerStarted","Data":"b255dc2ba6c02f78e7fa3f3206067dc5c657701a0d9a3acc7e7566b70c0f286c"} Dec 04 21:59:57.302196 master-0 kubenswrapper[4842]: I1204 21:59:57.302099 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=12.302069743 
podStartE2EDuration="12.302069743s" podCreationTimestamp="2025-12-04 21:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 21:59:52.312134964 +0000 UTC m=+122.626947209" watchObservedRunningTime="2025-12-04 21:59:57.302069743 +0000 UTC m=+127.616881938" Dec 04 21:59:57.350838 master-0 kubenswrapper[4842]: I1204 21:59:57.350683 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-nk92d" podStartSLOduration=1.044155845 podStartE2EDuration="23.350647793s" podCreationTimestamp="2025-12-04 21:59:34 +0000 UTC" firstStartedPulling="2025-12-04 21:59:34.631233854 +0000 UTC m=+104.946046049" lastFinishedPulling="2025-12-04 21:59:56.937725812 +0000 UTC m=+127.252537997" observedRunningTime="2025-12-04 21:59:57.348744622 +0000 UTC m=+127.663556817" watchObservedRunningTime="2025-12-04 21:59:57.350647793 +0000 UTC m=+127.665460058" Dec 04 21:59:57.365547 master-0 kubenswrapper[4842]: I1204 21:59:57.365438 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" podStartSLOduration=4.211538909 podStartE2EDuration="29.365406975s" podCreationTimestamp="2025-12-04 21:59:28 +0000 UTC" firstStartedPulling="2025-12-04 21:59:31.760108876 +0000 UTC m=+102.074921071" lastFinishedPulling="2025-12-04 21:59:56.913976952 +0000 UTC m=+127.228789137" observedRunningTime="2025-12-04 21:59:57.365250361 +0000 UTC m=+127.680062556" watchObservedRunningTime="2025-12-04 21:59:57.365406975 +0000 UTC m=+127.680219160" Dec 04 21:59:57.864205 master-0 kubenswrapper[4842]: I1204 21:59:57.863638 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 21:59:57.864378 master-0 kubenswrapper[4842]: E1204 21:59:57.864003 4842 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 21:59:57.864378 master-0 kubenswrapper[4842]: E1204 21:59:57.864341 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 22:01:01.864306129 +0000 UTC m=+192.179118354 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 21:59:58.300833 master-0 kubenswrapper[4842]: I1204 21:59:58.300671 4842 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="758bcdf683109d822a1017f454c5645fc9f981b1015625c2d5ef493072ef4678" exitCode=0 Dec 04 21:59:58.300833 master-0 kubenswrapper[4842]: I1204 21:59:58.300748 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"758bcdf683109d822a1017f454c5645fc9f981b1015625c2d5ef493072ef4678"} Dec 04 21:59:58.313153 master-0 kubenswrapper[4842]: I1204 21:59:58.312653 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082"} Dec 04 21:59:58.313153 master-0 kubenswrapper[4842]: I1204 21:59:58.312794 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7"} Dec 04 21:59:58.313153 master-0 kubenswrapper[4842]: I1204 21:59:58.313005 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c"} Dec 04 21:59:58.313153 master-0 kubenswrapper[4842]: I1204 21:59:58.313049 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63"} Dec 04 21:59:58.313153 master-0 kubenswrapper[4842]: I1204 21:59:58.313086 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52"} Dec 04 21:59:58.313153 master-0 kubenswrapper[4842]: I1204 21:59:58.313130 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216"} Dec 04 21:59:59.144968 master-0 kubenswrapper[4842]: I1204 21:59:59.144834 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 21:59:59.144968 master-0 kubenswrapper[4842]: I1204 21:59:59.144927 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 21:59:59.145380 master-0 kubenswrapper[4842]: E1204 21:59:59.145063 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 21:59:59.145380 master-0 kubenswrapper[4842]: E1204 21:59:59.145300 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 21:59:59.321057 master-0 kubenswrapper[4842]: I1204 21:59:59.320987 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerStarted","Data":"bcaaf6a96d954f901cc05fe39c7b7764e445e886db16581ddfb04f2c4ced3d82"} Dec 04 21:59:59.345220 master-0 kubenswrapper[4842]: I1204 21:59:59.345135 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" podStartSLOduration=3.100164797 podStartE2EDuration="43.34510052s" podCreationTimestamp="2025-12-04 21:59:16 +0000 UTC" firstStartedPulling="2025-12-04 21:59:16.61938673 +0000 UTC m=+86.934198945" lastFinishedPulling="2025-12-04 21:59:56.864322493 +0000 UTC m=+127.179134668" observedRunningTime="2025-12-04 21:59:59.34432298 +0000 UTC m=+129.659135195" watchObservedRunningTime="2025-12-04 21:59:59.34510052 +0000 UTC m=+129.659912735" Dec 04 22:00:00.331940 master-0 kubenswrapper[4842]: I1204 22:00:00.331860 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17"} Dec 04 22:00:00.994634 master-0 kubenswrapper[4842]: E1204 22:00:00.994563 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 22:00:01.061223 master-0 kubenswrapper[4842]: I1204 22:00:01.061151 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g6f8c"] Dec 04 22:00:01.145019 master-0 kubenswrapper[4842]: I1204 22:00:01.144941 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:01.145019 master-0 kubenswrapper[4842]: I1204 22:00:01.145007 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:01.145412 master-0 kubenswrapper[4842]: E1204 22:00:01.145150 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:01.145412 master-0 kubenswrapper[4842]: E1204 22:00:01.145274 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:03.144723 master-0 kubenswrapper[4842]: I1204 22:00:03.144177 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:03.144723 master-0 kubenswrapper[4842]: I1204 22:00:03.144177 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:03.146002 master-0 kubenswrapper[4842]: E1204 22:00:03.144891 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:03.146002 master-0 kubenswrapper[4842]: E1204 22:00:03.145038 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:03.348725 master-0 kubenswrapper[4842]: I1204 22:00:03.348598 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerStarted","Data":"1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b"} Dec 04 22:00:03.349460 master-0 kubenswrapper[4842]: I1204 22:00:03.349086 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovn-controller" containerID="cri-o://378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216" gracePeriod=30 Dec 04 22:00:03.349460 master-0 kubenswrapper[4842]: I1204 22:00:03.349258 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="sbdb" containerID="cri-o://059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17" gracePeriod=30 Dec 04 22:00:03.349460 master-0 kubenswrapper[4842]: I1204 22:00:03.349286 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c" gracePeriod=30 Dec 04 22:00:03.349460 master-0 kubenswrapper[4842]: I1204 22:00:03.349372 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kube-rbac-proxy-node" containerID="cri-o://74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63" gracePeriod=30 Dec 04 22:00:03.349460 master-0 kubenswrapper[4842]: I1204 22:00:03.349336 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovn-acl-logging" containerID="cri-o://4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52" gracePeriod=30 Dec 04 22:00:03.349731 master-0 kubenswrapper[4842]: I1204 22:00:03.349487 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="northd" containerID="cri-o://4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7" gracePeriod=30 Dec 04 22:00:03.349731 master-0 kubenswrapper[4842]: I1204 22:00:03.349551 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 22:00:03.349731 master-0 kubenswrapper[4842]: I1204 22:00:03.349541 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="nbdb" containerID="cri-o://4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082" gracePeriod=30 Dec 04 22:00:03.349881 master-0 kubenswrapper[4842]: I1204 22:00:03.349634 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 22:00:03.349881 master-0 kubenswrapper[4842]: I1204 22:00:03.349809 4842 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 22:00:03.355091 master-0 kubenswrapper[4842]: E1204 22:00:03.354881 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 04 22:00:03.356954 master-0 kubenswrapper[4842]: E1204 22:00:03.355857 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 04 22:00:03.357458 master-0 kubenswrapper[4842]: E1204 22:00:03.357388 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 04 22:00:03.359391 master-0 kubenswrapper[4842]: E1204 22:00:03.359305 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 04 22:00:03.360813 master-0 kubenswrapper[4842]: E1204 22:00:03.360718 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Dec 04 22:00:03.360874 master-0 kubenswrapper[4842]: E1204 22:00:03.360819 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="sbdb" Dec 04 22:00:03.362722 master-0 kubenswrapper[4842]: E1204 22:00:03.362117 4842 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Dec 04 22:00:03.362722 master-0 kubenswrapper[4842]: E1204 22:00:03.362201 4842 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="nbdb" Dec 04 22:00:03.378043 master-0 kubenswrapper[4842]: I1204 22:00:03.377794 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podStartSLOduration=9.983326479 podStartE2EDuration="35.377778626s" podCreationTimestamp="2025-12-04 21:59:28 +0000 UTC" firstStartedPulling="2025-12-04 21:59:31.560432512 +0000 UTC m=+101.875244717" lastFinishedPulling="2025-12-04 21:59:56.954884639 +0000 UTC m=+127.269696864" observedRunningTime="2025-12-04 22:00:03.377095448 +0000 UTC m=+133.691907643" watchObservedRunningTime="2025-12-04 22:00:03.377778626 +0000 UTC m=+133.692590821" Dec 04 22:00:03.400459 master-0 kubenswrapper[4842]: I1204 22:00:03.399846 4842 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovnkube-controller" containerID="cri-o://1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b" gracePeriod=30 Dec 04 22:00:03.727857 master-0 kubenswrapper[4842]: I1204 22:00:03.727649 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:03.728089 master-0 kubenswrapper[4842]: E1204 22:00:03.727922 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 22:00:03.728089 master-0 kubenswrapper[4842]: E1204 22:00:03.727965 4842 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 22:00:03.728089 master-0 kubenswrapper[4842]: E1204 22:00:03.727987 4842 projected.go:194] Error preparing data for projected volume kube-api-access-gfhgj for pod openshift-network-diagnostics/network-check-target-6jkkl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 22:00:03.728089 master-0 kubenswrapper[4842]: E1204 22:00:03.728082 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj podName:510a595a-21bf-48fc-85cd-707bc8f5536f nodeName:}" failed. No retries permitted until 2025-12-04 22:00:35.728054491 +0000 UTC m=+166.042866706 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gfhgj" (UniqueName: "kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj") pod "network-check-target-6jkkl" (UID: "510a595a-21bf-48fc-85cd-707bc8f5536f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 22:00:04.357085 master-0 kubenswrapper[4842]: I1204 22:00:04.357014 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/kube-rbac-proxy-ovn-metrics/0.log" Dec 04 22:00:04.358532 master-0 kubenswrapper[4842]: I1204 22:00:04.357793 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/kube-rbac-proxy-node/0.log" Dec 04 22:00:04.358621 master-0 kubenswrapper[4842]: I1204 22:00:04.358494 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/ovn-acl-logging/0.log" Dec 04 22:00:04.359438 master-0 kubenswrapper[4842]: I1204 22:00:04.359362 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/ovn-controller/0.log" Dec 04 22:00:04.360083 master-0 kubenswrapper[4842]: I1204 22:00:04.360034 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17" exitCode=0 Dec 04 22:00:04.360083 master-0 kubenswrapper[4842]: I1204 22:00:04.360078 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082" exitCode=0 Dec 04 22:00:04.360256 master-0 kubenswrapper[4842]: I1204 22:00:04.360094 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7" exitCode=0 Dec 04 22:00:04.360256 master-0 kubenswrapper[4842]: I1204 22:00:04.360110 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c" exitCode=143 Dec 04 22:00:04.360256 master-0 kubenswrapper[4842]: I1204 22:00:04.360124 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63" exitCode=143 Dec 04 22:00:04.360256 master-0 kubenswrapper[4842]: I1204 22:00:04.360139 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52" exitCode=143 Dec 04 22:00:04.360256 master-0 kubenswrapper[4842]: I1204 22:00:04.360136 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17"} Dec 04 22:00:04.360256 master-0 kubenswrapper[4842]: I1204 22:00:04.360237 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" 
event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082"} Dec 04 22:00:04.360857 master-0 kubenswrapper[4842]: I1204 22:00:04.360337 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7"} Dec 04 22:00:04.360857 master-0 kubenswrapper[4842]: I1204 22:00:04.360408 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c"} Dec 04 22:00:04.360857 master-0 kubenswrapper[4842]: I1204 22:00:04.360428 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63"} Dec 04 22:00:04.360857 master-0 kubenswrapper[4842]: I1204 22:00:04.360475 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52"} Dec 04 22:00:04.360857 master-0 kubenswrapper[4842]: I1204 22:00:04.360154 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216" exitCode=143 Dec 04 22:00:04.360857 master-0 kubenswrapper[4842]: I1204 22:00:04.360495 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216"} Dec 04 22:00:04.706719 master-0 kubenswrapper[4842]: I1204 22:00:04.706623 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/ovnkube-controller/0.log" Dec 04 22:00:04.714484 master-0 kubenswrapper[4842]: I1204 22:00:04.714410 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/kube-rbac-proxy-ovn-metrics/0.log" Dec 04 22:00:04.715362 master-0 kubenswrapper[4842]: I1204 22:00:04.715319 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/kube-rbac-proxy-node/0.log" Dec 04 22:00:04.716162 master-0 kubenswrapper[4842]: I1204 22:00:04.716105 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/ovn-acl-logging/0.log" Dec 04 22:00:04.717163 master-0 kubenswrapper[4842]: I1204 22:00:04.717120 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/ovn-controller/0.log" Dec 04 22:00:04.718136 master-0 kubenswrapper[4842]: I1204 22:00:04.718073 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 22:00:04.737668 master-0 kubenswrapper[4842]: I1204 22:00:04.737534 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-systemd-units\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.737668 master-0 kubenswrapper[4842]: I1204 22:00:04.737612 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-config\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.737668 master-0 kubenswrapper[4842]: I1204 22:00:04.737659 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-openvswitch\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.737701 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-var-lib-openvswitch\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.737737 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.737780 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xxsc\" (UniqueName: \"kubernetes.io/projected/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-kube-api-access-8xxsc\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.737860 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-ovn\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.737904 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovn-node-metrics-cert\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.737938 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-script-lib\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.737973 4842 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-ovn-kubernetes\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.738009 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-etc-openvswitch\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.738037 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-slash\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.738076 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-netns\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.738192 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.738226 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-bin\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.738244 master-0 kubenswrapper[4842]: I1204 22:00:04.738259 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738306 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738304 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-log-socket\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738351 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-systemd\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738374 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-node-log\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738388 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-log-socket" (OuterVolumeSpecName: "log-socket") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738404 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-env-overrides\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738426 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-kubelet\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738456 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738485 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738554 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-slash" (OuterVolumeSpecName: "host-slash") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). 
InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738650 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738660 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738677 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738715 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-node-log" (OuterVolumeSpecName: "node-log") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738697 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.739034 master-0 kubenswrapper[4842]: I1204 22:00:04.738788 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.738866 4842 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-netd\") pod \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\" (UID: \"57ba9e92-a587-4ccb-84d2-2ac60f420ec0\") " Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.738913 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739365 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739467 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739488 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739143 4842 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739601 4842 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739630 4842 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739656 4842 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739679 4842 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-etc-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739698 4842 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-slash\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739718 4842 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-run-netns\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739737 4842 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739755 4842 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-log-socket\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739778 4842 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-kubelet\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739796 4842 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-node-log\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 kubenswrapper[4842]: I1204 22:00:04.739816 4842 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-systemd-units\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.740013 master-0 
kubenswrapper[4842]: I1204 22:00:04.739835 4842 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.745156 master-0 kubenswrapper[4842]: I1204 22:00:04.745091 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-kube-api-access-8xxsc" (OuterVolumeSpecName: "kube-api-access-8xxsc") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "kube-api-access-8xxsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:00:04.745307 master-0 kubenswrapper[4842]: I1204 22:00:04.745250 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:00:04.747565 master-0 kubenswrapper[4842]: I1204 22:00:04.747521 4842 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "57ba9e92-a587-4ccb-84d2-2ac60f420ec0" (UID: "57ba9e92-a587-4ccb-84d2-2ac60f420ec0"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:04.778805 master-0 kubenswrapper[4842]: I1204 22:00:04.778713 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8nxc5"] Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 22:00:04.778873 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kubecfg-setup" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.778900 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kubecfg-setup" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 22:00:04.778914 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovnkube-controller" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.778928 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovnkube-controller" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 22:00:04.778943 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovn-acl-logging" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.778956 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovn-acl-logging" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 22:00:04.778972 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="northd" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.778984 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="northd" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 
22:00:04.778996 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovn-controller" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779008 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovn-controller" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 22:00:04.779021 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779033 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 22:00:04.779049 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="sbdb" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779064 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="sbdb" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 22:00:04.779077 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kube-rbac-proxy-node" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779091 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kube-rbac-proxy-node" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: E1204 22:00:04.779105 4842 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="nbdb" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779117 4842 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="nbdb" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779180 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="nbdb" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779197 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovnkube-controller" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779211 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779224 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="northd" Dec 04 22:00:04.779206 master-0 kubenswrapper[4842]: I1204 22:00:04.779239 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="sbdb" Dec 04 22:00:04.780092 master-0 kubenswrapper[4842]: I1204 22:00:04.779253 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovn-acl-logging" Dec 04 22:00:04.780092 master-0 kubenswrapper[4842]: I1204 22:00:04.779266 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="kube-rbac-proxy-node" Dec 04 22:00:04.780092 
master-0 kubenswrapper[4842]: I1204 22:00:04.779281 4842 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerName="ovn-controller" Dec 04 22:00:04.780538 master-0 kubenswrapper[4842]: I1204 22:00:04.780467 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.840428 master-0 kubenswrapper[4842]: I1204 22:00:04.840309 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.840428 master-0 kubenswrapper[4842]: I1204 22:00:04.840395 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.840428 master-0 kubenswrapper[4842]: I1204 22:00:04.840438 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.840906 master-0 kubenswrapper[4842]: I1204 22:00:04.840600 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.840906 master-0 kubenswrapper[4842]: I1204 22:00:04.840737 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.840906 master-0 kubenswrapper[4842]: I1204 22:00:04.840839 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841092 master-0 kubenswrapper[4842]: I1204 22:00:04.840949 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841092 master-0 kubenswrapper[4842]: I1204 22:00:04.841005 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841092 master-0 kubenswrapper[4842]: I1204 22:00:04.841039 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841092 master-0 kubenswrapper[4842]: I1204 22:00:04.841077 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841316 master-0 kubenswrapper[4842]: I1204 22:00:04.841160 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841316 master-0 kubenswrapper[4842]: I1204 22:00:04.841207 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841316 master-0 kubenswrapper[4842]: I1204 22:00:04.841280 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841483 master-0 kubenswrapper[4842]: I1204 22:00:04.841340 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4rft\" (UniqueName: \"kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841483 master-0 kubenswrapper[4842]: I1204 22:00:04.841432 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841642 master-0 kubenswrapper[4842]: I1204 22:00:04.841496 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841642 master-0 kubenswrapper[4842]: I1204 22:00:04.841620 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841752 master-0 kubenswrapper[4842]: I1204 22:00:04.841676 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841752 master-0 kubenswrapper[4842]: I1204 22:00:04.841727 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841866 master-0 kubenswrapper[4842]: I1204 22:00:04.841784 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.841930 master-0 kubenswrapper[4842]: I1204 22:00:04.841900 4842 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-env-overrides\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.841997 master-0 kubenswrapper[4842]: I1204 22:00:04.841934 4842 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-host-cni-netd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.841997 master-0 kubenswrapper[4842]: I1204 22:00:04.841965 4842 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.842108 master-0 kubenswrapper[4842]: I1204 22:00:04.841999 4842 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xxsc\" (UniqueName: \"kubernetes.io/projected/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-kube-api-access-8xxsc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.842108 master-0 kubenswrapper[4842]: I1204 22:00:04.842021 4842 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.842108 master-0 kubenswrapper[4842]: I1204 22:00:04.842041 4842 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.842108 master-0 kubenswrapper[4842]: I1204 22:00:04.842067 4842 reconciler_common.go:293] "Volume 
detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57ba9e92-a587-4ccb-84d2-2ac60f420ec0-run-systemd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:04.943062 master-0 kubenswrapper[4842]: I1204 22:00:04.942843 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943062 master-0 kubenswrapper[4842]: I1204 22:00:04.942929 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943062 master-0 kubenswrapper[4842]: I1204 22:00:04.942996 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943062 master-0 kubenswrapper[4842]: I1204 22:00:04.943031 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943710 master-0 kubenswrapper[4842]: I1204 22:00:04.943125 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943710 master-0 kubenswrapper[4842]: I1204 22:00:04.943210 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943710 master-0 kubenswrapper[4842]: I1204 22:00:04.943212 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943710 master-0 kubenswrapper[4842]: I1204 22:00:04.943393 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943710 master-0 kubenswrapper[4842]: I1204 22:00:04.943436 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.943710 master-0 kubenswrapper[4842]: I1204 22:00:04.943656 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944179 master-0 kubenswrapper[4842]: I1204 22:00:04.943786 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944179 master-0 kubenswrapper[4842]: I1204 22:00:04.943877 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944179 master-0 kubenswrapper[4842]: I1204 22:00:04.943960 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944179 master-0 kubenswrapper[4842]: I1204 22:00:04.944062 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944179 master-0 kubenswrapper[4842]: I1204 22:00:04.944161 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944222 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4rft\" (UniqueName: \"kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944268 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944300 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944341 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944375 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944416 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944460 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944546 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944587 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944589 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.944586 master-0 kubenswrapper[4842]: I1204 22:00:04.944543 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.944660 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.944755 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.944799 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.944835 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.944875 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.944880 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.944982 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.945100 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.945141 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.945673 master-0 kubenswrapper[4842]: I1204 22:00:04.945209 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.946322 master-0 kubenswrapper[4842]: I1204 22:00:04.946271 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.946400 master-0 kubenswrapper[4842]: I1204 22:00:04.946350 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.947437 master-0 kubenswrapper[4842]: I1204 22:00:04.947358 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:04.978547 master-0 kubenswrapper[4842]: I1204 22:00:04.978405 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4rft\" (UniqueName: \"kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:05.101772 master-0 kubenswrapper[4842]: I1204 22:00:05.101638 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:05.121547 master-0 kubenswrapper[4842]: W1204 22:00:05.121435 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59d3d0d8_1a2a_4d14_8312_d33818acba88.slice/crio-d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b WatchSource:0}: Error finding container d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b: Status 404 returned error can't find the container with id d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b Dec 04 22:00:05.144554 master-0 kubenswrapper[4842]: I1204 22:00:05.144416 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:05.144702 master-0 kubenswrapper[4842]: I1204 22:00:05.144445 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:05.144791 master-0 kubenswrapper[4842]: E1204 22:00:05.144714 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:05.144908 master-0 kubenswrapper[4842]: E1204 22:00:05.144842 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:05.367044 master-0 kubenswrapper[4842]: I1204 22:00:05.366951 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/ovnkube-controller/0.log" Dec 04 22:00:05.369853 master-0 kubenswrapper[4842]: I1204 22:00:05.369786 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/kube-rbac-proxy-ovn-metrics/0.log" Dec 04 22:00:05.370602 master-0 kubenswrapper[4842]: I1204 22:00:05.370551 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/kube-rbac-proxy-node/0.log" Dec 04 22:00:05.371428 master-0 kubenswrapper[4842]: I1204 22:00:05.371346 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/ovn-acl-logging/0.log" Dec 04 22:00:05.372438 master-0 kubenswrapper[4842]: I1204 22:00:05.372242 4842 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g6f8c_57ba9e92-a587-4ccb-84d2-2ac60f420ec0/ovn-controller/0.log" Dec 04 22:00:05.373361 master-0 kubenswrapper[4842]: I1204 22:00:05.373152 4842 generic.go:334] "Generic (PLEG): container finished" podID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" containerID="1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b" exitCode=1 Dec 04 22:00:05.373361 master-0 kubenswrapper[4842]: I1204 22:00:05.373273 4842 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" Dec 04 22:00:05.373361 master-0 kubenswrapper[4842]: I1204 22:00:05.373278 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b"} Dec 04 22:00:05.374388 master-0 kubenswrapper[4842]: I1204 22:00:05.373391 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g6f8c" event={"ID":"57ba9e92-a587-4ccb-84d2-2ac60f420ec0","Type":"ContainerDied","Data":"2790dad3d78d4a3d56be2a118c350f47bfd8d53723c6fb4f92908cfb1c9d89a4"} Dec 04 22:00:05.374388 master-0 kubenswrapper[4842]: I1204 22:00:05.373499 4842 scope.go:117] "RemoveContainer" containerID="1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b" Dec 04 22:00:05.375796 master-0 kubenswrapper[4842]: I1204 22:00:05.375732 4842 generic.go:334] "Generic (PLEG): container finished" podID="59d3d0d8-1a2a-4d14-8312-d33818acba88" containerID="d14cbc85e41a76d9831e3cb322a42ef6928588924655708cdbc5b0d0983944d9" exitCode=0 Dec 04 22:00:05.375796 master-0 kubenswrapper[4842]: I1204 22:00:05.375791 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerDied","Data":"d14cbc85e41a76d9831e3cb322a42ef6928588924655708cdbc5b0d0983944d9"} Dec 04 22:00:05.375936 master-0 kubenswrapper[4842]: I1204 22:00:05.375832 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b"} Dec 04 22:00:05.393114 master-0 kubenswrapper[4842]: I1204 22:00:05.392840 4842 scope.go:117] "RemoveContainer" containerID="059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17" Dec 04 22:00:05.407492 master-0 kubenswrapper[4842]: I1204 22:00:05.407430 4842 scope.go:117] "RemoveContainer" containerID="4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082" Dec 04 22:00:05.420723 master-0 kubenswrapper[4842]: I1204 22:00:05.420651 4842 scope.go:117] "RemoveContainer" containerID="4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7" Dec 04 22:00:05.437787 master-0 kubenswrapper[4842]: I1204 22:00:05.437725 4842 scope.go:117] "RemoveContainer" containerID="ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c" Dec 04 22:00:05.438037 master-0 kubenswrapper[4842]: I1204 22:00:05.437930 4842 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g6f8c"] Dec 04 22:00:05.444583 master-0 kubenswrapper[4842]: I1204 22:00:05.444464 4842 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-g6f8c"] Dec 04 22:00:05.474043 master-0 kubenswrapper[4842]: I1204 22:00:05.473991 4842 scope.go:117] "RemoveContainer" containerID="74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63" Dec 04 22:00:05.498703 master-0 kubenswrapper[4842]: I1204 22:00:05.498660 4842 scope.go:117] "RemoveContainer" containerID="4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52" Dec 04 22:00:05.509858 master-0 kubenswrapper[4842]: I1204 22:00:05.509824 4842 scope.go:117] "RemoveContainer" containerID="378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216" 
Dec 04 22:00:05.524250 master-0 kubenswrapper[4842]: I1204 22:00:05.524170 4842 scope.go:117] "RemoveContainer" containerID="f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414" Dec 04 22:00:05.538536 master-0 kubenswrapper[4842]: I1204 22:00:05.538464 4842 scope.go:117] "RemoveContainer" containerID="1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b" Dec 04 22:00:05.540026 master-0 kubenswrapper[4842]: E1204 22:00:05.539989 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b\": container with ID starting with 1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b not found: ID does not exist" containerID="1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b" Dec 04 22:00:05.540196 master-0 kubenswrapper[4842]: I1204 22:00:05.540040 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b"} err="failed to get container status \"1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b\": rpc error: code = NotFound desc = could not find container \"1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b\": container with ID starting with 1084bbe00f1f72a694ea80c40660157145c5cf825657c201aa1aaa315350466b not found: ID does not exist" Dec 04 22:00:05.540196 master-0 kubenswrapper[4842]: I1204 22:00:05.540174 4842 scope.go:117] "RemoveContainer" containerID="059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17" Dec 04 22:00:05.541296 master-0 kubenswrapper[4842]: E1204 22:00:05.541219 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17\": container with ID starting with 059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17 not found: ID does not exist" containerID="059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17" Dec 04 22:00:05.541363 master-0 kubenswrapper[4842]: I1204 22:00:05.541308 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17"} err="failed to get container status \"059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17\": rpc error: code = NotFound desc = could not find container \"059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17\": container with ID starting with 059eb9dc05920b78641314d90ae0954bc26827da5dffdfc75cbaefe54f6b2e17 not found: ID does not exist" Dec 04 22:00:05.541409 master-0 kubenswrapper[4842]: I1204 22:00:05.541365 4842 scope.go:117] "RemoveContainer" containerID="4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082" Dec 04 22:00:05.542109 master-0 kubenswrapper[4842]: E1204 22:00:05.542053 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082\": container with ID starting with 4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082 not found: ID does not exist" containerID="4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082" Dec 04 22:00:05.542206 master-0 kubenswrapper[4842]: I1204 22:00:05.542104 4842 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082"} err="failed to get container status \"4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082\": rpc error: code = NotFound desc = could not find container \"4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082\": container with ID starting with 4c4cadc6a908f04c3631e2a8585964e80753bf38af209ed8ca828b642bfe9082 not found: ID does not exist" Dec 04 22:00:05.542206 master-0 kubenswrapper[4842]: I1204 22:00:05.542133 4842 scope.go:117] "RemoveContainer" containerID="4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7" Dec 04 22:00:05.542661 master-0 kubenswrapper[4842]: E1204 22:00:05.542573 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7\": container with ID starting with 4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7 not found: ID does not exist" containerID="4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7" Dec 04 22:00:05.542784 master-0 kubenswrapper[4842]: I1204 22:00:05.542642 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7"} err="failed to get container status \"4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7\": rpc error: code = NotFound desc = could not find container \"4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7\": container with ID starting with 4a1374f7630bde51e9868746455e89db1fcb30c220b75215f08d5526e2ab35d7 not found: ID does not exist" Dec 04 22:00:05.542784 master-0 kubenswrapper[4842]: I1204 22:00:05.542688 4842 scope.go:117] "RemoveContainer" containerID="ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c" Dec 04 22:00:05.543489 master-0 kubenswrapper[4842]: E1204 22:00:05.543448 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c\": container with ID starting with ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c not found: ID does not exist" containerID="ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c" Dec 04 22:00:05.543592 master-0 kubenswrapper[4842]: I1204 22:00:05.543482 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c"} err="failed to get container status \"ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c\": rpc error: code = NotFound desc = could not find container \"ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c\": container with ID starting with ce3f262ee2e394cec492be78d6649f0769f15069ed9ca4231f4cf784b1d70a7c not found: ID does not exist" Dec 04 22:00:05.543592 master-0 kubenswrapper[4842]: I1204 22:00:05.543529 4842 scope.go:117] "RemoveContainer" containerID="74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63" Dec 04 22:00:05.543997 master-0 kubenswrapper[4842]: E1204 22:00:05.543964 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63\": container with ID starting with 
74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63 not found: ID does not exist" containerID="74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63" Dec 04 22:00:05.543997 master-0 kubenswrapper[4842]: I1204 22:00:05.543988 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63"} err="failed to get container status \"74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63\": rpc error: code = NotFound desc = could not find container \"74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63\": container with ID starting with 74ca0d1f907277aabb4fb6acd354ea9b08681b56cef93f4b152a90df87d71f63 not found: ID does not exist" Dec 04 22:00:05.544158 master-0 kubenswrapper[4842]: I1204 22:00:05.544001 4842 scope.go:117] "RemoveContainer" containerID="4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52" Dec 04 22:00:05.544677 master-0 kubenswrapper[4842]: E1204 22:00:05.544622 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52\": container with ID starting with 4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52 not found: ID does not exist" containerID="4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52" Dec 04 22:00:05.544677 master-0 kubenswrapper[4842]: I1204 22:00:05.544654 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52"} err="failed to get container status \"4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52\": rpc error: code = NotFound desc = could not find container \"4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52\": container with ID starting with 4e0ca2fe2ccd49ff749cd2597db40779e3631801f280f18e4eeb4b5e976fbf52 not found: ID does not exist" Dec 04 22:00:05.544677 master-0 kubenswrapper[4842]: I1204 22:00:05.544674 4842 scope.go:117] "RemoveContainer" containerID="378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216" Dec 04 22:00:05.545013 master-0 kubenswrapper[4842]: E1204 22:00:05.544964 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216\": container with ID starting with 378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216 not found: ID does not exist" containerID="378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216" Dec 04 22:00:05.545013 master-0 kubenswrapper[4842]: I1204 22:00:05.544992 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216"} err="failed to get container status \"378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216\": rpc error: code = NotFound desc = could not find container \"378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216\": container with ID starting with 378dad06c51d9f2812afdcabaa14c7b5a8baae8a5a5ca41b529d484d7c392216 not found: ID does not exist" Dec 04 22:00:05.545013 master-0 kubenswrapper[4842]: I1204 22:00:05.545009 4842 scope.go:117] "RemoveContainer" containerID="f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414" Dec 04 22:00:05.545434 master-0 
kubenswrapper[4842]: E1204 22:00:05.545384 4842 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414\": container with ID starting with f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414 not found: ID does not exist" containerID="f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414" Dec 04 22:00:05.545434 master-0 kubenswrapper[4842]: I1204 22:00:05.545421 4842 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414"} err="failed to get container status \"f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414\": rpc error: code = NotFound desc = could not find container \"f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414\": container with ID starting with f6d2649a1afea7b877574869dd3e20e9669816b4085e6160a79a4fec74b31414 not found: ID does not exist" Dec 04 22:00:05.996416 master-0 kubenswrapper[4842]: E1204 22:00:05.996312 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 22:00:06.152226 master-0 kubenswrapper[4842]: I1204 22:00:06.152132 4842 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ba9e92-a587-4ccb-84d2-2ac60f420ec0" path="/var/lib/kubelet/pods/57ba9e92-a587-4ccb-84d2-2ac60f420ec0/volumes" Dec 04 22:00:06.387987 master-0 kubenswrapper[4842]: I1204 22:00:06.387885 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"12c4aa15e4e79d5f90b97077c400d618cdd6a7f09f25df0096cef1db7225b99d"} Dec 04 22:00:06.387987 master-0 kubenswrapper[4842]: I1204 22:00:06.387940 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"7290093cc147531531f377286d1c48e3803031a7cc41744c297aa00505901855"} Dec 04 22:00:06.387987 master-0 kubenswrapper[4842]: I1204 22:00:06.387958 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"072d9d8a34bba009b433a8865da7ea50c856bf5a8fcc704a213e14db6134cc03"} Dec 04 22:00:06.387987 master-0 kubenswrapper[4842]: I1204 22:00:06.387972 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"55e5df41db2945971119fe0a034f6c7c7f38f1e44c695ddf59539c8fa0491a30"} Dec 04 22:00:06.387987 master-0 kubenswrapper[4842]: I1204 22:00:06.387985 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"5e65b113fd0cc6bd898a4b738d10907e4d7312801f7a18d6e95d69cd06443a6c"} Dec 04 22:00:06.387987 master-0 kubenswrapper[4842]: I1204 22:00:06.387999 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" 
event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"d74bdf11b81a168a1eb50f57289a14094033b1e6fe3938a39885cff3f029fbfd"} Dec 04 22:00:07.144741 master-0 kubenswrapper[4842]: I1204 22:00:07.144634 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:07.145007 master-0 kubenswrapper[4842]: I1204 22:00:07.144759 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:07.145222 master-0 kubenswrapper[4842]: E1204 22:00:07.145170 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:07.145387 master-0 kubenswrapper[4842]: E1204 22:00:07.145315 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:08.402824 master-0 kubenswrapper[4842]: I1204 22:00:08.402244 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"4164cab9c50a981b577e39cbb489f1522a739da479bf036162f662ad7cf84d9e"} Dec 04 22:00:09.145420 master-0 kubenswrapper[4842]: I1204 22:00:09.145269 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:09.145420 master-0 kubenswrapper[4842]: I1204 22:00:09.145329 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:09.145932 master-0 kubenswrapper[4842]: E1204 22:00:09.145478 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:09.145932 master-0 kubenswrapper[4842]: E1204 22:00:09.145826 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:10.997618 master-0 kubenswrapper[4842]: E1204 22:00:10.997362 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Dec 04 22:00:11.145280 master-0 kubenswrapper[4842]: I1204 22:00:11.145187 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:11.145676 master-0 kubenswrapper[4842]: I1204 22:00:11.145209 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:11.145676 master-0 kubenswrapper[4842]: E1204 22:00:11.145496 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:11.145676 master-0 kubenswrapper[4842]: E1204 22:00:11.145606 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:11.419023 master-0 kubenswrapper[4842]: I1204 22:00:11.418939 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"ff0f980e1e849c21f6412c540c1a8c9abeff149bce406310f67bdb69c4eae768"} Dec 04 22:00:11.419561 master-0 kubenswrapper[4842]: I1204 22:00:11.419472 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:11.419732 master-0 kubenswrapper[4842]: I1204 22:00:11.419636 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:11.493890 master-0 kubenswrapper[4842]: I1204 22:00:11.493805 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:11.797690 master-0 kubenswrapper[4842]: I1204 22:00:11.797563 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" podStartSLOduration=7.7974912530000005 podStartE2EDuration="7.797491253s" podCreationTimestamp="2025-12-04 22:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:00:11.795632943 +0000 UTC m=+142.110445118" watchObservedRunningTime="2025-12-04 22:00:11.797491253 +0000 UTC m=+142.112303488" Dec 04 22:00:12.278240 master-0 kubenswrapper[4842]: I1204 22:00:12.278149 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6jkkl"] Dec 04 22:00:12.279191 master-0 kubenswrapper[4842]: I1204 22:00:12.278336 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:12.279191 master-0 kubenswrapper[4842]: E1204 22:00:12.278444 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:12.281597 master-0 kubenswrapper[4842]: I1204 22:00:12.281443 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9pfhj"] Dec 04 22:00:12.281961 master-0 kubenswrapper[4842]: I1204 22:00:12.281640 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:12.281961 master-0 kubenswrapper[4842]: E1204 22:00:12.281756 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:12.423213 master-0 kubenswrapper[4842]: I1204 22:00:12.423126 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:12.451771 master-0 kubenswrapper[4842]: I1204 22:00:12.451680 4842 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:14.145130 master-0 kubenswrapper[4842]: I1204 22:00:14.144952 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:14.146402 master-0 kubenswrapper[4842]: E1204 22:00:14.145171 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:14.146402 master-0 kubenswrapper[4842]: I1204 22:00:14.145291 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:14.146402 master-0 kubenswrapper[4842]: E1204 22:00:14.145632 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:15.999651 master-0 kubenswrapper[4842]: E1204 22:00:15.999551 4842 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 04 22:00:16.144379 master-0 kubenswrapper[4842]: I1204 22:00:16.144255 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:16.144756 master-0 kubenswrapper[4842]: E1204 22:00:16.144490 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:16.144932 master-0 kubenswrapper[4842]: I1204 22:00:16.144851 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:16.145111 master-0 kubenswrapper[4842]: E1204 22:00:16.145047 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:18.144623 master-0 kubenswrapper[4842]: I1204 22:00:18.144462 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:18.145279 master-0 kubenswrapper[4842]: I1204 22:00:18.144521 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:18.145279 master-0 kubenswrapper[4842]: E1204 22:00:18.144719 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:18.145279 master-0 kubenswrapper[4842]: E1204 22:00:18.144899 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:20.145108 master-0 kubenswrapper[4842]: I1204 22:00:20.145031 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:20.145656 master-0 kubenswrapper[4842]: I1204 22:00:20.145127 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:20.146438 master-0 kubenswrapper[4842]: E1204 22:00:20.146301 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9pfhj" podUID="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" Dec 04 22:00:20.146438 master-0 kubenswrapper[4842]: E1204 22:00:20.146422 4842 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6jkkl" podUID="510a595a-21bf-48fc-85cd-707bc8f5536f" Dec 04 22:00:21.015141 master-0 kubenswrapper[4842]: I1204 22:00:21.014631 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:21.015526 master-0 kubenswrapper[4842]: E1204 22:00:21.014863 4842 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 22:00:21.015526 master-0 kubenswrapper[4842]: E1204 22:00:21.015300 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 22:01:25.015263211 +0000 UTC m=+215.330075436 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 22:00:22.145039 master-0 kubenswrapper[4842]: I1204 22:00:22.144806 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:22.146565 master-0 kubenswrapper[4842]: I1204 22:00:22.144806 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:22.147915 master-0 kubenswrapper[4842]: I1204 22:00:22.147849 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 22:00:22.148430 master-0 kubenswrapper[4842]: I1204 22:00:22.148371 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 22:00:22.148645 master-0 kubenswrapper[4842]: I1204 22:00:22.148442 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 22:00:26.872331 master-0 kubenswrapper[4842]: I1204 22:00:26.872227 4842 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Dec 04 22:00:26.918742 master-0 kubenswrapper[4842]: I1204 22:00:26.918614 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-f797b99b6-m9m4h"] Dec 04 22:00:26.919332 master-0 kubenswrapper[4842]: I1204 22:00:26.919265 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:26.921095 master-0 kubenswrapper[4842]: I1204 22:00:26.921006 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk"] Dec 04 22:00:26.922093 master-0 kubenswrapper[4842]: I1204 22:00:26.922030 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:26.922699 master-0 kubenswrapper[4842]: I1204 22:00:26.922638 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 22:00:26.923382 master-0 kubenswrapper[4842]: I1204 22:00:26.923323 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 22:00:26.925950 master-0 kubenswrapper[4842]: I1204 22:00:26.925880 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 22:00:26.926111 master-0 kubenswrapper[4842]: I1204 22:00:26.925978 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 22:00:26.926460 master-0 kubenswrapper[4842]: I1204 22:00:26.926415 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 22:00:26.926654 master-0 kubenswrapper[4842]: I1204 22:00:26.926602 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 22:00:26.927284 master-0 kubenswrapper[4842]: I1204 22:00:26.927245 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 22:00:26.927903 master-0 kubenswrapper[4842]: I1204 22:00:26.927870 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz"] Dec 04 22:00:26.928672 master-0 kubenswrapper[4842]: I1204 22:00:26.928641 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:26.930187 master-0 kubenswrapper[4842]: I1204 22:00:26.928739 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p"] Dec 04 22:00:26.930977 master-0 kubenswrapper[4842]: I1204 22:00:26.930942 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5"] Dec 04 22:00:26.931669 master-0 kubenswrapper[4842]: I1204 22:00:26.931638 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:26.932064 master-0 kubenswrapper[4842]: I1204 22:00:26.932003 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:26.932633 master-0 kubenswrapper[4842]: I1204 22:00:26.932606 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk"] Dec 04 22:00:26.933259 master-0 kubenswrapper[4842]: I1204 22:00:26.933223 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:00:26.937566 master-0 kubenswrapper[4842]: I1204 22:00:26.935674 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt"] Dec 04 22:00:26.937566 master-0 kubenswrapper[4842]: I1204 22:00:26.936376 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:26.937566 master-0 kubenswrapper[4842]: I1204 22:00:26.937175 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b"] Dec 04 22:00:26.938173 master-0 kubenswrapper[4842]: I1204 22:00:26.937785 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 04 22:00:26.938173 master-0 kubenswrapper[4842]: I1204 22:00:26.937794 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 22:00:26.938173 master-0 kubenswrapper[4842]: I1204 22:00:26.937842 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:26.938173 master-0 kubenswrapper[4842]: I1204 22:00:26.937832 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 22:00:26.938173 master-0 kubenswrapper[4842]: I1204 22:00:26.938161 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 04 22:00:26.938657 master-0 kubenswrapper[4842]: I1204 22:00:26.938207 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 22:00:26.938657 master-0 kubenswrapper[4842]: I1204 22:00:26.938193 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 04 22:00:26.938657 master-0 kubenswrapper[4842]: I1204 22:00:26.938103 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 22:00:26.938657 master-0 kubenswrapper[4842]: I1204 22:00:26.938105 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 22:00:26.938657 master-0 kubenswrapper[4842]: I1204 22:00:26.938392 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 22:00:26.939073 master-0 kubenswrapper[4842]: I1204 22:00:26.938794 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 22:00:26.955544 master-0 kubenswrapper[4842]: 
I1204 22:00:26.951836 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb"] Dec 04 22:00:26.955544 master-0 kubenswrapper[4842]: I1204 22:00:26.952724 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq"] Dec 04 22:00:26.955544 master-0 kubenswrapper[4842]: I1204 22:00:26.953184 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 22:00:26.955544 master-0 kubenswrapper[4842]: I1204 22:00:26.953842 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:26.955544 master-0 kubenswrapper[4842]: I1204 22:00:26.953915 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 22:00:26.955544 master-0 kubenswrapper[4842]: I1204 22:00:26.954886 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf"] Dec 04 22:00:26.956441 master-0 kubenswrapper[4842]: I1204 22:00:26.955601 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:26.959566 master-0 kubenswrapper[4842]: I1204 22:00:26.959490 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 04 22:00:26.971547 master-0 kubenswrapper[4842]: I1204 22:00:26.959834 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:26.980594 master-0 kubenswrapper[4842]: I1204 22:00:26.980166 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 04 22:00:26.982162 master-0 kubenswrapper[4842]: I1204 22:00:26.980859 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-7c56cf9b74-sshsd"] Dec 04 22:00:26.982162 master-0 kubenswrapper[4842]: I1204 22:00:26.981178 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 22:00:26.982162 master-0 kubenswrapper[4842]: I1204 22:00:26.981692 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn"] Dec 04 22:00:26.982162 master-0 kubenswrapper[4842]: I1204 22:00:26.982124 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-8649c48786-qlkgh"] Dec 04 22:00:26.982443 master-0 kubenswrapper[4842]: I1204 22:00:26.982297 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:26.982443 master-0 kubenswrapper[4842]: I1204 22:00:26.982145 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:26.982443 master-0 kubenswrapper[4842]: I1204 22:00:26.982411 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 22:00:26.982715 master-0 kubenswrapper[4842]: I1204 22:00:26.982683 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 04 22:00:26.982878 master-0 kubenswrapper[4842]: I1204 22:00:26.982819 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.985731 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.985963 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.986160 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg"] Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.986381 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.986434 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.986444 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb"] Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.986568 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.986903 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.987294 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz"] Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.987659 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.987786 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.987978 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.988044 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.988151 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.988181 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.988341 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx"] Dec 04 22:00:26.988532 master-0 kubenswrapper[4842]: I1204 22:00:26.988433 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 22:00:26.989411 master-0 kubenswrapper[4842]: I1204 22:00:26.988651 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 22:00:26.989411 master-0 kubenswrapper[4842]: I1204 22:00:26.988899 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:26.989411 master-0 kubenswrapper[4842]: I1204 22:00:26.988922 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:00:26.989597 master-0 kubenswrapper[4842]: I1204 22:00:26.989545 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:26.993135 master-0 kubenswrapper[4842]: I1204 22:00:26.993088 4842 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp"] Dec 04 22:00:26.993815 master-0 kubenswrapper[4842]: I1204 22:00:26.993705 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:26.994105 master-0 kubenswrapper[4842]: I1204 22:00:26.994075 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz"] Dec 04 22:00:26.995983 master-0 kubenswrapper[4842]: I1204 22:00:26.995950 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 22:00:26.996411 master-0 kubenswrapper[4842]: I1204 22:00:26.996377 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 22:00:26.997950 master-0 kubenswrapper[4842]: I1204 22:00:26.997195 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.000745 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.000841 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.000997 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.001147 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.001450 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.001660 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.001791 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.001919 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 22:00:27.002525 master-0 kubenswrapper[4842]: I1204 22:00:27.002049 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 04 22:00:27.002911 master-0 kubenswrapper[4842]: I1204 22:00:27.002699 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 22:00:27.002911 master-0 kubenswrapper[4842]: I1204 22:00:27.002889 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 22:00:27.003136 master-0 kubenswrapper[4842]: I1204 22:00:27.003041 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 22:00:27.003187 master-0 kubenswrapper[4842]: I1204 22:00:27.003173 4842 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.003307 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.003467 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.003805 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.004225 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.004435 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.004642 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p"] Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.004697 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5"] Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.004913 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.005159 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.005275 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.005674 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.005822 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.006024 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.006114 4842 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.006146 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn"] Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.006316 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 22:00:27.007046 master-0 kubenswrapper[4842]: I1204 22:00:27.006661 4842 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-c747h"] Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007116 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007331 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwk6f\" (UniqueName: \"kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007389 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w592\" (UniqueName: \"kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007439 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007477 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mttq\" (UniqueName: \"kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007538 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007577 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007619 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" 
Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007659 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007700 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007739 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007798 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007833 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r57bb\" (UniqueName: \"kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb\") pod \"csi-snapshot-controller-operator-6bc8656fdc-xhndk\" (UID: \"e37d318a-5bf8-46ed-b6de-494102738da7\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007868 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.008090 master-0 kubenswrapper[4842]: I1204 22:00:27.007926 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:27.008806 master-0 kubenswrapper[4842]: I1204 22:00:27.008096 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8d54\" (UniqueName: 
\"kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.008806 master-0 kubenswrapper[4842]: I1204 22:00:27.008126 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.008806 master-0 kubenswrapper[4842]: I1204 22:00:27.008149 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfklr\" (UniqueName: \"kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:27.008806 master-0 kubenswrapper[4842]: I1204 22:00:27.008231 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk"] Dec 04 22:00:27.010088 master-0 kubenswrapper[4842]: I1204 22:00:27.009708 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 04 22:00:27.010088 master-0 kubenswrapper[4842]: I1204 22:00:27.009807 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt"] Dec 04 22:00:27.010088 master-0 kubenswrapper[4842]: I1204 22:00:27.009940 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 22:00:27.011139 master-0 kubenswrapper[4842]: I1204 22:00:27.011084 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb"] Dec 04 22:00:27.012462 master-0 kubenswrapper[4842]: I1204 22:00:27.012010 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 22:00:27.022652 master-0 kubenswrapper[4842]: I1204 22:00:27.020041 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg"] Dec 04 22:00:27.022652 master-0 kubenswrapper[4842]: I1204 22:00:27.021029 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 04 22:00:27.025377 master-0 kubenswrapper[4842]: I1204 22:00:27.025127 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 22:00:27.028458 master-0 kubenswrapper[4842]: I1204 22:00:27.028411 4842 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 22:00:27.028821 master-0 kubenswrapper[4842]: I1204 22:00:27.028776 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz"] Dec 04 22:00:27.030166 master-0 kubenswrapper[4842]: I1204 22:00:27.030128 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b"] Dec 04 22:00:27.031078 master-0 kubenswrapper[4842]: I1204 22:00:27.031044 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-8649c48786-qlkgh"] Dec 04 22:00:27.031830 master-0 kubenswrapper[4842]: I1204 22:00:27.031799 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp"] Dec 04 22:00:27.032752 master-0 kubenswrapper[4842]: I1204 22:00:27.032722 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx"] Dec 04 22:00:27.033540 master-0 kubenswrapper[4842]: I1204 22:00:27.033491 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-7c56cf9b74-sshsd"] Dec 04 22:00:27.035288 master-0 kubenswrapper[4842]: I1204 22:00:27.035245 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq"] Dec 04 22:00:27.035346 master-0 kubenswrapper[4842]: I1204 22:00:27.035306 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf"] Dec 04 22:00:27.035951 master-0 kubenswrapper[4842]: I1204 22:00:27.035920 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb"] Dec 04 22:00:27.036889 master-0 kubenswrapper[4842]: I1204 22:00:27.036857 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-f797b99b6-m9m4h"] Dec 04 22:00:27.037680 master-0 kubenswrapper[4842]: I1204 22:00:27.037650 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk"] Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109362 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvrr5\" (UniqueName: \"kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109491 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109567 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 
22:00:27.109624 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8d54\" (UniqueName: \"kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109662 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfklr\" (UniqueName: \"kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109707 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109743 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109783 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxkk\" (UniqueName: \"kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109818 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwk6f\" (UniqueName: \"kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.110208 master-0 kubenswrapper[4842]: I1204 22:00:27.109855 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110238 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " 
pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110312 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110815 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110841 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110864 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110883 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110901 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110921 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.111112 master-0 kubenswrapper[4842]: I1204 22:00:27.110966 4842 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111143 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111166 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111186 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111207 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111226 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111247 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nkh\" (UniqueName: \"kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111270 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r57bb\" (UniqueName: \"kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb\") pod \"csi-snapshot-controller-operator-6bc8656fdc-xhndk\" (UID: \"e37d318a-5bf8-46ed-b6de-494102738da7\") " 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111292 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111337 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111358 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111379 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111399 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111422 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111442 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.111709 master-0 kubenswrapper[4842]: I1204 22:00:27.111464 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111484 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111524 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111545 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111567 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4cct\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111587 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111607 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5t2f\" (UniqueName: \"kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111624 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.113168 master-0 
kubenswrapper[4842]: I1204 22:00:27.111632 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111645 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w592\" (UniqueName: \"kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111665 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7l9n\" (UniqueName: \"kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111688 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111708 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111732 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mttq\" (UniqueName: \"kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111753 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.113168 master-0 kubenswrapper[4842]: I1204 22:00:27.111778 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr65l\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " 
pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111795 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111815 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111833 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111855 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111883 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111905 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scht6\" (UniqueName: \"kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111927 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ndk\" (UniqueName: \"kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111945 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqqt\" (UniqueName: 
\"kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111937 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111970 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcw8f\" (UniqueName: \"kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.111992 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: I1204 22:00:27.112013 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: E1204 22:00:27.112110 4842 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:27.114823 master-0 kubenswrapper[4842]: E1204 22:00:27.112158 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:27.612138039 +0000 UTC m=+157.926950224 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: I1204 22:00:27.112681 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: I1204 22:00:27.112725 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: I1204 22:00:27.112791 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: I1204 22:00:27.112830 4842 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: I1204 22:00:27.112908 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: I1204 22:00:27.113234 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: E1204 22:00:27.113876 4842 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: E1204 22:00:27.113904 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. 
No retries permitted until 2025-12-04 22:00:27.613894815 +0000 UTC m=+157.928706990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: I1204 22:00:27.114283 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:27.116817 master-0 kubenswrapper[4842]: I1204 22:00:27.115004 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.118617 master-0 kubenswrapper[4842]: I1204 22:00:27.118492 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.119106 master-0 kubenswrapper[4842]: I1204 22:00:27.119046 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.119358 master-0 kubenswrapper[4842]: I1204 22:00:27.119296 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.132122 master-0 kubenswrapper[4842]: I1204 22:00:27.132004 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfklr\" (UniqueName: \"kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:27.135478 master-0 kubenswrapper[4842]: I1204 22:00:27.135444 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8d54\" (UniqueName: \"kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 
22:00:27.135871 master-0 kubenswrapper[4842]: I1204 22:00:27.135833 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwk6f\" (UniqueName: \"kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.138290 master-0 kubenswrapper[4842]: I1204 22:00:27.138219 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r57bb\" (UniqueName: \"kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb\") pod \"csi-snapshot-controller-operator-6bc8656fdc-xhndk\" (UID: \"e37d318a-5bf8-46ed-b6de-494102738da7\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:00:27.140675 master-0 kubenswrapper[4842]: I1204 22:00:27.140641 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mttq\" (UniqueName: \"kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.143173 master-0 kubenswrapper[4842]: I1204 22:00:27.142744 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w592\" (UniqueName: \"kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:27.213714 master-0 kubenswrapper[4842]: I1204 22:00:27.213662 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.213786 master-0 kubenswrapper[4842]: I1204 22:00:27.213718 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.213786 master-0 kubenswrapper[4842]: I1204 22:00:27.213743 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.213786 master-0 kubenswrapper[4842]: I1204 22:00:27.213769 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.213874 master-0 kubenswrapper[4842]: I1204 22:00:27.213795 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.213874 master-0 kubenswrapper[4842]: I1204 22:00:27.213820 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.213874 master-0 kubenswrapper[4842]: I1204 22:00:27.213840 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.213874 master-0 kubenswrapper[4842]: I1204 22:00:27.213866 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nkh\" (UniqueName: \"kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.213978 master-0 kubenswrapper[4842]: I1204 22:00:27.213888 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.213978 master-0 kubenswrapper[4842]: I1204 22:00:27.213911 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.213978 master-0 kubenswrapper[4842]: I1204 22:00:27.213935 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.213978 master-0 kubenswrapper[4842]: I1204 22:00:27.213956 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: 
\"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.214092 master-0 kubenswrapper[4842]: I1204 22:00:27.213982 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.214092 master-0 kubenswrapper[4842]: I1204 22:00:27.214005 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.214092 master-0 kubenswrapper[4842]: I1204 22:00:27.214029 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.214092 master-0 kubenswrapper[4842]: I1204 22:00:27.214050 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.214092 master-0 kubenswrapper[4842]: I1204 22:00:27.214079 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.214224 master-0 kubenswrapper[4842]: I1204 22:00:27.214127 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.214224 master-0 kubenswrapper[4842]: I1204 22:00:27.214151 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7l9n\" (UniqueName: \"kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.214224 master-0 kubenswrapper[4842]: I1204 22:00:27.214171 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cct\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct\") pod 
\"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.214224 master-0 kubenswrapper[4842]: I1204 22:00:27.214191 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.214224 master-0 kubenswrapper[4842]: I1204 22:00:27.214212 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5t2f\" (UniqueName: \"kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.214357 master-0 kubenswrapper[4842]: I1204 22:00:27.214235 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.214357 master-0 kubenswrapper[4842]: I1204 22:00:27.214259 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.214357 master-0 kubenswrapper[4842]: I1204 22:00:27.214279 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.214357 master-0 kubenswrapper[4842]: I1204 22:00:27.214302 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.214357 master-0 kubenswrapper[4842]: I1204 22:00:27.214323 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.214357 master-0 kubenswrapper[4842]: I1204 22:00:27.214345 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr65l\" (UniqueName: 
\"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214381 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214416 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scht6\" (UniqueName: \"kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214435 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ndk\" (UniqueName: \"kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214457 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214482 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcw8f\" (UniqueName: \"kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214523 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214544 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214565 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.214591 master-0 kubenswrapper[4842]: I1204 22:00:27.214588 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:27.214828 master-0 kubenswrapper[4842]: I1204 22:00:27.214638 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrr5\" (UniqueName: \"kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.214828 master-0 kubenswrapper[4842]: I1204 22:00:27.214664 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.214828 master-0 kubenswrapper[4842]: I1204 22:00:27.214700 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.214828 master-0 kubenswrapper[4842]: I1204 22:00:27.214732 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:27.214828 master-0 kubenswrapper[4842]: I1204 22:00:27.214753 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.214828 master-0 kubenswrapper[4842]: I1204 22:00:27.214779 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxkk\" (UniqueName: \"kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.214828 master-0 kubenswrapper[4842]: I1204 22:00:27.214801 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.214828 master-0 kubenswrapper[4842]: I1204 22:00:27.214823 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.215029 master-0 kubenswrapper[4842]: I1204 22:00:27.214845 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.216149 master-0 kubenswrapper[4842]: I1204 22:00:27.216111 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.216997 master-0 kubenswrapper[4842]: I1204 22:00:27.216722 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.217407 master-0 kubenswrapper[4842]: I1204 22:00:27.217327 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.218083 master-0 kubenswrapper[4842]: E1204 22:00:27.218040 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:27.218159 master-0 kubenswrapper[4842]: E1204 22:00:27.218136 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:27.718117094 +0000 UTC m=+158.032929479 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:27.218216 master-0 kubenswrapper[4842]: I1204 22:00:27.218168 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.218254 master-0 kubenswrapper[4842]: I1204 22:00:27.218214 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.218350 master-0 kubenswrapper[4842]: I1204 22:00:27.218055 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.218422 master-0 kubenswrapper[4842]: E1204 22:00:27.218399 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:27.218454 master-0 kubenswrapper[4842]: E1204 22:00:27.218440 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:27.718428682 +0000 UTC m=+158.033240857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:27.218618 master-0 kubenswrapper[4842]: I1204 22:00:27.218567 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.219301 master-0 kubenswrapper[4842]: E1204 22:00:27.217483 4842 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:27.219421 master-0 kubenswrapper[4842]: E1204 22:00:27.219373 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. 
No retries permitted until 2025-12-04 22:00:27.719347397 +0000 UTC m=+158.034159582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:27.219536 master-0 kubenswrapper[4842]: E1204 22:00:27.219492 4842 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:27.219592 master-0 kubenswrapper[4842]: E1204 22:00:27.219566 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:27.719549772 +0000 UTC m=+158.034361957 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:27.219673 master-0 kubenswrapper[4842]: E1204 22:00:27.219653 4842 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:27.219720 master-0 kubenswrapper[4842]: E1204 22:00:27.219704 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:27.719693876 +0000 UTC m=+158.034506061 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:27.220123 master-0 kubenswrapper[4842]: I1204 22:00:27.220079 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.220254 master-0 kubenswrapper[4842]: I1204 22:00:27.220192 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.220300 master-0 kubenswrapper[4842]: E1204 22:00:27.220230 4842 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:27.220340 master-0 kubenswrapper[4842]: I1204 22:00:27.220309 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.220379 master-0 kubenswrapper[4842]: E1204 22:00:27.220343 4842 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:27.220433 master-0 kubenswrapper[4842]: E1204 22:00:27.220351 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:27.720321522 +0000 UTC m=+158.035133747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:27.220487 master-0 kubenswrapper[4842]: E1204 22:00:27.220461 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:27.720437495 +0000 UTC m=+158.035249690 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:27.220610 master-0 kubenswrapper[4842]: I1204 22:00:27.220561 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.220824 master-0 kubenswrapper[4842]: I1204 22:00:27.220757 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.220882 master-0 kubenswrapper[4842]: I1204 22:00:27.220823 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.221375 master-0 kubenswrapper[4842]: I1204 22:00:27.221242 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.221969 master-0 kubenswrapper[4842]: I1204 22:00:27.221926 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.222165 master-0 kubenswrapper[4842]: I1204 22:00:27.222115 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.222233 master-0 kubenswrapper[4842]: I1204 22:00:27.222179 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.224784 master-0 kubenswrapper[4842]: I1204 22:00:27.224079 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.224784 master-0 kubenswrapper[4842]: I1204 22:00:27.224081 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.224784 master-0 kubenswrapper[4842]: I1204 22:00:27.224662 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.226239 master-0 kubenswrapper[4842]: I1204 22:00:27.225470 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.234058 master-0 kubenswrapper[4842]: I1204 22:00:27.234014 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.235553 master-0 kubenswrapper[4842]: I1204 22:00:27.235515 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.236863 master-0 kubenswrapper[4842]: I1204 22:00:27.236812 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:27.237042 master-0 kubenswrapper[4842]: I1204 22:00:27.237005 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ndk\" (UniqueName: \"kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.237864 master-0 kubenswrapper[4842]: I1204 22:00:27.237825 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcw8f\" 
(UniqueName: \"kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:27.239063 master-0 kubenswrapper[4842]: I1204 22:00:27.239012 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nkh\" (UniqueName: \"kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.239662 master-0 kubenswrapper[4842]: I1204 22:00:27.239626 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7l9n\" (UniqueName: \"kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.242280 master-0 kubenswrapper[4842]: I1204 22:00:27.242230 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.245439 master-0 kubenswrapper[4842]: I1204 22:00:27.245388 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scht6\" (UniqueName: \"kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.254706 master-0 kubenswrapper[4842]: I1204 22:00:27.254644 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5t2f\" (UniqueName: \"kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.280340 master-0 kubenswrapper[4842]: I1204 22:00:27.280218 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrr5\" (UniqueName: \"kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.300470 master-0 kubenswrapper[4842]: I1204 22:00:27.300396 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.321008 master-0 kubenswrapper[4842]: I1204 22:00:27.320906 4842 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.323486 master-0 kubenswrapper[4842]: I1204 22:00:27.323430 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:27.334027 master-0 kubenswrapper[4842]: I1204 22:00:27.333948 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:27.345796 master-0 kubenswrapper[4842]: I1204 22:00:27.345707 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr65l\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.355227 master-0 kubenswrapper[4842]: I1204 22:00:27.355150 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:27.362697 master-0 kubenswrapper[4842]: I1204 22:00:27.362656 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:00:27.369252 master-0 kubenswrapper[4842]: I1204 22:00:27.368831 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:27.378213 master-0 kubenswrapper[4842]: I1204 22:00:27.378089 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:27.379720 master-0 kubenswrapper[4842]: I1204 22:00:27.379663 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cct\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.384076 master-0 kubenswrapper[4842]: I1204 22:00:27.383963 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxkk\" (UniqueName: \"kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.400248 master-0 kubenswrapper[4842]: I1204 22:00:27.399918 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:27.402061 master-0 kubenswrapper[4842]: I1204 22:00:27.402009 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.421634 master-0 kubenswrapper[4842]: I1204 22:00:27.420273 4842 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.421634 master-0 kubenswrapper[4842]: I1204 22:00:27.420584 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:27.457844 master-0 kubenswrapper[4842]: I1204 22:00:27.457396 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:27.502540 master-0 kubenswrapper[4842]: I1204 22:00:27.495288 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:27.515974 master-0 kubenswrapper[4842]: I1204 22:00:27.512708 4842 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:27.532511 master-0 kubenswrapper[4842]: I1204 22:00:27.532435 4842 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:27.544996 master-0 kubenswrapper[4842]: W1204 22:00:27.544687 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcedb0b3e_674e_40b9_a10d_45a9f0c5c59c.slice/crio-32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0 WatchSource:0}: Error finding container 32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0: Status 404 returned error can't find the container with id 32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0 Dec 04 22:00:27.621303 master-0 kubenswrapper[4842]: I1204 22:00:27.621043 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:27.621303 master-0 kubenswrapper[4842]: I1204 22:00:27.621088 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:27.621960 master-0 kubenswrapper[4842]: E1204 22:00:27.621862 4842 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:27.625108 master-0 kubenswrapper[4842]: E1204 22:00:27.624060 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:28.621910912 +0000 UTC m=+158.936723097 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:27.625108 master-0 kubenswrapper[4842]: E1204 22:00:27.624769 4842 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:27.625108 master-0 kubenswrapper[4842]: E1204 22:00:27.624824 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:28.624810299 +0000 UTC m=+158.939622664 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:27.642768 master-0 kubenswrapper[4842]: I1204 22:00:27.642636 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk"] Dec 04 22:00:27.642768 master-0 kubenswrapper[4842]: I1204 22:00:27.642694 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz"] Dec 04 22:00:27.656796 master-0 kubenswrapper[4842]: I1204 22:00:27.655582 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p"] Dec 04 22:00:27.656976 master-0 kubenswrapper[4842]: W1204 22:00:27.656914 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf893663c_7c1e_4eda_9839_99c1c0440304.slice/crio-58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f WatchSource:0}: Error finding container 58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f: Status 404 returned error can't find the container with id 58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f Dec 04 22:00:27.658654 master-0 kubenswrapper[4842]: W1204 22:00:27.658600 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46229484_5fa1_4595_94a0_44477abae90e.slice/crio-51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e WatchSource:0}: Error finding container 51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e: Status 404 returned error can't find the container with id 51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e Dec 04 22:00:27.670453 master-0 kubenswrapper[4842]: W1204 22:00:27.670392 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod465637a4_42be_4a65_a859_7af699960138.slice/crio-ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b WatchSource:0}: Error finding container ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b: Status 404 returned error can't find the container with id ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b Dec 04 22:00:27.723186 master-0 kubenswrapper[4842]: I1204 22:00:27.723133 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.723285 master-0 kubenswrapper[4842]: I1204 22:00:27.723224 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:27.723285 master-0 kubenswrapper[4842]: I1204 22:00:27.723260 4842 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:27.723344 master-0 kubenswrapper[4842]: I1204 22:00:27.723287 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:27.723344 master-0 kubenswrapper[4842]: I1204 22:00:27.723310 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:27.723344 master-0 kubenswrapper[4842]: I1204 22:00:27.723333 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:27.723444 master-0 kubenswrapper[4842]: I1204 22:00:27.723352 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:27.723533 master-0 kubenswrapper[4842]: E1204 22:00:27.723482 4842 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:27.723575 master-0 kubenswrapper[4842]: E1204 22:00:27.723563 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:28.723544702 +0000 UTC m=+159.038356887 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:27.724014 master-0 kubenswrapper[4842]: E1204 22:00:27.723982 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:27.724064 master-0 kubenswrapper[4842]: E1204 22:00:27.724019 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:28.724011394 +0000 UTC m=+159.038823579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:27.724064 master-0 kubenswrapper[4842]: E1204 22:00:27.724056 4842 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:27.724123 master-0 kubenswrapper[4842]: E1204 22:00:27.724075 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:28.724069725 +0000 UTC m=+159.038881910 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:27.724123 master-0 kubenswrapper[4842]: E1204 22:00:27.724112 4842 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:27.724184 master-0 kubenswrapper[4842]: E1204 22:00:27.724128 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:28.724122957 +0000 UTC m=+159.038935132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:27.724184 master-0 kubenswrapper[4842]: E1204 22:00:27.724177 4842 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:27.724245 master-0 kubenswrapper[4842]: E1204 22:00:27.724194 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. 
No retries permitted until 2025-12-04 22:00:28.724189739 +0000 UTC m=+159.039001924 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:27.726168 master-0 kubenswrapper[4842]: E1204 22:00:27.725866 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:27.726168 master-0 kubenswrapper[4842]: E1204 22:00:27.725959 4842 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:27.726168 master-0 kubenswrapper[4842]: E1204 22:00:27.725977 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:28.725947246 +0000 UTC m=+159.040759431 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:27.726168 master-0 kubenswrapper[4842]: E1204 22:00:27.725997 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:28.725986917 +0000 UTC m=+159.040799102 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:27.741628 master-0 kubenswrapper[4842]: I1204 22:00:27.739097 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn"] Dec 04 22:00:27.750800 master-0 kubenswrapper[4842]: W1204 22:00:27.750754 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24648a41_875f_4e98_8b21_3bdd38dffa32.slice/crio-aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376 WatchSource:0}: Error finding container aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376: Status 404 returned error can't find the container with id aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376 Dec 04 22:00:27.776408 master-0 kubenswrapper[4842]: I1204 22:00:27.776361 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp"] Dec 04 22:00:27.782936 master-0 kubenswrapper[4842]: W1204 22:00:27.782849 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode065179e_634a_4cbe_bb59_5b01c514e4de.slice/crio-faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0 WatchSource:0}: Error finding container faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0: Status 404 returned error can't find the container with id faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0 Dec 04 22:00:27.792495 master-0 kubenswrapper[4842]: I1204 22:00:27.792405 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx"] Dec 04 22:00:27.796538 master-0 kubenswrapper[4842]: W1204 22:00:27.796473 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod690b447a_19c0_4925_bc9d_d0c86a83a377.slice/crio-e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223 WatchSource:0}: Error finding container e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223: Status 404 returned error can't find the container with id e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223 Dec 04 22:00:27.890764 master-0 kubenswrapper[4842]: I1204 22:00:27.890597 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt"] Dec 04 22:00:27.896843 master-0 kubenswrapper[4842]: I1204 22:00:27.896763 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb"] Dec 04 22:00:27.898362 master-0 kubenswrapper[4842]: W1204 22:00:27.898299 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda544105a_5bec_456a_aef6_c160943c1f67.slice/crio-66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878 WatchSource:0}: Error finding container 66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878: Status 404 returned error can't find the container 
with id 66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878 Dec 04 22:00:27.899409 master-0 kubenswrapper[4842]: I1204 22:00:27.899375 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf"] Dec 04 22:00:27.901947 master-0 kubenswrapper[4842]: W1204 22:00:27.901892 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f091088_2166_4026_9fa6_62bd83407edb.slice/crio-6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b WatchSource:0}: Error finding container 6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b: Status 404 returned error can't find the container with id 6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b Dec 04 22:00:27.903414 master-0 kubenswrapper[4842]: I1204 22:00:27.903361 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk"] Dec 04 22:00:27.904143 master-0 kubenswrapper[4842]: W1204 22:00:27.904053 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb419e4_d804_4111_b8d8_8436cc2ee617.slice/crio-969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a WatchSource:0}: Error finding container 969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a: Status 404 returned error can't find the container with id 969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a Dec 04 22:00:27.913165 master-0 kubenswrapper[4842]: W1204 22:00:27.913098 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37d318a_5bf8_46ed_b6de_494102738da7.slice/crio-ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2 WatchSource:0}: Error finding container ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2: Status 404 returned error can't find the container with id ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2 Dec 04 22:00:27.940028 master-0 kubenswrapper[4842]: I1204 22:00:27.939857 4842 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg"] Dec 04 22:00:27.947807 master-0 kubenswrapper[4842]: W1204 22:00:27.947763 4842 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f25fad_089d_4df6_abb1_10d4c76750f1.slice/crio-53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde WatchSource:0}: Error finding container 53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde: Status 404 returned error can't find the container with id 53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde Dec 04 22:00:28.493425 master-0 kubenswrapper[4842]: I1204 22:00:28.493217 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" event={"ID":"56f25fad-089d-4df6-abb1-10d4c76750f1","Type":"ContainerStarted","Data":"d8a2de466dc95e948ba536210f040992057ba7bc222a8102fb88249ab34f040a"} Dec 04 22:00:28.493425 master-0 kubenswrapper[4842]: I1204 22:00:28.493305 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" 
event={"ID":"56f25fad-089d-4df6-abb1-10d4c76750f1","Type":"ContainerStarted","Data":"53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde"} Dec 04 22:00:28.496923 master-0 kubenswrapper[4842]: I1204 22:00:28.496868 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerStarted","Data":"aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376"} Dec 04 22:00:28.499773 master-0 kubenswrapper[4842]: I1204 22:00:28.499733 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerStarted","Data":"58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f"} Dec 04 22:00:28.501026 master-0 kubenswrapper[4842]: I1204 22:00:28.501005 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c747h" event={"ID":"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c","Type":"ContainerStarted","Data":"32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0"} Dec 04 22:00:28.502294 master-0 kubenswrapper[4842]: I1204 22:00:28.502234 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerStarted","Data":"969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a"} Dec 04 22:00:28.504267 master-0 kubenswrapper[4842]: I1204 22:00:28.503811 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerStarted","Data":"ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b"} Dec 04 22:00:28.506085 master-0 kubenswrapper[4842]: I1204 22:00:28.506033 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerStarted","Data":"51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e"} Dec 04 22:00:28.507763 master-0 kubenswrapper[4842]: I1204 22:00:28.507513 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerStarted","Data":"e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223"} Dec 04 22:00:28.509488 master-0 kubenswrapper[4842]: I1204 22:00:28.509458 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerStarted","Data":"66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878"} Dec 04 22:00:28.510669 master-0 kubenswrapper[4842]: I1204 22:00:28.510611 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" event={"ID":"e37d318a-5bf8-46ed-b6de-494102738da7","Type":"ContainerStarted","Data":"ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2"} Dec 04 22:00:28.511885 master-0 kubenswrapper[4842]: I1204 22:00:28.511855 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" event={"ID":"e065179e-634a-4cbe-bb59-5b01c514e4de","Type":"ContainerStarted","Data":"faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0"} Dec 04 22:00:28.512932 master-0 kubenswrapper[4842]: I1204 22:00:28.512891 4842 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerStarted","Data":"6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b"} Dec 04 22:00:28.635808 master-0 kubenswrapper[4842]: I1204 22:00:28.635710 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:28.635808 master-0 kubenswrapper[4842]: I1204 22:00:28.635821 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:28.636176 master-0 kubenswrapper[4842]: E1204 22:00:28.636131 4842 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:28.636282 master-0 kubenswrapper[4842]: E1204 22:00:28.636263 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:30.636224519 +0000 UTC m=+160.951036874 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:28.636349 master-0 kubenswrapper[4842]: E1204 22:00:28.636288 4842 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:28.636419 master-0 kubenswrapper[4842]: E1204 22:00:28.636400 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:30.636370623 +0000 UTC m=+160.951182808 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:28.737956 master-0 kubenswrapper[4842]: I1204 22:00:28.737889 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:28.737956 master-0 kubenswrapper[4842]: I1204 22:00:28.737948 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:28.738264 master-0 kubenswrapper[4842]: E1204 22:00:28.738160 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:28.738320 master-0 kubenswrapper[4842]: E1204 22:00:28.738282 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:30.738247689 +0000 UTC m=+161.053060044 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:28.738851 master-0 kubenswrapper[4842]: I1204 22:00:28.738810 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:28.739041 master-0 kubenswrapper[4842]: I1204 22:00:28.739015 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:28.739127 master-0 kubenswrapper[4842]: I1204 22:00:28.739101 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:28.739178 master-0 kubenswrapper[4842]: I1204 22:00:28.739141 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:28.739216 master-0 kubenswrapper[4842]: I1204 22:00:28.739188 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:28.739326 master-0 kubenswrapper[4842]: E1204 22:00:28.739310 4842 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:28.739366 master-0 kubenswrapper[4842]: E1204 22:00:28.739346 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:30.739335519 +0000 UTC m=+161.054147884 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:28.739426 master-0 kubenswrapper[4842]: E1204 22:00:28.739404 4842 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:28.739465 master-0 kubenswrapper[4842]: E1204 22:00:28.739441 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:30.739430211 +0000 UTC m=+161.054242396 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:28.739521 master-0 kubenswrapper[4842]: E1204 22:00:28.739491 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:28.739566 master-0 kubenswrapper[4842]: E1204 22:00:28.739541 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:30.739530894 +0000 UTC m=+161.054343079 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:28.739675 master-0 kubenswrapper[4842]: E1204 22:00:28.739601 4842 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:28.739675 master-0 kubenswrapper[4842]: E1204 22:00:28.739635 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:30.739626946 +0000 UTC m=+161.054439131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:28.739741 master-0 kubenswrapper[4842]: E1204 22:00:28.739685 4842 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:28.739741 master-0 kubenswrapper[4842]: E1204 22:00:28.739720 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. 
No retries permitted until 2025-12-04 22:00:30.739708669 +0000 UTC m=+161.054520864 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:28.739874 master-0 kubenswrapper[4842]: E1204 22:00:28.739772 4842 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:28.739874 master-0 kubenswrapper[4842]: E1204 22:00:28.739798 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:30.739790141 +0000 UTC m=+161.054602326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:30.181296 master-0 kubenswrapper[4842]: I1204 22:00:30.181198 4842 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" podStartSLOduration=125.181170724 podStartE2EDuration="2m5.181170724s" podCreationTimestamp="2025-12-04 21:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:00:28.510242682 +0000 UTC m=+158.825054867" watchObservedRunningTime="2025-12-04 22:00:30.181170724 +0000 UTC m=+160.495982909" Dec 04 22:00:30.666101 master-0 kubenswrapper[4842]: I1204 22:00:30.666013 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:30.666101 master-0 kubenswrapper[4842]: I1204 22:00:30.666103 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:30.666445 master-0 kubenswrapper[4842]: E1204 22:00:30.666367 4842 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:30.666567 master-0 kubenswrapper[4842]: E1204 22:00:30.666536 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.666464026 +0000 UTC m=+164.981276251 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:30.666667 master-0 kubenswrapper[4842]: E1204 22:00:30.666596 4842 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:30.666727 master-0 kubenswrapper[4842]: E1204 22:00:30.666720 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.666691593 +0000 UTC m=+164.981503778 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:30.768306 master-0 kubenswrapper[4842]: I1204 22:00:30.767813 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:30.768562 master-0 kubenswrapper[4842]: I1204 22:00:30.768330 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:30.768562 master-0 kubenswrapper[4842]: I1204 22:00:30.768374 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:30.768562 master-0 kubenswrapper[4842]: E1204 22:00:30.768080 4842 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:30.768562 master-0 kubenswrapper[4842]: I1204 22:00:30.768421 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:30.768562 master-0 kubenswrapper[4842]: E1204 22:00:30.768520 4842 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.768470237 +0000 UTC m=+165.083282432 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:30.768798 master-0 kubenswrapper[4842]: E1204 22:00:30.768669 4842 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:30.768798 master-0 kubenswrapper[4842]: E1204 22:00:30.768755 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.768723664 +0000 UTC m=+165.083535879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:30.768880 master-0 kubenswrapper[4842]: E1204 22:00:30.768801 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:30.768880 master-0 kubenswrapper[4842]: I1204 22:00:30.768804 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:30.768880 master-0 kubenswrapper[4842]: E1204 22:00:30.768836 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.768826556 +0000 UTC m=+165.083638761 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:30.769041 master-0 kubenswrapper[4842]: I1204 22:00:30.769003 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:30.769089 master-0 kubenswrapper[4842]: E1204 22:00:30.769018 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:30.769089 master-0 kubenswrapper[4842]: E1204 22:00:30.769054 4842 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:30.769176 master-0 kubenswrapper[4842]: I1204 22:00:30.769048 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:30.769229 master-0 kubenswrapper[4842]: E1204 22:00:30.769110 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.769095233 +0000 UTC m=+165.083907448 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:30.769229 master-0 kubenswrapper[4842]: E1204 22:00:30.769125 4842 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:30.769309 master-0 kubenswrapper[4842]: E1204 22:00:30.769248 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.769217407 +0000 UTC m=+165.084029602 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:30.769309 master-0 kubenswrapper[4842]: E1204 22:00:30.769170 4842 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:30.769309 master-0 kubenswrapper[4842]: E1204 22:00:30.769273 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.769260378 +0000 UTC m=+165.084072573 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:30.769439 master-0 kubenswrapper[4842]: E1204 22:00:30.769341 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:34.769321129 +0000 UTC m=+165.084133314 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:34.723186 master-0 kubenswrapper[4842]: I1204 22:00:34.722696 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:34.723186 master-0 kubenswrapper[4842]: I1204 22:00:34.723163 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:34.724437 master-0 kubenswrapper[4842]: E1204 22:00:34.722957 4842 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:34.724437 master-0 kubenswrapper[4842]: E1204 22:00:34.723479 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.72345385 +0000 UTC m=+173.038266025 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:34.724437 master-0 kubenswrapper[4842]: E1204 22:00:34.723298 4842 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:34.724437 master-0 kubenswrapper[4842]: E1204 22:00:34.723655 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.723614974 +0000 UTC m=+173.038427189 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:34.824128 master-0 kubenswrapper[4842]: I1204 22:00:34.824005 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:34.824459 master-0 kubenswrapper[4842]: E1204 22:00:34.824264 4842 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:34.824459 master-0 kubenswrapper[4842]: E1204 22:00:34.824382 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.82434685 +0000 UTC m=+173.139159065 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:34.824459 master-0 kubenswrapper[4842]: I1204 22:00:34.824444 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:34.824742 master-0 kubenswrapper[4842]: I1204 22:00:34.824558 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:34.825038 master-0 kubenswrapper[4842]: I1204 22:00:34.824968 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:34.825123 master-0 kubenswrapper[4842]: I1204 22:00:34.825065 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:34.825193 master-0 kubenswrapper[4842]: I1204 22:00:34.825126 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:34.825412 master-0 kubenswrapper[4842]: E1204 22:00:34.825354 4842 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:34.825481 master-0 kubenswrapper[4842]: E1204 22:00:34.825448 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.825421788 +0000 UTC m=+173.140234013 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:34.825647 master-0 kubenswrapper[4842]: E1204 22:00:34.825615 4842 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:34.825720 master-0 kubenswrapper[4842]: E1204 22:00:34.825675 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.825659715 +0000 UTC m=+173.140471930 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:34.827435 master-0 kubenswrapper[4842]: E1204 22:00:34.827354 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:34.827604 master-0 kubenswrapper[4842]: E1204 22:00:34.827457 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.827430782 +0000 UTC m=+173.142243007 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:34.827685 master-0 kubenswrapper[4842]: I1204 22:00:34.827646 4842 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:34.827888 master-0 kubenswrapper[4842]: E1204 22:00:34.827825 4842 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:34.827969 master-0 kubenswrapper[4842]: E1204 22:00:34.827918 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.827894924 +0000 UTC m=+173.142707149 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:34.828444 master-0 kubenswrapper[4842]: E1204 22:00:34.828391 4842 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:34.828566 master-0 kubenswrapper[4842]: E1204 22:00:34.828457 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.828440809 +0000 UTC m=+173.143253034 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:34.828652 master-0 kubenswrapper[4842]: E1204 22:00:34.828577 4842 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:34.828652 master-0 kubenswrapper[4842]: E1204 22:00:34.828634 4842 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.828618813 +0000 UTC m=+173.143431038 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:35.070906 master-0 systemd[1]: Stopping Kubernetes Kubelet... Dec 04 22:00:35.102971 master-0 systemd[1]: kubelet.service: Deactivated successfully. Dec 04 22:00:35.103546 master-0 systemd[1]: Stopped Kubernetes Kubelet. Dec 04 22:00:35.105643 master-0 systemd[1]: kubelet.service: Consumed 12.338s CPU time. Dec 04 22:00:35.133947 master-0 systemd[1]: Starting Kubernetes Kubelet... Dec 04 22:00:35.253302 master-0 kubenswrapper[8606]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 22:00:35.253302 master-0 kubenswrapper[8606]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 04 22:00:35.253302 master-0 kubenswrapper[8606]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 22:00:35.253302 master-0 kubenswrapper[8606]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 04 22:00:35.253302 master-0 kubenswrapper[8606]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 04 22:00:35.253302 master-0 kubenswrapper[8606]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 22:00:35.255070 master-0 kubenswrapper[8606]: I1204 22:00:35.253367 8606 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 04 22:00:35.257246 master-0 kubenswrapper[8606]: W1204 22:00:35.257205 8606 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 22:00:35.257246 master-0 kubenswrapper[8606]: W1204 22:00:35.257234 8606 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 22:00:35.257246 master-0 kubenswrapper[8606]: W1204 22:00:35.257243 8606 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 22:00:35.257246 master-0 kubenswrapper[8606]: W1204 22:00:35.257251 8606 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257266 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257273 8606 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257279 8606 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257285 8606 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257291 8606 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257296 8606 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257301 8606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257307 8606 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257312 8606 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257317 8606 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257322 8606 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257327 8606 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257332 8606 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257337 8606 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257342 8606 feature_gate.go:330] unrecognized 
feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257349 8606 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257354 8606 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257359 8606 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257364 8606 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 22:00:35.257367 master-0 kubenswrapper[8606]: W1204 22:00:35.257370 8606 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257377 8606 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257383 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257390 8606 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257399 8606 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257405 8606 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257412 8606 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257418 8606 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257423 8606 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257428 8606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257434 8606 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257448 8606 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257454 8606 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257461 8606 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257468 8606 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257474 8606 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257481 8606 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257488 8606 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 22:00:35.257938 master-0 kubenswrapper[8606]: W1204 22:00:35.257495 8606 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257527 8606 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257534 8606 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257541 8606 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257548 8606 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257555 8606 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257561 8606 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257568 8606 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257575 8606 feature_gate.go:330] unrecognized feature gate: Example Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257581 8606 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257588 8606 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257595 8606 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257602 8606 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257610 8606 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257616 8606 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257625 8606 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257634 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257641 8606 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257647 8606 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257653 8606 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 22:00:35.258361 master-0 kubenswrapper[8606]: W1204 22:00:35.257659 8606 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257665 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 
22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257671 8606 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257677 8606 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257683 8606 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257689 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257696 8606 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257702 8606 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257708 8606 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257714 8606 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: W1204 22:00:35.257719 8606 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257880 8606 flags.go:64] FLAG: --address="0.0.0.0" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257902 8606 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257914 8606 flags.go:64] FLAG: --anonymous-auth="true" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257922 8606 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257930 8606 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257937 8606 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257947 8606 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257957 8606 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257965 8606 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257973 8606 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 04 22:00:35.258938 master-0 kubenswrapper[8606]: I1204 22:00:35.257981 8606 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.257989 8606 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.257997 8606 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258005 8606 flags.go:64] FLAG: --cgroup-root="" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258011 8606 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258018 8606 flags.go:64] FLAG: --client-ca-file="" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: 
I1204 22:00:35.258025 8606 flags.go:64] FLAG: --cloud-config="" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258031 8606 flags.go:64] FLAG: --cloud-provider="" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258040 8606 flags.go:64] FLAG: --cluster-dns="[]" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258047 8606 flags.go:64] FLAG: --cluster-domain="" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258054 8606 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258062 8606 flags.go:64] FLAG: --config-dir="" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258069 8606 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258077 8606 flags.go:64] FLAG: --container-log-max-files="5" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258087 8606 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258097 8606 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258104 8606 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258112 8606 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258118 8606 flags.go:64] FLAG: --contention-profiling="false" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258125 8606 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258132 8606 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258140 8606 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258147 8606 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258156 8606 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258163 8606 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 04 22:00:35.259454 master-0 kubenswrapper[8606]: I1204 22:00:35.258170 8606 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258177 8606 flags.go:64] FLAG: --enable-load-reader="false" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258185 8606 flags.go:64] FLAG: --enable-server="true" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258192 8606 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258201 8606 flags.go:64] FLAG: --event-burst="100" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258209 8606 flags.go:64] FLAG: --event-qps="50" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258216 8606 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258223 8606 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258230 8606 
flags.go:64] FLAG: --eviction-hard="" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258239 8606 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258248 8606 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258256 8606 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258264 8606 flags.go:64] FLAG: --eviction-soft="" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258273 8606 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258281 8606 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258289 8606 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258296 8606 flags.go:64] FLAG: --experimental-mounter-path="" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258303 8606 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258311 8606 flags.go:64] FLAG: --fail-swap-on="true" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258319 8606 flags.go:64] FLAG: --feature-gates="" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258329 8606 flags.go:64] FLAG: --file-check-frequency="20s" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258337 8606 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258344 8606 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258352 8606 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258359 8606 flags.go:64] FLAG: --healthz-port="10248" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258369 8606 flags.go:64] FLAG: --help="false" Dec 04 22:00:35.260167 master-0 kubenswrapper[8606]: I1204 22:00:35.258377 8606 flags.go:64] FLAG: --hostname-override="" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258385 8606 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258393 8606 flags.go:64] FLAG: --http-check-frequency="20s" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258401 8606 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258409 8606 flags.go:64] FLAG: --image-credential-provider-config="" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258416 8606 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258424 8606 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258431 8606 flags.go:64] FLAG: --image-service-endpoint="" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258438 8606 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258446 8606 flags.go:64] FLAG: --kube-api-burst="100" Dec 04 
22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258453 8606 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258461 8606 flags.go:64] FLAG: --kube-api-qps="50" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258468 8606 flags.go:64] FLAG: --kube-reserved="" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258476 8606 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258482 8606 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258491 8606 flags.go:64] FLAG: --kubelet-cgroups="" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258521 8606 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258530 8606 flags.go:64] FLAG: --lock-file="" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258538 8606 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258546 8606 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258554 8606 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258566 8606 flags.go:64] FLAG: --log-json-split-stream="false" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258573 8606 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258581 8606 flags.go:64] FLAG: --log-text-split-stream="false" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258587 8606 flags.go:64] FLAG: --logging-format="text" Dec 04 22:00:35.260850 master-0 kubenswrapper[8606]: I1204 22:00:35.258595 8606 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258603 8606 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258610 8606 flags.go:64] FLAG: --manifest-url="" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258616 8606 flags.go:64] FLAG: --manifest-url-header="" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258628 8606 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258635 8606 flags.go:64] FLAG: --max-open-files="1000000" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258645 8606 flags.go:64] FLAG: --max-pods="110" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258652 8606 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258660 8606 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258667 8606 flags.go:64] FLAG: --memory-manager-policy="None" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258674 8606 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258681 8606 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 04 22:00:35.261520 master-0 
kubenswrapper[8606]: I1204 22:00:35.258688 8606 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258695 8606 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258712 8606 flags.go:64] FLAG: --node-status-max-images="50" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258719 8606 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258727 8606 flags.go:64] FLAG: --oom-score-adj="-999" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258734 8606 flags.go:64] FLAG: --pod-cidr="" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258741 8606 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258754 8606 flags.go:64] FLAG: --pod-manifest-path="" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258761 8606 flags.go:64] FLAG: --pod-max-pids="-1" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258768 8606 flags.go:64] FLAG: --pods-per-core="0" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258775 8606 flags.go:64] FLAG: --port="10250" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258783 8606 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 04 22:00:35.261520 master-0 kubenswrapper[8606]: I1204 22:00:35.258790 8606 flags.go:64] FLAG: --provider-id="" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258801 8606 flags.go:64] FLAG: --qos-reserved="" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258808 8606 flags.go:64] FLAG: --read-only-port="10255" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258816 8606 flags.go:64] FLAG: --register-node="true" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258823 8606 flags.go:64] FLAG: --register-schedulable="true" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258831 8606 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258844 8606 flags.go:64] FLAG: --registry-burst="10" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258852 8606 flags.go:64] FLAG: --registry-qps="5" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258859 8606 flags.go:64] FLAG: --reserved-cpus="" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258866 8606 flags.go:64] FLAG: --reserved-memory="" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258875 8606 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258882 8606 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258890 8606 flags.go:64] FLAG: --rotate-certificates="false" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258897 8606 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258904 8606 flags.go:64] FLAG: 
--runonce="false" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258911 8606 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258919 8606 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258928 8606 flags.go:64] FLAG: --seccomp-default="false" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258936 8606 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258944 8606 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258953 8606 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258961 8606 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258970 8606 flags.go:64] FLAG: --storage-driver-password="root" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258978 8606 flags.go:64] FLAG: --storage-driver-secure="false" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258986 8606 flags.go:64] FLAG: --storage-driver-table="stats" Dec 04 22:00:35.262108 master-0 kubenswrapper[8606]: I1204 22:00:35.258993 8606 flags.go:64] FLAG: --storage-driver-user="root" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259000 8606 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259007 8606 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259015 8606 flags.go:64] FLAG: --system-cgroups="" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259022 8606 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259033 8606 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259041 8606 flags.go:64] FLAG: --tls-cert-file="" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259047 8606 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259058 8606 flags.go:64] FLAG: --tls-min-version="" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259065 8606 flags.go:64] FLAG: --tls-private-key-file="" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259073 8606 flags.go:64] FLAG: --topology-manager-policy="none" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259080 8606 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259087 8606 flags.go:64] FLAG: --topology-manager-scope="container" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259095 8606 flags.go:64] FLAG: --v="2" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259105 8606 flags.go:64] FLAG: --version="false" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259115 8606 flags.go:64] FLAG: --vmodule="" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259124 8606 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 04 
22:00:35.262715 master-0 kubenswrapper[8606]: I1204 22:00:35.259131 8606 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: W1204 22:00:35.259314 8606 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: W1204 22:00:35.259323 8606 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: W1204 22:00:35.259330 8606 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: W1204 22:00:35.259339 8606 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: W1204 22:00:35.259347 8606 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 22:00:35.262715 master-0 kubenswrapper[8606]: W1204 22:00:35.259354 8606 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259362 8606 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259369 8606 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259377 8606 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259398 8606 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259404 8606 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259411 8606 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259417 8606 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259423 8606 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259429 8606 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259435 8606 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259442 8606 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259448 8606 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259453 8606 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259459 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259468 8606 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259474 8606 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 
22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259481 8606 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259487 8606 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259493 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 22:00:35.263296 master-0 kubenswrapper[8606]: W1204 22:00:35.259520 8606 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259529 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259534 8606 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259539 8606 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259545 8606 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259550 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259556 8606 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259563 8606 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259570 8606 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259576 8606 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259582 8606 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259587 8606 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259593 8606 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259598 8606 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259604 8606 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259610 8606 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259615 8606 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259621 8606 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259627 8606 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259632 8606 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 22:00:35.263768 master-0 kubenswrapper[8606]: W1204 22:00:35.259639 8606 
feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259645 8606 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259651 8606 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259657 8606 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259663 8606 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259668 8606 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259674 8606 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259679 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259685 8606 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259691 8606 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259696 8606 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259703 8606 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259710 8606 feature_gate.go:330] unrecognized feature gate: Example Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259716 8606 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259722 8606 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259727 8606 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259733 8606 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259738 8606 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259744 8606 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259749 8606 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 22:00:35.264236 master-0 kubenswrapper[8606]: W1204 22:00:35.259755 8606 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 22:00:35.264772 master-0 kubenswrapper[8606]: W1204 22:00:35.259761 8606 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 22:00:35.264772 master-0 kubenswrapper[8606]: W1204 22:00:35.259766 8606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 22:00:35.264772 master-0 kubenswrapper[8606]: W1204 22:00:35.259772 8606 feature_gate.go:330] unrecognized feature gate: 
AzureWorkloadIdentity Dec 04 22:00:35.264772 master-0 kubenswrapper[8606]: W1204 22:00:35.259777 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 22:00:35.264772 master-0 kubenswrapper[8606]: W1204 22:00:35.259783 8606 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 22:00:35.264772 master-0 kubenswrapper[8606]: W1204 22:00:35.259788 8606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 22:00:35.264772 master-0 kubenswrapper[8606]: I1204 22:00:35.259807 8606 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 22:00:35.273469 master-0 kubenswrapper[8606]: I1204 22:00:35.273393 8606 server.go:491] "Kubelet version" kubeletVersion="v1.31.13" Dec 04 22:00:35.273469 master-0 kubenswrapper[8606]: I1204 22:00:35.273467 8606 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 04 22:00:35.273876 master-0 kubenswrapper[8606]: W1204 22:00:35.273834 8606 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 22:00:35.273876 master-0 kubenswrapper[8606]: W1204 22:00:35.273869 8606 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 22:00:35.273933 master-0 kubenswrapper[8606]: W1204 22:00:35.273886 8606 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 22:00:35.273933 master-0 kubenswrapper[8606]: W1204 22:00:35.273903 8606 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 22:00:35.273933 master-0 kubenswrapper[8606]: W1204 22:00:35.273914 8606 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 22:00:35.273933 master-0 kubenswrapper[8606]: W1204 22:00:35.273925 8606 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 22:00:35.273933 master-0 kubenswrapper[8606]: W1204 22:00:35.273935 8606 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.273947 8606 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.273958 8606 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.273969 8606 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.273979 8606 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.273993 8606 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
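In the restart above, the kubelet dumps every command-line flag through flags.go:64 (one FLAG: --name="value" entry per flag) and, once the gate list has been re-evaluated, logs the resolved set through feature_gate.go:386 (feature gates: {map[CloudDualStackNodeIPs:true ... ValidatingAdmissionPolicy:true ...]}). A small sketch, again assuming the journal has been saved to kubelet.log (hypothetical path), that turns both dumps into dictionaries so flags and resolved gates are easier to inspect:

```python
import re

FLAG_RE = re.compile(r'flags\.go:64\] FLAG: (--[\w-]+)="(.*?)"')
GATES_RE = re.compile(r"feature gates: \{map\[(.*?)\]\}")

def parse_flags(text):
    """Map each flag logged via flags.go:64 to its quoted value."""
    return {name: value for name, value in FLAG_RE.findall(text)}

def parse_feature_gates(text):
    """Parse the feature_gate.go:386 'feature gates: {map[...]}' summary."""
    gates = {}
    match = GATES_RE.search(text)
    if match:
        for pair in match.group(1).split():
            name, _, value = pair.partition(":")
            gates[name] = (value == "true")
    return gates

if __name__ == "__main__":
    with open("kubelet.log", encoding="utf-8") as fh:  # hypothetical path
        text = fh.read()
    flags = parse_flags(text)
    print(flags.get("--node-ip"), flags.get("--register-with-taints"))
    print(sorted(name for name, on in parse_feature_gates(text).items() if on))
```

Comparing the two dictionaries between restarts (PID 4842 vs. 8606 here) is a quick way to check whether a restart changed the kubelet's effective configuration.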
Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.274012 8606 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.274024 8606 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.274034 8606 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.274044 8606 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 22:00:35.274052 master-0 kubenswrapper[8606]: W1204 22:00:35.274054 8606 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274066 8606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274079 8606 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274089 8606 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274099 8606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274110 8606 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274119 8606 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274129 8606 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274140 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274150 8606 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274159 8606 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274170 8606 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274179 8606 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274190 8606 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274200 8606 feature_gate.go:330] unrecognized feature gate: Example Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274209 8606 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274219 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274229 8606 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274258 8606 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274269 8606 feature_gate.go:330] unrecognized 
feature gate: VSphereDriverConfiguration Dec 04 22:00:35.274281 master-0 kubenswrapper[8606]: W1204 22:00:35.274279 8606 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274290 8606 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274301 8606 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274313 8606 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274323 8606 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274334 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274343 8606 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274354 8606 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274363 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274374 8606 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274385 8606 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274395 8606 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274405 8606 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274415 8606 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274425 8606 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274435 8606 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274445 8606 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274453 8606 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274464 8606 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274474 8606 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 22:00:35.274789 master-0 kubenswrapper[8606]: W1204 22:00:35.274483 8606 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274493 8606 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274528 8606 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274537 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274546 8606 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274554 8606 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274562 8606 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274569 8606 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274577 8606 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274585 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274593 8606 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274602 8606 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274614 8606 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274624 8606 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274646 8606 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 22:00:35.275726 master-0 kubenswrapper[8606]: W1204 22:00:35.274655 8606 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: I1204 22:00:35.274669 8606 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.274960 8606 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.274976 8606 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.274986 8606 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.274994 8606 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275002 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275010 8606 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275020 8606 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275028 8606 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275037 8606 feature_gate.go:330] unrecognized feature gate: Example Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275045 8606 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275053 8606 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275061 8606 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275068 8606 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 22:00:35.276110 master-0 kubenswrapper[8606]: W1204 22:00:35.275076 8606 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275084 8606 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275092 8606 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275099 8606 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275108 8606 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275116 8606 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275123 8606 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275131 8606 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275139 8606 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275147 8606 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275156 8606 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275165 8606 
feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275174 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275182 8606 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275190 8606 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275199 8606 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275206 8606 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275214 8606 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275221 8606 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275230 8606 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 22:00:35.276539 master-0 kubenswrapper[8606]: W1204 22:00:35.275248 8606 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275257 8606 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275265 8606 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275272 8606 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275280 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275288 8606 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275296 8606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275304 8606 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275312 8606 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275319 8606 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275327 8606 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275335 8606 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275343 8606 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275353 8606 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275363 8606 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275372 8606 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275381 8606 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275389 8606 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275398 8606 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275406 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 22:00:35.276998 master-0 kubenswrapper[8606]: W1204 22:00:35.275414 8606 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275422 8606 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275430 8606 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275439 8606 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275447 8606 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275456 8606 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275465 8606 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275474 8606 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275482 8606 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275490 8606 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275498 8606 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275530 8606 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275541 8606 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275551 8606 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275561 8606 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275569 8606 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275589 8606 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275598 8606 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 22:00:35.277772 master-0 kubenswrapper[8606]: W1204 22:00:35.275606 8606 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 22:00:35.278406 master-0 kubenswrapper[8606]: I1204 22:00:35.275619 8606 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 22:00:35.279148 master-0 kubenswrapper[8606]: I1204 22:00:35.279086 8606 server.go:940] "Client rotation is on, will bootstrap in background" Dec 04 22:00:35.284121 master-0 kubenswrapper[8606]: I1204 22:00:35.284080 8606 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 04 22:00:35.284283 master-0 kubenswrapper[8606]: I1204 22:00:35.284252 8606 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 04 22:00:35.285178 master-0 kubenswrapper[8606]: I1204 22:00:35.285149 8606 server.go:997] "Starting client certificate rotation" Dec 04 22:00:35.285178 master-0 kubenswrapper[8606]: I1204 22:00:35.285173 8606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 04 22:00:35.285678 master-0 kubenswrapper[8606]: I1204 22:00:35.285341 8606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-05 21:50:08 +0000 UTC, rotation deadline is 2025-12-05 14:44:50.203877795 +0000 UTC Dec 04 22:00:35.285678 master-0 kubenswrapper[8606]: I1204 22:00:35.285668 8606 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 16h44m14.918212779s for next certificate rotation Dec 04 22:00:35.287542 master-0 kubenswrapper[8606]: I1204 22:00:35.287491 8606 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 22:00:35.290882 master-0 kubenswrapper[8606]: I1204 22:00:35.290845 8606 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 22:00:35.295681 master-0 kubenswrapper[8606]: I1204 22:00:35.295623 8606 log.go:25] "Validated CRI v1 runtime API" Dec 04 22:00:35.299519 master-0 kubenswrapper[8606]: I1204 22:00:35.299477 8606 log.go:25] "Validated CRI v1 image API" Dec 04 22:00:35.301365 master-0 kubenswrapper[8606]: I1204 22:00:35.301314 8606 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 04 22:00:35.306167 master-0 kubenswrapper[8606]: I1204 22:00:35.306108 8606 fs.go:135] Filesystem UUIDs: map[4c52ad11-dbba-45ec-8a7c-4164b2d3de92:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 04 22:00:35.306561 master-0 kubenswrapper[8606]: I1204 22:00:35.306155 8606 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0/userdata/shm major:0 minor:341 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3aa682501427a1a26306dd6e6ffbe29276935fb92a5916c957736c383157a162/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3aa682501427a1a26306dd6e6ffbe29276935fb92a5916c957736c383157a162/userdata/shm major:0 minor:132 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/4c79666e90f7715124e93544d73c2c9b1066ea79ae6d56c7b16c532bd0846566/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4c79666e90f7715124e93544d73c2c9b1066ea79ae6d56c7b16c532bd0846566/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e/userdata/shm major:0 minor:316 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde/userdata/shm major:0 minor:335 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f/userdata/shm major:0 minor:314 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878/userdata/shm major:0 minor:323 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b/userdata/shm major:0 minor:330 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a/userdata/shm major:0 minor:327 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376/userdata/shm major:0 minor:333 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b/userdata/shm major:0 minor:319 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c382efd8856765d7e3a7c1f5148c4c397e023bc0d7fa9282b9bc8277f0af2687/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c382efd8856765d7e3a7c1f5148c4c397e023bc0d7fa9282b9bc8277f0af2687/userdata/shm major:0 minor:160 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d20bd8f7df5ba066210630b0736a496814188135f419c1b214059e7501c8fbf9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d20bd8f7df5ba066210630b0736a496814188135f419c1b214059e7501c8fbf9/userdata/shm major:0 minor:187 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b/userdata/shm major:0 minor:161 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc/userdata/shm major:0 minor:143 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223/userdata/shm major:0 minor:337 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2/userdata/shm major:0 minor:322 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0/userdata/shm major:0 minor:339 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~projected/kube-api-access-dvrr5:{mountpoint:/var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~projected/kube-api-access-dvrr5 major:0 minor:311 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~projected/kube-api-access major:0 minor:329 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~secret/serving-cert major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:308 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/kube-api-access-m4cct:{mountpoint:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/kube-api-access-m4cct major:0 minor:321 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~projected/kube-api-access-9wh6b:{mountpoint:/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~projected/kube-api-access-9wh6b major:0 minor:157 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:156 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~projected/kube-api-access-jwk6f:{mountpoint:/var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~projected/kube-api-access-jwk6f major:0 minor:291 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~secret/serving-cert major:0 minor:288 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~projected/kube-api-access-4mttq:{mountpoint:/var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~projected/kube-api-access-4mttq major:0 minor:293 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:286 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~projected/kube-api-access-g5nkh:{mountpoint:/var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~projected/kube-api-access-g5nkh major:0 minor:306 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~projected/kube-api-access major:0 minor:312 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~secret/serving-cert major:0 minor:295 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~projected/kube-api-access-d4rft:{mountpoint:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~projected/kube-api-access-d4rft major:0 minor:159 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:158 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf/volumes/kubernetes.io~projected/kube-api-access-8wqqt:{mountpoint:/var/lib/kubelet/pods/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf/volumes/kubernetes.io~projected/kube-api-access-8wqqt major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~projected/kube-api-access-xgt75:{mountpoint:/var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~projected/kube-api-access-xgt75 major:0 minor:186 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~secret/webhook-cert major:0 minor:185 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~projected/kube-api-access-wsxkk:{mountpoint:/var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~projected/kube-api-access-wsxkk major:0 minor:325 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~secret/serving-cert major:0 minor:297 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6c8c45e0-2342-499b-aa6b-339b6a722a87/volumes/kubernetes.io~projected/kube-api-access-gcgg9:{mountpoint:/var/lib/kubelet/pods/6c8c45e0-2342-499b-aa6b-339b6a722a87/volumes/kubernetes.io~projected/kube-api-access-gcgg9 major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/74b7c644-ad97-4009-aac7-550edabc55ae/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/74b7c644-ad97-4009-aac7-550edabc55ae/volumes/kubernetes.io~projected/kube-api-access major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76fd9f44-4365-4271-8772-025655c50334/volumes/kubernetes.io~projected/kube-api-access-9j8fr:{mountpoint:/var/lib/kubelet/pods/76fd9f44-4365-4271-8772-025655c50334/volumes/kubernetes.io~projected/kube-api-access-9j8fr major:0 minor:142 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~projected/kube-api-access-s5t2f:{mountpoint:/var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~projected/kube-api-access-s5t2f major:0 minor:310 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~secret/serving-cert major:0 minor:300 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~projected/kube-api-access-8w592:{mountpoint:/var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~projected/kube-api-access-8w592 major:0 minor:294 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~projected/kube-api-access-lclkg:{mountpoint:/var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~projected/kube-api-access-lclkg major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~secret/metrics-tls major:0 minor:124 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~projected/kube-api-access-scht6:{mountpoint:/var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~projected/kube-api-access-scht6 major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~secret/serving-cert major:0 minor:302 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:332 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/kube-api-access-lr65l:{mountpoint:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/kube-api-access-lr65l major:0 minor:317 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~projected/kube-api-access-bfklr:{mountpoint:/var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~projected/kube-api-access-bfklr major:0 minor:289 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~projected/kube-api-access-c7l9n:{mountpoint:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~projected/kube-api-access-c7l9n major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/etcd-client major:0 minor:296 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/serving-cert major:0 minor:298 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c/volumes/kubernetes.io~projected/kube-api-access-w2ndk:{mountpoint:/var/lib/kubelet/pods/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c/volumes/kubernetes.io~projected/kube-api-access-w2ndk major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~projected/kube-api-access-kcw8f:{mountpoint:/var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~projected/kube-api-access-kcw8f major:0 minor:305 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~projected/kube-api-access major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~secret/serving-cert major:0 minor:301 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e37d318a-5bf8-46ed-b6de-494102738da7/volumes/kubernetes.io~projected/kube-api-access-r57bb:{mountpoint:/var/lib/kubelet/pods/e37d318a-5bf8-46ed-b6de-494102738da7/volumes/kubernetes.io~projected/kube-api-access-r57bb major:0 minor:292 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~projected/kube-api-access-xdbpk:{mountpoint:/var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~projected/kube-api-access-xdbpk major:0 minor:153 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~projected/kube-api-access-g8d54:{mountpoint:/var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~projected/kube-api-access-g8d54 major:0 minor:290 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~secret/serving-cert major:0 minor:287 fsType:tmpfs blockSize:0} overlay_0-112:{mountpoint:/var/lib/containers/storage/overlay/12c66da310ba26e73f9c875a128dcd3afb590642a222c35d01477c6d6dd937f0/merged major:0 minor:112 fsType:overlay blockSize:0} overlay_0-120:{mountpoint:/var/lib/containers/storage/overlay/079f0e817f091493dfb34fea597de725f86259ae9132bd64a8d0ddf6504f636a/merged major:0 minor:120 fsType:overlay blockSize:0} overlay_0-130:{mountpoint:/var/lib/containers/storage/overlay/dfc8d550ce331c21a66175be38544deb4f5250e9dd2d631d18072f3cd4ff37ed/merged major:0 minor:130 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/143be072f31494cac37a0c0b482054aad894eef15bafddbe1c92739645139696/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/5867eae42264c04151f652cbc01027b0a3b8b60aa7855ad52523b8caec306ea8/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/93aa984adf9326a73e761b6329bf2013fea18d93fc3a7f2f7bbe6e30793efe7c/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/6a750204a72e913f18caba4b856d100251f6f3b60ce4e3a2061bbd9a12e505b0/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-145:{mountpoint:/var/lib/containers/storage/overlay/4134eea505e316ad2ea5386b1510474205d0e5bc303592975dcff1ea4f042607/merged major:0 minor:145 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/178dc1361ce0116e7e7561760bf9a7aadea81b2e34c94af6e28c1cc88da51034/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-163:{mountpoint:/var/lib/containers/storage/overlay/7025889c9756e7b7a56bc2b62ce9f6ce434447c46280b1dd16b16f8fab5a0b9e/merged major:0 minor:163 fsType:overlay blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/e036da790c16d99c571d0ccacf07ff209a590ded05c2c88c351cda4fc6c1f718/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-168:{mountpoint:/var/lib/containers/storage/overlay/52f4840dd9e5a899a3ee79d39704b79e4edb54be50c17350b49fc9809c4f3095/merged major:0 minor:168 fsType:overlay blockSize:0} overlay_0-170:{mountpoint:/var/lib/containers/storage/overlay/4f133cacce459fa2f412e70fcc6adc4ae73461c58f7611fc17293fe1f6dc2a02/merged major:0 minor:170 fsType:overlay blockSize:0} overlay_0-172:{mountpoint:/var/lib/containers/storage/overlay/4b66f1eecd742a97bdf220db36e499d95d58a7c91fe0450b37cb9839d96cac49/merged major:0 minor:172 fsType:overlay blockSize:0} overlay_0-183:{mountpoint:/var/lib/containers/storage/overlay/b6fb4a3972573514ee7adda30080134ced7360f824574c3beb772b2d5bf0b2b2/merged major:0 minor:183 fsType:overlay blockSize:0} 
overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/1de4e36a9fb8c54e483655ce1d9fad13a355fdba43f22f36d87b6496502b064b/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-191:{mountpoint:/var/lib/containers/storage/overlay/238453336d53163b6ad011610393a602302770663bb4fba04782bb4959247937/merged major:0 minor:191 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/b20a4a76220f680e5e829ef9a31223a016d69fe44ad6bc492f5bd6a647847c6c/merged major:0 minor:193 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/a7a6021af2778fef456d4a4a1ea72221b1eab4b1e1643207ce74b5d520c56aa9/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-197:{mountpoint:/var/lib/containers/storage/overlay/fd10eb9c73111aabfa99eb8d934c0c065407fba0f614e0b3789d2dc7145a9d61/merged major:0 minor:197 fsType:overlay blockSize:0} overlay_0-205:{mountpoint:/var/lib/containers/storage/overlay/321d620025f8336ddb34a44f22618c0bf228dbd71a420f2dda16297ec37d3ac7/merged major:0 minor:205 fsType:overlay blockSize:0} overlay_0-210:{mountpoint:/var/lib/containers/storage/overlay/6be76a70bd52cc12c4718789a3ac9a35bf7cf545aebb5dca42f534ffcca2d181/merged major:0 minor:210 fsType:overlay blockSize:0} overlay_0-216:{mountpoint:/var/lib/containers/storage/overlay/9901c0e30efb35c00269bb5de9842fde7d85b7f5e7ad5463c6f885ef6639efc9/merged major:0 minor:216 fsType:overlay blockSize:0} overlay_0-218:{mountpoint:/var/lib/containers/storage/overlay/66dfb4a6f65f3d235dc90f3c9c1e7af4874f21d37c8003abcc6dfbb7ccfac93c/merged major:0 minor:218 fsType:overlay blockSize:0} overlay_0-223:{mountpoint:/var/lib/containers/storage/overlay/ec032cec1ba6d94e756a8e7f070c14cb169f19285e4bcc88bb06948ded909ab9/merged major:0 minor:223 fsType:overlay blockSize:0} overlay_0-236:{mountpoint:/var/lib/containers/storage/overlay/8ee3997f740be65c4b819200c7009fd86111958c3b4022066357c77de2b63f4a/merged major:0 minor:236 fsType:overlay blockSize:0} overlay_0-244:{mountpoint:/var/lib/containers/storage/overlay/47caf304b739434b5c3d68259af6b1aaad2d29c64ef44138fcad186441a604d7/merged major:0 minor:244 fsType:overlay blockSize:0} overlay_0-252:{mountpoint:/var/lib/containers/storage/overlay/d6108c34ebd84742ab5d7650dddcb2dc457dbcd66fee7f0a58192a4608223934/merged major:0 minor:252 fsType:overlay blockSize:0} overlay_0-260:{mountpoint:/var/lib/containers/storage/overlay/7fab26e146b5809a5f8981fc47152e23ef8d023f7a51bcc3c07fe2aa2e99d0c9/merged major:0 minor:260 fsType:overlay blockSize:0} overlay_0-268:{mountpoint:/var/lib/containers/storage/overlay/c23e7a1856b497dace33ddd9584837bc5600a59437b9a5195174d18d488c48a5/merged major:0 minor:268 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/c0b3c8faa570433b6642ccd89196235f72b302109a37392b86e6c83e5adcc070/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/91c5135f53c8f2a07ee8b61d091d11a1e72e8d9012f5db0e4ac43d8e0915efee/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-343:{mountpoint:/var/lib/containers/storage/overlay/48b4ffa86c676051747d6e369a64026c0c3e8b0a4b99e9cd95ca257b351ffa14/merged major:0 minor:343 fsType:overlay blockSize:0} overlay_0-345:{mountpoint:/var/lib/containers/storage/overlay/1da5ad97261b9dd9c25181207ab61a55e69fd0f79f5db7fb45cd005231e69f8a/merged major:0 minor:345 fsType:overlay blockSize:0} overlay_0-347:{mountpoint:/var/lib/containers/storage/overlay/2368206871b6a8dd13aecc109feee0e66f00e7f886283538b4e9933be0783c95/merged 
major:0 minor:347 fsType:overlay blockSize:0} overlay_0-349:{mountpoint:/var/lib/containers/storage/overlay/45ecf6be646a849cec4e2e13cee338ae315048eb379cf33cefd82c689039a2b6/merged major:0 minor:349 fsType:overlay blockSize:0} overlay_0-351:{mountpoint:/var/lib/containers/storage/overlay/6157af232b538c0bdf0224963d07c09c58591db64e7eb7a50883b2cc1ab890b8/merged major:0 minor:351 fsType:overlay blockSize:0} overlay_0-353:{mountpoint:/var/lib/containers/storage/overlay/31f62a2f9b893e038b35620b3fc1a20d382cc0218a5e3bec0ad06a4453546362/merged major:0 minor:353 fsType:overlay blockSize:0} overlay_0-355:{mountpoint:/var/lib/containers/storage/overlay/e6fb70dc932059572abf4d9473fb40d052e35d40dc52d23049d085da25e6da8b/merged major:0 minor:355 fsType:overlay blockSize:0} overlay_0-357:{mountpoint:/var/lib/containers/storage/overlay/5d5425bd31c7ed35089c40dc8b77f1477b7b549064c42d43c048306900105c1f/merged major:0 minor:357 fsType:overlay blockSize:0} overlay_0-359:{mountpoint:/var/lib/containers/storage/overlay/1379fb856aa57ed4051cd51b76f6881b79fab88039881964945e677e20e73353/merged major:0 minor:359 fsType:overlay blockSize:0} overlay_0-361:{mountpoint:/var/lib/containers/storage/overlay/8afc0a3f6e0d2b9bb420a233e1ea5e3e1ae2de777b0fb0a9e6d003a54a72fc61/merged major:0 minor:361 fsType:overlay blockSize:0} overlay_0-363:{mountpoint:/var/lib/containers/storage/overlay/535c981b56b7b8ef1da0d3fd2beedc778545473359fc06161f1fbf1966957626/merged major:0 minor:363 fsType:overlay blockSize:0} overlay_0-365:{mountpoint:/var/lib/containers/storage/overlay/028f9f1952e5e1a2dcee26a80c14ba041f08a22223ef6923434bb8be210d0ed2/merged major:0 minor:365 fsType:overlay blockSize:0} overlay_0-367:{mountpoint:/var/lib/containers/storage/overlay/f324571e919c8ae087a65d30ed2ca7127f89a378fc713bb4f9a151d5c34da7ee/merged major:0 minor:367 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/9a5f215847a4d1d24951a091674bdf5602f9b6a85ef6127011ed8fbe88a3d7fc/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/01861b06c9e0ce1d43fd738cf99f5cc9b5558174e1efbdab6d21a2d9df386b29/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/351dd05d74b0831cb80765d78f4847cd1bd9ebf00d58f7950985f9ee7fde124f/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/d19935918215d6b7995c922f92d1172dbffa746e9595a2e54a7e0147e2ab2848/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/58c25130541163a6c5515c207203947974565035ce0395cb4dcb00f323e2757f/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/7402445059ddb74cf9554f7d02fca9ae6790f97fbafae5bab9690f419fe0897d/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/a1ff79be2a9af4f05c318a91bea09228927657d6ae5919f810e08bab8d6c4ad0/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/84d0f272a3e1735f6a4afde82153542df2d99f2046b895aced8012ff1fb49a09/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/789d117fbfb8a4f1010c8e6cedfcc85a45cb9533df7cf66cc25a3379b06223af/merged major:0 minor:69 fsType:overlay blockSize:0} 
overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/81a28e35de7dec163d3ed4d2127f38abc7fcbebb0792e6cddfbf4b84f8a4d1eb/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/57336447126c1fbbf7930be558ed2633b8c4fc9c17489fdcc678aa77c567e289/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-80:{mountpoint:/var/lib/containers/storage/overlay/e9fb898dcf4350a55755337705dbfe9eb94c5cc7cbb09d40dd8af597afab23cd/merged major:0 minor:80 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/8d153b36651c1d26d4b300c6b2979ed166c81aa4a1d98f5436844d98bc4e7d33/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-87:{mountpoint:/var/lib/containers/storage/overlay/6e60effa88818591e383529d367636315fc24310b3acb737a77a9bbd3c335e1d/merged major:0 minor:87 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/a5e743c21a7b6b1f8c9df2a4c5adfe047dfae0891fdf9858cf2216520ea76f9d/merged major:0 minor:89 fsType:overlay blockSize:0}] Dec 04 22:00:35.338101 master-0 kubenswrapper[8606]: I1204 22:00:35.336604 8606 manager.go:217] Machine: {Timestamp:2025-12-04 22:00:35.335006487 +0000 UTC m=+0.145308722 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:58e57637271046a9a49cd83dda54d0eb SystemUUID:58e57637-2710-46a9-a49c-d83dda54d0eb BootID:4d17516d-34b9-4c3d-aaa6-c745ecd06d22 Filesystems:[{Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~projected/kube-api-access-g8d54 DeviceMajor:0 DeviceMinor:290 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~projected/kube-api-access-8w592 DeviceMajor:0 DeviceMinor:294 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-80 DeviceMajor:0 DeviceMinor:80 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3aa682501427a1a26306dd6e6ffbe29276935fb92a5916c957736c383157a162/userdata/shm DeviceMajor:0 DeviceMinor:132 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/76fd9f44-4365-4271-8772-025655c50334/volumes/kubernetes.io~projected/kube-api-access-9j8fr DeviceMajor:0 DeviceMinor:142 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-236 DeviceMajor:0 DeviceMinor:236 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-355 DeviceMajor:0 DeviceMinor:355 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:185 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-197 DeviceMajor:0 DeviceMinor:197 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~projected/kube-api-access-c7l9n DeviceMajor:0 DeviceMinor:307 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-52 
DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d20bd8f7df5ba066210630b0736a496814188135f419c1b214059e7501c8fbf9/userdata/shm DeviceMajor:0 DeviceMinor:187 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b/userdata/shm DeviceMajor:0 DeviceMinor:161 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-252 DeviceMajor:0 DeviceMinor:252 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf/volumes/kubernetes.io~projected/kube-api-access-8wqqt DeviceMajor:0 DeviceMinor:303 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e/userdata/shm DeviceMajor:0 DeviceMinor:316 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-216 DeviceMajor:0 DeviceMinor:216 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f/userdata/shm DeviceMajor:0 DeviceMinor:314 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376/userdata/shm DeviceMajor:0 DeviceMinor:333 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~projected/kube-api-access-jwk6f DeviceMajor:0 DeviceMinor:291 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-168 DeviceMajor:0 DeviceMinor:168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6c8c45e0-2342-499b-aa6b-339b6a722a87/volumes/kubernetes.io~projected/kube-api-access-gcgg9 DeviceMajor:0 DeviceMinor:126 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~projected/kube-api-access-xdbpk DeviceMajor:0 DeviceMinor:153 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:312 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2/userdata/shm DeviceMajor:0 DeviceMinor:322 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 
HasInodes:true} {Device:overlay_0-112 DeviceMajor:0 DeviceMinor:112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:302 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-183 DeviceMajor:0 DeviceMinor:183 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:332 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/74b7c644-ad97-4009-aac7-550edabc55ae/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:125 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~projected/kube-api-access-dvrr5 DeviceMajor:0 DeviceMinor:311 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/kube-api-access-lr65l DeviceMajor:0 DeviceMinor:317 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c382efd8856765d7e3a7c1f5148c4c397e023bc0d7fa9282b9bc8277f0af2687/userdata/shm DeviceMajor:0 DeviceMinor:160 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-210 DeviceMajor:0 DeviceMinor:210 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~projected/kube-api-access-g5nkh DeviceMajor:0 DeviceMinor:306 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-87 DeviceMajor:0 DeviceMinor:87 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:286 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-145 DeviceMajor:0 DeviceMinor:145 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:287 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e37d318a-5bf8-46ed-b6de-494102738da7/volumes/kubernetes.io~projected/kube-api-access-r57bb DeviceMajor:0 DeviceMinor:292 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-172 DeviceMajor:0 DeviceMinor:172 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-349 DeviceMajor:0 DeviceMinor:349 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-268 DeviceMajor:0 DeviceMinor:268 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:297 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~projected/kube-api-access-s5t2f DeviceMajor:0 DeviceMinor:310 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b/userdata/shm DeviceMajor:0 DeviceMinor:319 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4c79666e90f7715124e93544d73c2c9b1066ea79ae6d56c7b16c532bd0846566/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-244 DeviceMajor:0 DeviceMinor:244 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:288 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:296 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:301 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-365 DeviceMajor:0 DeviceMinor:365 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-120 DeviceMajor:0 DeviceMinor:120 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc/userdata/shm DeviceMajor:0 DeviceMinor:143 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:156 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:299 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/kube-api-access-m4cct DeviceMajor:0 DeviceMinor:321 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0/userdata/shm DeviceMajor:0 DeviceMinor:339 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-130 DeviceMajor:0 DeviceMinor:130 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-170 DeviceMajor:0 DeviceMinor:170 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-163 DeviceMajor:0 DeviceMinor:163 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-218 DeviceMajor:0 DeviceMinor:218 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-260 DeviceMajor:0 DeviceMinor:260 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-205 DeviceMajor:0 DeviceMinor:205 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c/volumes/kubernetes.io~projected/kube-api-access-w2ndk DeviceMajor:0 DeviceMinor:304 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde/userdata/shm DeviceMajor:0 DeviceMinor:335 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-191 DeviceMajor:0 DeviceMinor:191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:308 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~projected/kube-api-access-scht6 DeviceMajor:0 DeviceMinor:309 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878/userdata/shm DeviceMajor:0 DeviceMinor:323 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a/userdata/shm DeviceMajor:0 DeviceMinor:327 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:329 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:158 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:300 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-359 DeviceMajor:0 DeviceMinor:359 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-363 DeviceMajor:0 DeviceMinor:363 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-361 DeviceMajor:0 DeviceMinor:361 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:124 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~projected/kube-api-access-9wh6b DeviceMajor:0 DeviceMinor:157 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~projected/kube-api-access-d4rft DeviceMajor:0 DeviceMinor:159 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~projected/kube-api-access-4mttq DeviceMajor:0 DeviceMinor:293 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~projected/kube-api-access-kcw8f DeviceMajor:0 DeviceMinor:305 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0/userdata/shm DeviceMajor:0 DeviceMinor:341 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~projected/kube-api-access-bfklr DeviceMajor:0 DeviceMinor:289 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:298 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223/userdata/shm DeviceMajor:0 DeviceMinor:337 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-347 DeviceMajor:0 DeviceMinor:347 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-345 DeviceMajor:0 DeviceMinor:345 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~projected/kube-api-access-lclkg DeviceMajor:0 DeviceMinor:127 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~projected/kube-api-access-xgt75 DeviceMajor:0 DeviceMinor:186 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:313 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-351 DeviceMajor:0 DeviceMinor:351 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-367 DeviceMajor:0 DeviceMinor:367 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-223 DeviceMajor:0 DeviceMinor:223 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-353 DeviceMajor:0 DeviceMinor:353 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:295 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~projected/kube-api-access-wsxkk DeviceMajor:0 DeviceMinor:325 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-343 DeviceMajor:0 DeviceMinor:343 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b/userdata/shm DeviceMajor:0 DeviceMinor:330 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-357 DeviceMajor:0 DeviceMinor:357 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:51d6ac9cf90d7c8 MacAddress:26:5c:e7:61:a9:42 Speed:10000 Mtu:8900} {Name:53415730f490fe2 MacAddress:e2:f4:f7:19:81:55 Speed:10000 Mtu:8900} {Name:58c253cb55a2596 MacAddress:22:61:bb:31:8a:a0 Speed:10000 Mtu:8900} {Name:66619c69e4b8475 MacAddress:5e:d1:6c:51:a3:fb Speed:10000 Mtu:8900} {Name:6bf343216707410 MacAddress:46:e7:ec:ee:39:79 Speed:10000 Mtu:8900} {Name:969ff9f89143902 MacAddress:56:fd:2b:03:22:7e Speed:10000 Mtu:8900} {Name:aa68f9d56263db1 MacAddress:fe:de:75:bb:75:ec Speed:10000 Mtu:8900} {Name:ab0050370c98df5 MacAddress:ee:f1:e4:ef:79:b8 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} 
{Name:br-int MacAddress:12:f7:6c:8f:54:65 Speed:0 Mtu:8900} {Name:e18ea7a7e8b99e9 MacAddress:86:ec:6f:fd:b3:7d Speed:10000 Mtu:8900} {Name:ea7c4bd82fb1342 MacAddress:6e:9c:b3:97:f2:28 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:65:18:02 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:40:5b:43 Speed:-1 Mtu:9000} {Name:faca5825f225200 MacAddress:6e:35:7a:fe:5b:ea Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:c6:7e:e9:15:bf:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] 
Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 04 22:00:35.338101 master-0 kubenswrapper[8606]: I1204 22:00:35.338084 8606 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 04 22:00:35.338577 master-0 kubenswrapper[8606]: I1204 22:00:35.338459 8606 manager.go:233] Version: {KernelVersion:5.14.0-427.100.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511170715-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 04 22:00:35.338835 master-0 kubenswrapper[8606]: I1204 22:00:35.338808 8606 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 04 22:00:35.339132 master-0 kubenswrapper[8606]: I1204 22:00:35.339077 8606 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 04 22:00:35.339402 master-0 kubenswrapper[8606]: I1204 22:00:35.339126 8606 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 04 22:00:35.339593 master-0 kubenswrapper[8606]: I1204 22:00:35.339424 8606 topology_manager.go:138] "Creating topology manager with none policy" Dec 04 22:00:35.339593 master-0 kubenswrapper[8606]: I1204 22:00:35.339440 8606 container_manager_linux.go:303] "Creating device plugin manager" Dec 04 22:00:35.339593 master-0 kubenswrapper[8606]: I1204 22:00:35.339452 8606 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 22:00:35.339593 master-0 kubenswrapper[8606]: I1204 22:00:35.339479 8606 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 22:00:35.339729 master-0 kubenswrapper[8606]: I1204 22:00:35.339711 8606 state_mem.go:36] "Initialized new in-memory state store" Dec 04 22:00:35.339885 master-0 kubenswrapper[8606]: I1204 22:00:35.339834 8606 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 04 22:00:35.339954 master-0 kubenswrapper[8606]: I1204 22:00:35.339917 8606 kubelet.go:418] "Attempting to sync node with API server" Dec 04 22:00:35.339954 master-0 kubenswrapper[8606]: I1204 22:00:35.339934 8606 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 04 22:00:35.340032 master-0 kubenswrapper[8606]: I1204 22:00:35.339956 8606 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 04 22:00:35.340032 master-0 kubenswrapper[8606]: I1204 22:00:35.339974 8606 kubelet.go:324] "Adding apiserver pod source" Dec 04 22:00:35.340032 master-0 kubenswrapper[8606]: I1204 22:00:35.339997 8606 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 04 22:00:35.342403 master-0 kubenswrapper[8606]: I1204 22:00:35.342057 8606 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 04 22:00:35.342403 master-0 
kubenswrapper[8606]: I1204 22:00:35.342352 8606 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 04 22:00:35.343636 master-0 kubenswrapper[8606]: I1204 22:00:35.343325 8606 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 04 22:00:35.343636 master-0 kubenswrapper[8606]: I1204 22:00:35.343548 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 04 22:00:35.343636 master-0 kubenswrapper[8606]: I1204 22:00:35.343571 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 04 22:00:35.343636 master-0 kubenswrapper[8606]: I1204 22:00:35.343591 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 04 22:00:35.343636 master-0 kubenswrapper[8606]: I1204 22:00:35.343610 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 04 22:00:35.343636 master-0 kubenswrapper[8606]: I1204 22:00:35.343628 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 04 22:00:35.343636 master-0 kubenswrapper[8606]: I1204 22:00:35.343646 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 04 22:00:35.343923 master-0 kubenswrapper[8606]: I1204 22:00:35.343659 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 04 22:00:35.343923 master-0 kubenswrapper[8606]: I1204 22:00:35.343676 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 04 22:00:35.343923 master-0 kubenswrapper[8606]: I1204 22:00:35.343689 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 04 22:00:35.343923 master-0 kubenswrapper[8606]: I1204 22:00:35.343707 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 04 22:00:35.343923 master-0 kubenswrapper[8606]: I1204 22:00:35.343734 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 04 22:00:35.343923 master-0 kubenswrapper[8606]: I1204 22:00:35.343763 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 04 22:00:35.343923 master-0 kubenswrapper[8606]: I1204 22:00:35.343823 8606 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 04 22:00:35.344768 master-0 kubenswrapper[8606]: I1204 22:00:35.344456 8606 server.go:1280] "Started kubelet" Dec 04 22:00:35.344768 master-0 kubenswrapper[8606]: I1204 22:00:35.344567 8606 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 04 22:00:35.345144 master-0 kubenswrapper[8606]: I1204 22:00:35.345041 8606 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 04 22:00:35.345144 master-0 kubenswrapper[8606]: I1204 22:00:35.345121 8606 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 04 22:00:35.346449 master-0 kubenswrapper[8606]: I1204 22:00:35.345965 8606 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 04 22:00:35.348247 master-0 kubenswrapper[8606]: I1204 22:00:35.346464 8606 server.go:449] "Adding debug handlers to kubelet server" Dec 04 22:00:35.348069 master-0 systemd[1]: Started Kubernetes Kubelet. 
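The "Creating Container Manager object based on Node Config" entry above embeds the effective container-manager settings as one JSON object: system-reserved resources (500m CPU, 1Gi memory, 1Gi ephemeral-storage), five hard-eviction thresholds, the systemd cgroup driver, and a pod PID limit of 4096. A minimal Python sketch for pulling that object back out of the journal text and listing the thresholds; it assumes the journal has been saved to a file named kubelet.log, which is a hypothetical name:

import json

# Read the saved journal text (hypothetical filename).
with open("kubelet.log") as fh:
    text = fh.read()

# The kubelet prints the config as 'nodeConfig={...}'; raw_decode parses
# exactly one JSON object starting at that position, nesting included.
start = text.index("nodeConfig=") + len("nodeConfig=")
cfg, _ = json.JSONDecoder().raw_decode(text[start:])

print("SystemReserved:", cfg["SystemReserved"])
for t in cfg["HardEvictionThresholds"]:
    print(t["Signal"], t["Operator"], t["Value"])

Run against the entry above, this prints the reserved 500m/1Gi/1Gi block and lines such as nodefs.inodesFree LessThan {'Quantity': None, 'Percentage': 0.05} and memory.available LessThan {'Quantity': '100Mi', 'Percentage': 0}.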
Dec 04 22:00:35.353421 master-0 kubenswrapper[8606]: I1204 22:00:35.353342 8606 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 04 22:00:35.353612 master-0 kubenswrapper[8606]: I1204 22:00:35.353427 8606 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 04 22:00:35.353946 master-0 kubenswrapper[8606]: I1204 22:00:35.353827 8606 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-05 21:50:08 +0000 UTC, rotation deadline is 2025-12-05 15:52:16.832667045 +0000 UTC Dec 04 22:00:35.353946 master-0 kubenswrapper[8606]: I1204 22:00:35.353940 8606 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h51m41.478731225s for next certificate rotation Dec 04 22:00:35.354386 master-0 kubenswrapper[8606]: I1204 22:00:35.354289 8606 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 22:00:35.355039 master-0 kubenswrapper[8606]: I1204 22:00:35.354806 8606 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 04 22:00:35.355039 master-0 kubenswrapper[8606]: I1204 22:00:35.354880 8606 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 04 22:00:35.356400 master-0 kubenswrapper[8606]: I1204 22:00:35.354628 8606 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 04 22:00:35.358700 master-0 kubenswrapper[8606]: I1204 22:00:35.358613 8606 factory.go:55] Registering systemd factory Dec 04 22:00:35.358700 master-0 kubenswrapper[8606]: I1204 22:00:35.358645 8606 factory.go:221] Registration of the systemd container factory successfully Dec 04 22:00:35.359554 master-0 kubenswrapper[8606]: I1204 22:00:35.359477 8606 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 22:00:35.360299 master-0 kubenswrapper[8606]: I1204 22:00:35.360153 8606 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 22:00:35.361200 master-0 kubenswrapper[8606]: I1204 22:00:35.360800 8606 factory.go:153] Registering CRI-O factory Dec 04 22:00:35.361200 master-0 kubenswrapper[8606]: I1204 22:00:35.360875 8606 factory.go:221] Registration of the crio container factory successfully Dec 04 22:00:35.361200 master-0 kubenswrapper[8606]: I1204 22:00:35.360982 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="871cb002-67f4-43aa-a41d-7a5b2f340059" volumeName="kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg" seLinuxMountContext="" Dec 04 22:00:35.361200 master-0 kubenswrapper[8606]: I1204 22:00:35.361037 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cedb0b3e-674e-40b9-a10d-45a9f0c5c59c" volumeName="kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script" seLinuxMountContext="" Dec 04 22:00:35.361200 master-0 kubenswrapper[8606]: I1204 22:00:35.361048 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="690b447a-19c0-4925-bc9d-d0c86a83a377" volumeName="kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.361200 master-0 kubenswrapper[8606]: I1204 22:00:35.361060 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46229484-5fa1-4595-94a0-44477abae90e" 
volumeName="kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config" seLinuxMountContext="" Dec 04 22:00:35.361200 master-0 kubenswrapper[8606]: I1204 22:00:35.361062 8606 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 04 22:00:35.361200 master-0 kubenswrapper[8606]: I1204 22:00:35.361148 8606 factory.go:103] Registering Raw factory Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361069 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="465637a4-42be-4a65-a859-7af699960138" volumeName="kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361273 8606 manager.go:1196] Started watching for new ooms in manager Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361391 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c1df6-de4d-4e26-8c71-d39311cae0ce" volumeName="kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361590 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="690b447a-19c0-4925-bc9d-d0c86a83a377" volumeName="kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361690 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74b7c644-ad97-4009-aac7-550edabc55ae" volumeName="kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361775 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e065179e-634a-4cbe-bb59-5b01c514e4de" volumeName="kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361813 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" volumeName="kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361849 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="465637a4-42be-4a65-a859-7af699960138" volumeName="kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361876 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76fd9f44-4365-4271-8772-025655c50334" volumeName="kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361898 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" 
volumeName="kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361953 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" volumeName="kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361968 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76fd9f44-4365-4271-8772-025655c50334" volumeName="kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.361984 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a544105a-5bec-456a-aef6-c160943c1f67" volumeName="kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362000 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="addddaac-a31a-4dbf-b78f-87225b11b463" volumeName="kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362015 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362038 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24648a41-875f-4e98-8b21-3bdd38dffa32" volumeName="kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362053 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="35821f48-b000-4915-847f-a739b6efc5ee" volumeName="kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362100 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362115 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" volumeName="kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362131 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c1df6-de4d-4e26-8c71-d39311cae0ce" volumeName="kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362145 8606 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="76fd9f44-4365-4271-8772-025655c50334" volumeName="kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362164 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76fd9f44-4365-4271-8772-025655c50334" volumeName="kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362183 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f091088-2166-4026-9fa6-62bd83407edb" volumeName="kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362205 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24648a41-875f-4e98-8b21-3bdd38dffa32" volumeName="kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362222 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362241 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f091088-2166-4026-9fa6-62bd83407edb" volumeName="kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362257 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362274 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362290 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362305 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="690b447a-19c0-4925-bc9d-d0c86a83a377" volumeName="kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362323 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e" volumeName="kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 
kubenswrapper[8606]: I1204 22:00:35.362341 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="512ba6af-11ad-4217-a1ce-a2ab3ef67ec5" volumeName="kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362369 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f091088-2166-4026-9fa6-62bd83407edb" volumeName="kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362389 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" volumeName="kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362411 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e37d318a-5bf8-46ed-b6de-494102738da7" volumeName="kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362440 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362461 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0beb871c-3bf1-471c-a028-746a650267bf" volumeName="kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362484 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" volumeName="kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362525 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56f25fad-089d-4df6-abb1-10d4c76750f1" volumeName="kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362556 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c1df6-de4d-4e26-8c71-d39311cae0ce" volumeName="kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362579 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362608 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" 
volumeName="kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362632 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle" seLinuxMountContext="" Dec 04 22:00:35.362681 master-0 kubenswrapper[8606]: I1204 22:00:35.362653 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0beb871c-3bf1-471c-a028-746a650267bf" volumeName="kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.362933 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a544105a-5bec-456a-aef6-c160943c1f67" volumeName="kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.362968 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c8c45e0-2342-499b-aa6b-339b6a722a87" volumeName="kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.362991 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c8c45e0-2342-499b-aa6b-339b6a722a87" volumeName="kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363015 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="813f3ee7-35b5-4ee8-b453-00d16d910eae" volumeName="kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363037 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" volumeName="kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363161 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363201 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56f25fad-089d-4df6-abb1-10d4c76750f1" volumeName="kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363238 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56f25fad-089d-4df6-abb1-10d4c76750f1" volumeName="kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363269 8606 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="c6a5d14d-0409-4024-b0a8-200fa2594185" volumeName="kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363302 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363332 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cedb0b3e-674e-40b9-a10d-45a9f0c5c59c" volumeName="kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363350 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="35821f48-b000-4915-847f-a739b6efc5ee" volumeName="kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363367 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c1df6-de4d-4e26-8c71-d39311cae0ce" volumeName="kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363384 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c6a5d14d-0409-4024-b0a8-200fa2594185" volumeName="kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363411 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363433 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e065179e-634a-4cbe-bb59-5b01c514e4de" volumeName="kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363453 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="512ba6af-11ad-4217-a1ce-a2ab3ef67ec5" volumeName="kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363476 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="35821f48-b000-4915-847f-a739b6efc5ee" volumeName="kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363545 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46229484-5fa1-4595-94a0-44477abae90e" volumeName="kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.363786 
master-0 kubenswrapper[8606]: I1204 22:00:35.363576 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363607 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="addddaac-a31a-4dbf-b78f-87225b11b463" volumeName="kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363636 8606 manager.go:319] Starting recovery of all containers Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363630 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24648a41-875f-4e98-8b21-3bdd38dffa32" volumeName="kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363758 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="465637a4-42be-4a65-a859-7af699960138" volumeName="kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets" seLinuxMountContext="" Dec 04 22:00:35.363786 master-0 kubenswrapper[8606]: I1204 22:00:35.363813 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c8c45e0-2342-499b-aa6b-339b6a722a87" volumeName="kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config" seLinuxMountContext="" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.363842 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="871cb002-67f4-43aa-a41d-7a5b2f340059" volumeName="kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls" seLinuxMountContext="" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.363861 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e065179e-634a-4cbe-bb59-5b01c514e4de" volumeName="kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access" seLinuxMountContext="" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.363890 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config" seLinuxMountContext="" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.363914 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46229484-5fa1-4595-94a0-44477abae90e" volumeName="kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f" seLinuxMountContext="" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.363929 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a544105a-5bec-456a-aef6-c160943c1f67" volumeName="kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6" seLinuxMountContext="" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.363945 8606 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="addddaac-a31a-4dbf-b78f-87225b11b463" volumeName="kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l" seLinuxMountContext="" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.363960 8606 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74b7c644-ad97-4009-aac7-550edabc55ae" volumeName="kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca" seLinuxMountContext="" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.363981 8606 reconstruct.go:97] "Volume reconstruction finished" Dec 04 22:00:35.364416 master-0 kubenswrapper[8606]: I1204 22:00:35.364000 8606 reconciler.go:26] "Reconciler: start to sync state" Dec 04 22:00:35.367123 master-0 kubenswrapper[8606]: I1204 22:00:35.367092 8606 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 04 22:00:35.387739 master-0 kubenswrapper[8606]: I1204 22:00:35.387682 8606 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 04 22:00:35.390338 master-0 kubenswrapper[8606]: I1204 22:00:35.390297 8606 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 04 22:00:35.390423 master-0 kubenswrapper[8606]: I1204 22:00:35.390356 8606 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 04 22:00:35.390423 master-0 kubenswrapper[8606]: I1204 22:00:35.390389 8606 kubelet.go:2335] "Starting kubelet main sync loop" Dec 04 22:00:35.390485 master-0 kubenswrapper[8606]: E1204 22:00:35.390448 8606 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 04 22:00:35.392906 master-0 kubenswrapper[8606]: I1204 22:00:35.392864 8606 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 22:00:35.397540 master-0 kubenswrapper[8606]: I1204 22:00:35.397470 8606 generic.go:334] "Generic (PLEG): container finished" podID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerID="719d3f66cbdb2170aefa60d42b234f7eb81fd7d5f45e585cd2b86f0e36930c80" exitCode=0 Dec 04 22:00:35.407490 master-0 kubenswrapper[8606]: I1204 22:00:35.407455 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 04 22:00:35.410083 master-0 kubenswrapper[8606]: I1204 22:00:35.410038 8606 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d" exitCode=1 Dec 04 22:00:35.410083 master-0 kubenswrapper[8606]: I1204 22:00:35.410074 8606 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="7ab8b346978ad6f1cf331a2cd2e464eb81897737f06c530111b331aae07ed9d5" exitCode=0 Dec 04 22:00:35.423387 master-0 kubenswrapper[8606]: I1204 22:00:35.423339 8606 generic.go:334] "Generic (PLEG): container finished" podID="d75143d9bc4a2dc15781dc51ccff632a" containerID="2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8" exitCode=0 Dec 04 22:00:35.435836 master-0 kubenswrapper[8606]: I1204 22:00:35.435366 8606 generic.go:334] "Generic (PLEG): container finished" podID="59d3d0d8-1a2a-4d14-8312-d33818acba88" containerID="d14cbc85e41a76d9831e3cb322a42ef6928588924655708cdbc5b0d0983944d9" exitCode=0 Dec 04 
22:00:35.446727 master-0 kubenswrapper[8606]: I1204 22:00:35.446682 8606 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="758bcdf683109d822a1017f454c5645fc9f981b1015625c2d5ef493072ef4678" exitCode=0 Dec 04 22:00:35.446727 master-0 kubenswrapper[8606]: I1204 22:00:35.446718 8606 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="3574d633e7308db3b6dd662bd037451e5d0ed5c34c61a73c66397c77d3caf66e" exitCode=0 Dec 04 22:00:35.446727 master-0 kubenswrapper[8606]: I1204 22:00:35.446728 8606 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="189119b91f6e6ef0f62e51f0cc69d03fbbc0144ce142853e62f56609d2029b1d" exitCode=0 Dec 04 22:00:35.446857 master-0 kubenswrapper[8606]: I1204 22:00:35.446738 8606 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="eade6c63cfbfd85793c4e11745edd4d5a786bcef37074f29af89908e936863d7" exitCode=0 Dec 04 22:00:35.446857 master-0 kubenswrapper[8606]: I1204 22:00:35.446748 8606 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="3903951768e93b52af44e2ee6090549f67bc30f2eeffd34acda2b5e56323b0df" exitCode=0 Dec 04 22:00:35.446857 master-0 kubenswrapper[8606]: I1204 22:00:35.446758 8606 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="20fdbd8f60e4052a44e37c80c735da9d3ff66c7350cb568fd169c055622f648f" exitCode=0 Dec 04 22:00:35.458666 master-0 kubenswrapper[8606]: I1204 22:00:35.458605 8606 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="183ef4cdec698bdfc64b154e7dcc6b79ea7ba6ed603e295f5859a3a5c552d57d" exitCode=1 Dec 04 22:00:35.463363 master-0 kubenswrapper[8606]: I1204 22:00:35.463312 8606 generic.go:334] "Generic (PLEG): container finished" podID="5c492425-adf8-424f-ac19-f465071857f9" containerID="d2ec9d7da1c0e81ac2a2563a5da4eba0b637698001afaf92060cbb9b07bcf2c4" exitCode=0 Dec 04 22:00:35.489725 master-0 kubenswrapper[8606]: I1204 22:00:35.489697 8606 manager.go:324] Recovery completed Dec 04 22:00:35.490669 master-0 kubenswrapper[8606]: E1204 22:00:35.490637 8606 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 04 22:00:35.528915 master-0 kubenswrapper[8606]: I1204 22:00:35.528873 8606 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 04 22:00:35.528915 master-0 kubenswrapper[8606]: I1204 22:00:35.528906 8606 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 04 22:00:35.528915 master-0 kubenswrapper[8606]: I1204 22:00:35.528935 8606 state_mem.go:36] "Initialized new in-memory state store" Dec 04 22:00:35.529307 master-0 kubenswrapper[8606]: I1204 22:00:35.529235 8606 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 04 22:00:35.529357 master-0 kubenswrapper[8606]: I1204 22:00:35.529312 8606 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 04 22:00:35.529357 master-0 kubenswrapper[8606]: I1204 22:00:35.529351 8606 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Dec 04 22:00:35.529410 master-0 kubenswrapper[8606]: I1204 22:00:35.529362 8606 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Dec 04 22:00:35.529410 master-0 kubenswrapper[8606]: I1204 22:00:35.529374 8606 policy_none.go:49] "None policy: Start" 
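
The entries immediately above show the kubelet starting its CPU manager with the "none" policy, a 10s reconcile period, and an in-memory state store; the entries that follow start the memory manager with its "None" policy. For reference only, these settings correspond to KubeletConfiguration fields along the lines of the sketch below (an illustrative sketch, not this node's actual rendered config):

  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  # "none" leaves CPU assignment to the default scheduler; "static" would allow
  # exclusive cores for Guaranteed pods with integer CPU requests.
  cpuManagerPolicy: none
  cpuManagerReconcilePeriod: 10s   # matches the reconcilePeriod="10s" logged above
  # "None" disables NUMA-aware memory pinning; "Static" would reserve memory per NUMA node.
  memoryManagerPolicy: None
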
Dec 04 22:00:35.530896 master-0 kubenswrapper[8606]: I1204 22:00:35.530873 8606 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 04 22:00:35.530945 master-0 kubenswrapper[8606]: I1204 22:00:35.530913 8606 state_mem.go:35] "Initializing new in-memory state store" Dec 04 22:00:35.531314 master-0 kubenswrapper[8606]: I1204 22:00:35.531296 8606 state_mem.go:75] "Updated machine memory state" Dec 04 22:00:35.531382 master-0 kubenswrapper[8606]: I1204 22:00:35.531319 8606 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Dec 04 22:00:35.544202 master-0 kubenswrapper[8606]: I1204 22:00:35.544172 8606 manager.go:334] "Starting Device Plugin manager" Dec 04 22:00:35.544509 master-0 kubenswrapper[8606]: I1204 22:00:35.544475 8606 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 04 22:00:35.544553 master-0 kubenswrapper[8606]: I1204 22:00:35.544497 8606 server.go:79] "Starting device plugin registration server" Dec 04 22:00:35.545680 master-0 kubenswrapper[8606]: I1204 22:00:35.545652 8606 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 04 22:00:35.545839 master-0 kubenswrapper[8606]: I1204 22:00:35.545684 8606 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 04 22:00:35.547358 master-0 kubenswrapper[8606]: I1204 22:00:35.546225 8606 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 04 22:00:35.547482 master-0 kubenswrapper[8606]: I1204 22:00:35.547467 8606 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 04 22:00:35.547548 master-0 kubenswrapper[8606]: I1204 22:00:35.547486 8606 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 04 22:00:35.646261 master-0 kubenswrapper[8606]: I1204 22:00:35.646138 8606 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 22:00:35.647735 master-0 kubenswrapper[8606]: I1204 22:00:35.647705 8606 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 04 22:00:35.647797 master-0 kubenswrapper[8606]: I1204 22:00:35.647750 8606 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 04 22:00:35.647797 master-0 kubenswrapper[8606]: I1204 22:00:35.647780 8606 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 04 22:00:35.647869 master-0 kubenswrapper[8606]: I1204 22:00:35.647860 8606 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 04 22:00:35.660282 master-0 kubenswrapper[8606]: I1204 22:00:35.660252 8606 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Dec 04 22:00:35.660366 master-0 kubenswrapper[8606]: I1204 22:00:35.660345 8606 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 04 22:00:35.691868 master-0 kubenswrapper[8606]: I1204 22:00:35.691755 8606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Dec 04 22:00:35.692780 master-0 kubenswrapper[8606]: I1204 22:00:35.692725 8606 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64" Dec 04 22:00:35.692897 master-0 kubenswrapper[8606]: I1204 22:00:35.692793 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"587901d613877303166a73aefe83b729a828ee57d294468839ecb48ee62967aa"} Dec 04 22:00:35.692976 master-0 kubenswrapper[8606]: I1204 22:00:35.692903 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d"} Dec 04 22:00:35.692976 master-0 kubenswrapper[8606]: I1204 22:00:35.692929 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"7ab8b346978ad6f1cf331a2cd2e464eb81897737f06c530111b331aae07ed9d5"} Dec 04 22:00:35.692976 master-0 kubenswrapper[8606]: I1204 22:00:35.692951 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94"} Dec 04 22:00:35.692976 master-0 kubenswrapper[8606]: I1204 22:00:35.692971 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"b4b557e71fac173d7ebddbf04536e46989f934644030fceea9234231919b8e8f"} Dec 04 22:00:35.693109 master-0 kubenswrapper[8606]: I1204 22:00:35.692991 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"75e4e520a75639c893eb6ea15b07a3187aaf4dfc898564bd4832b04c7d30a431"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693012 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"4c79666e90f7715124e93544d73c2c9b1066ea79ae6d56c7b16c532bd0846566"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693261 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693285 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693350 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerDied","Data":"2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8"} Dec 04 22:00:35.695939 
master-0 kubenswrapper[8606]: I1204 22:00:35.693373 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693469 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"d00575d56d81b17e8c0212c4cad634fe2f3afd13660a8de6afbe8a4381dd50d7"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693488 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693690 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693715 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"70ec2f528f522213daf96bac275fda7cf7f15b026ed56e4b58dab19aaca3bd29"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693732 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"183ef4cdec698bdfc64b154e7dcc6b79ea7ba6ed603e295f5859a3a5c552d57d"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693754 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154"} Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693781 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6c99a1def9360d6a4883701478a0eaa20d1a5711ce6f5867cfe014cc60feead" Dec 04 22:00:35.695939 master-0 kubenswrapper[8606]: I1204 22:00:35.693799 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2790dad3d78d4a3d56be2a118c350f47bfd8d53723c6fb4f92908cfb1c9d89a4" Dec 04 22:00:35.706268 master-0 kubenswrapper[8606]: E1204 22:00:35.706130 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:00:35.707090 master-0 kubenswrapper[8606]: W1204 22:00:35.707045 8606 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set 
securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 04 22:00:35.707225 master-0 kubenswrapper[8606]: E1204 22:00:35.707095 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:00:35.707843 master-0 kubenswrapper[8606]: E1204 22:00:35.707619 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.707979 master-0 kubenswrapper[8606]: E1204 22:00:35.707960 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769141 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769205 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769242 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769266 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769293 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769337 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: 
\"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769362 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769392 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769419 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769443 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769465 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769490 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769573 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769601 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.769917 master-0 
kubenswrapper[8606]: I1204 22:00:35.769624 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769646 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.769917 master-0 kubenswrapper[8606]: I1204 22:00:35.769670 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.808378 master-0 kubenswrapper[8606]: E1204 22:00:35.808301 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870753 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870803 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870836 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870858 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870881 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " 
pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870897 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870914 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870931 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870948 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870972 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.870987 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871004 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871022 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871040 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" 
(UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871056 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871073 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871091 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871163 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871223 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871248 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871272 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871294 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871317 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: 
\"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871339 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871361 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871383 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871403 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871425 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871445 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871467 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871488 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871521 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871542 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:35.871592 master-0 kubenswrapper[8606]: I1204 22:00:35.871564 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:36.341130 master-0 kubenswrapper[8606]: I1204 22:00:36.341072 8606 apiserver.go:52] "Watching apiserver" Dec 04 22:00:36.355124 master-0 kubenswrapper[8606]: I1204 22:00:36.351439 8606 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 22:00:36.355124 master-0 kubenswrapper[8606]: I1204 22:00:36.352281 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0","openshift-etcd/etcd-master-0-master-0","openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb","openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb","openshift-ingress-operator/ingress-operator-8649c48786-qlkgh","openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg","openshift-network-node-identity/network-node-identity-nk92d","openshift-network-operator/network-operator-79767b7ff9-8lq7w","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b","openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-marketplace/marketplace-operator-f797b99b6-m9m4h","openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz","kube-system/bootstrap-kube-controller-manager-master-0","openshift-dns-operator/dns-operator-7c56cf9b74-sshsd","openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq","openshift-multus/multus-dgpw9","openshift-network-operator/iptables-alerter-c747h","openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk","openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-multus/multus-additional-cni-plugins-5tpnf","openshift-network-diagnostics/network-check-target-6jkkl","openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz","assisted-installer/assisted-installer-controller-mxfnl","openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p","openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj","openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp","openshift-ovn-kubernetes/ovnkube-node-8nxc5","openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn","openshift-multus/network
-metrics-daemon-9pfhj","openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5","openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs"] Dec 04 22:00:36.355124 master-0 kubenswrapper[8606]: I1204 22:00:36.354018 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 22:00:36.363373 master-0 kubenswrapper[8606]: I1204 22:00:36.363303 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:36.363790 master-0 kubenswrapper[8606]: I1204 22:00:36.363746 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.363907 master-0 kubenswrapper[8606]: I1204 22:00:36.363743 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:36.364079 master-0 kubenswrapper[8606]: I1204 22:00:36.364024 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:36.364196 master-0 kubenswrapper[8606]: I1204 22:00:36.364154 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:36.364263 master-0 kubenswrapper[8606]: I1204 22:00:36.364240 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:36.364560 master-0 kubenswrapper[8606]: I1204 22:00:36.364545 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:36.364712 master-0 kubenswrapper[8606]: I1204 22:00:36.364668 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:36.365150 master-0 kubenswrapper[8606]: I1204 22:00:36.365101 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.365222 master-0 kubenswrapper[8606]: I1204 22:00:36.365116 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.365267 master-0 kubenswrapper[8606]: I1204 22:00:36.365201 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.365424 master-0 kubenswrapper[8606]: I1204 22:00:36.365384 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.369024 master-0 kubenswrapper[8606]: I1204 22:00:36.369000 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.371168 master-0 kubenswrapper[8606]: I1204 22:00:36.371136 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 22:00:36.371431 master-0 kubenswrapper[8606]: I1204 22:00:36.371374 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 22:00:36.371651 master-0 kubenswrapper[8606]: I1204 22:00:36.371622 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 22:00:36.371886 master-0 kubenswrapper[8606]: I1204 22:00:36.371844 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 22:00:36.372723 master-0 kubenswrapper[8606]: I1204 22:00:36.372697 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 22:00:36.372796 master-0 kubenswrapper[8606]: I1204 22:00:36.372735 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 22:00:36.372886 master-0 kubenswrapper[8606]: I1204 22:00:36.372862 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.372972 master-0 kubenswrapper[8606]: I1204 22:00:36.372947 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.373087 master-0 kubenswrapper[8606]: I1204 22:00:36.372881 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 22:00:36.376980 master-0 kubenswrapper[8606]: I1204 22:00:36.376935 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.377151 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.377338 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.377383 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.377708 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.377720 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378062 8606 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378072 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcw8f\" (UniqueName: \"kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378116 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378150 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378178 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378209 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378232 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378257 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378284 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrr5\" (UniqueName: \"kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: 
\"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378309 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378336 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378365 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378399 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wh6b\" (UniqueName: \"kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378427 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378456 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378537 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378573 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwk6f\" (UniqueName: \"kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " 
pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378602 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378648 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378234 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 22:00:36.378679 master-0 kubenswrapper[8606]: I1204 22:00:36.378759 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.378678 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.378284 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.378850 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.378324 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.378974 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379028 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 
22:00:36.378440 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379066 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7l9n\" (UniqueName: \"kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379096 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379122 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4rft\" (UniqueName: \"kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379171 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379596 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379637 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379672 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379702 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token\") pod 
\"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379730 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379758 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379789 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379814 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379841 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379870 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379896 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379926 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 
22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.379988 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380044 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380091 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380093 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380210 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w592\" (UniqueName: \"kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380254 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380213 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380288 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380322 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib\") pod \"ovnkube-node-8nxc5\" (UID: 
\"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380346 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380454 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380482 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380532 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380567 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lclkg\" (UniqueName: \"kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380594 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380621 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.380647 master-0 kubenswrapper[8606]: I1204 22:00:36.380652 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:36.380647 master-0 
kubenswrapper[8606]: I1204 22:00:36.380693 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.380778 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.380819 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.380866 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381000 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381128 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381158 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381197 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381225 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381253 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381280 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381351 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381367 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381395 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381423 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scht6\" (UniqueName: \"kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381448 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381471 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381495 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381536 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381562 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381627 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381710 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381774 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381763 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381866 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381880 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381902 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381919 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381965 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" 
(UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.381974 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382055 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382126 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382141 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr65l\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382191 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382189 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgt75\" (UniqueName: \"kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382248 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382282 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382299 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382378 8606 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382391 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382414 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382436 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382417 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j8fr\" (UniqueName: \"kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382577 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382612 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382821 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxkk\" (UniqueName: \"kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382877 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.382919 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383055 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383155 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383174 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r57bb\" (UniqueName: \"kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb\") pod \"csi-snapshot-controller-operator-6bc8656fdc-xhndk\" (UID: \"e37d318a-5bf8-46ed-b6de-494102738da7\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383164 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383345 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfklr\" (UniqueName: \"kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383416 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383434 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383487 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 22:00:36.383617 master-0 kubenswrapper[8606]: I1204 22:00:36.383661 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.383801 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 
22:00:36.383863 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.383918 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mttq\" (UniqueName: \"kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384033 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384064 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384114 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384153 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384167 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384186 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384300 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s5t2f\" (UniqueName: \"kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384337 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384355 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384375 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384393 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384413 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384433 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384456 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384480 8606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384530 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384555 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8d54\" (UniqueName: \"kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384620 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384641 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nkh\" (UniqueName: \"kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384662 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384680 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384752 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384775 8606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gcgg9\" (UniqueName: \"kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384800 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384821 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384843 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384863 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384887 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384951 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cct\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.384972 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbpk\" (UniqueName: \"kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385031 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385050 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385073 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385082 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385093 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385209 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385240 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385273 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385354 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385367 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " 
pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385466 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385469 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385607 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385661 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385817 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.385847 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.386187 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.386405 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.386412 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.386604 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.386621 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 22:00:36.387596 master-0 kubenswrapper[8606]: I1204 22:00:36.386619 8606 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 04 22:00:36.390946 master-0 kubenswrapper[8606]: I1204 22:00:36.388114 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 22:00:36.390946 master-0 kubenswrapper[8606]: I1204 22:00:36.388302 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.390946 master-0 kubenswrapper[8606]: I1204 22:00:36.388424 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 22:00:36.390946 master-0 kubenswrapper[8606]: I1204 22:00:36.389722 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 22:00:36.390946 master-0 kubenswrapper[8606]: I1204 22:00:36.389926 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 22:00:36.390946 master-0 kubenswrapper[8606]: I1204 22:00:36.390044 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.392620 master-0 kubenswrapper[8606]: I1204 22:00:36.392567 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 22:00:36.393297 master-0 kubenswrapper[8606]: I1204 22:00:36.393246 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 22:00:36.393549 master-0 kubenswrapper[8606]: I1204 22:00:36.393478 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.393847 master-0 kubenswrapper[8606]: I1204 22:00:36.393798 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 22:00:36.394396 master-0 kubenswrapper[8606]: I1204 22:00:36.394347 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.394666 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.394763 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.395099 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.395551 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.395644 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 
22:00:36.395833 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.396133 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.396170 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.397163 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.397486 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.397891 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.398217 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.398275 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.398565 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.398603 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.398707 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.398846 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.399026 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.399072 
8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.399109 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.399301 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.399424 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.399757 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.399948 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.400144 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.400206 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.400280 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.400183 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.400817 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.401086 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.399047 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.401251 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.400285 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.401548 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.402908 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:36.403002 master-0 kubenswrapper[8606]: I1204 22:00:36.402913 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.403154 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.403471 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.403685 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.404077 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.404100 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.404671 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.404926 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.405148 8606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.405553 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.405599 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.405724 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.405907 master-0 kubenswrapper[8606]: I1204 22:00:36.405761 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:36.410910 master-0 kubenswrapper[8606]: I1204 22:00:36.410833 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.411271 master-0 kubenswrapper[8606]: I1204 22:00:36.411092 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 04 22:00:36.411570 master-0 kubenswrapper[8606]: I1204 22:00:36.411522 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:00:36.411570 master-0 kubenswrapper[8606]: I1204 22:00:36.411554 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 22:00:36.412792 master-0 kubenswrapper[8606]: I1204 22:00:36.412002 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " 
pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.412792 master-0 kubenswrapper[8606]: I1204 22:00:36.412016 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 04 22:00:36.412792 master-0 kubenswrapper[8606]: I1204 22:00:36.412186 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:36.412792 master-0 kubenswrapper[8606]: I1204 22:00:36.412343 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.412792 master-0 kubenswrapper[8606]: I1204 22:00:36.412343 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.412792 master-0 kubenswrapper[8606]: I1204 22:00:36.412420 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 22:00:36.412792 master-0 kubenswrapper[8606]: I1204 22:00:36.412625 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.420110 master-0 kubenswrapper[8606]: I1204 22:00:36.415286 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 22:00:36.420110 master-0 kubenswrapper[8606]: I1204 22:00:36.419313 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:00:36.420368 master-0 kubenswrapper[8606]: I1204 22:00:36.420207 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:36.421038 master-0 kubenswrapper[8606]: I1204 22:00:36.420964 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" 
(UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.421791 master-0 kubenswrapper[8606]: I1204 22:00:36.421754 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 22:00:36.422087 master-0 kubenswrapper[8606]: I1204 22:00:36.422018 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 22:00:36.423491 master-0 kubenswrapper[8606]: I1204 22:00:36.423451 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 04 22:00:36.426001 master-0 kubenswrapper[8606]: I1204 22:00:36.425951 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:36.426190 master-0 kubenswrapper[8606]: I1204 22:00:36.426051 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.432969 master-0 kubenswrapper[8606]: I1204 22:00:36.432924 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:36.433580 master-0 kubenswrapper[8606]: I1204 22:00:36.433529 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcw8f\" (UniqueName: \"kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:36.433889 master-0 kubenswrapper[8606]: I1204 22:00:36.433785 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.437293 master-0 kubenswrapper[8606]: I1204 22:00:36.437232 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrr5\" (UniqueName: \"kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.456989 master-0 kubenswrapper[8606]: I1204 22:00:36.456937 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwk6f\" (UniqueName: 
\"kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:00:36.457315 master-0 kubenswrapper[8606]: I1204 22:00:36.457042 8606 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 04 22:00:36.475402 master-0 kubenswrapper[8606]: I1204 22:00:36.475374 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4rft\" (UniqueName: \"kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.486486 master-0 kubenswrapper[8606]: I1204 22:00:36.486437 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.486593 master-0 kubenswrapper[8606]: I1204 22:00:36.486536 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.486668 master-0 kubenswrapper[8606]: I1204 22:00:36.486628 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.486728 master-0 kubenswrapper[8606]: I1204 22:00:36.486676 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.486728 master-0 kubenswrapper[8606]: I1204 22:00:36.486716 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:36.486860 master-0 kubenswrapper[8606]: I1204 22:00:36.486814 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.486912 master-0 kubenswrapper[8606]: I1204 22:00:36.486876 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.486952 master-0 kubenswrapper[8606]: I1204 22:00:36.486912 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.486991 master-0 kubenswrapper[8606]: I1204 22:00:36.486963 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.487103 master-0 kubenswrapper[8606]: I1204 22:00:36.487074 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.487187 master-0 kubenswrapper[8606]: I1204 22:00:36.486709 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.487977 master-0 kubenswrapper[8606]: I1204 22:00:36.487936 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.491820 master-0 kubenswrapper[8606]: I1204 22:00:36.491763 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.492181 master-0 kubenswrapper[8606]: I1204 22:00:36.492145 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.492264 master-0 kubenswrapper[8606]: I1204 22:00:36.492149 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.492411 master-0 kubenswrapper[8606]: I1204 22:00:36.492280 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.492411 master-0 kubenswrapper[8606]: I1204 22:00:36.492398 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.492411 master-0 kubenswrapper[8606]: I1204 22:00:36.492283 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.492777 master-0 kubenswrapper[8606]: I1204 22:00:36.492738 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.492835 master-0 kubenswrapper[8606]: I1204 22:00:36.492776 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.492883 master-0 kubenswrapper[8606]: I1204 22:00:36.492835 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.492883 master-0 kubenswrapper[8606]: I1204 22:00:36.492850 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.492991 master-0 kubenswrapper[8606]: I1204 22:00:36.492954 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.493189 master-0 kubenswrapper[8606]: I1204 22:00:36.493147 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.493302 master-0 kubenswrapper[8606]: I1204 22:00:36.493268 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.493361 master-0 kubenswrapper[8606]: E1204 22:00:36.493335 8606 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:36.493361 master-0 kubenswrapper[8606]: I1204 22:00:36.493341 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.493487 master-0 kubenswrapper[8606]: I1204 22:00:36.493456 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.493648 master-0 kubenswrapper[8606]: I1204 22:00:36.493618 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.494652 master-0 kubenswrapper[8606]: E1204 22:00:36.494623 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.994582724 +0000 UTC m=+1.804884959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:36.494727 master-0 kubenswrapper[8606]: I1204 22:00:36.494683 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.494808 master-0 kubenswrapper[8606]: I1204 22:00:36.494747 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:36.494808 master-0 kubenswrapper[8606]: I1204 22:00:36.494755 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.494808 master-0 kubenswrapper[8606]: I1204 22:00:36.494775 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.494925 master-0 kubenswrapper[8606]: I1204 22:00:36.494831 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.494925 master-0 kubenswrapper[8606]: E1204 22:00:36.494858 8606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 04 22:00:36.494925 master-0 kubenswrapper[8606]: E1204 22:00:36.494915 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.994887862 +0000 UTC m=+1.805190087 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : secret "metrics-daemon-secret" not found Dec 04 22:00:36.495040 master-0 kubenswrapper[8606]: I1204 22:00:36.495015 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.495085 master-0 kubenswrapper[8606]: I1204 22:00:36.495051 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495085 master-0 kubenswrapper[8606]: I1204 22:00:36.495078 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495173 master-0 kubenswrapper[8606]: I1204 22:00:36.495104 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495173 master-0 kubenswrapper[8606]: I1204 22:00:36.495105 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.495173 master-0 kubenswrapper[8606]: I1204 22:00:36.495122 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495292 master-0 kubenswrapper[8606]: I1204 22:00:36.495182 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495292 master-0 kubenswrapper[8606]: I1204 22:00:36.495247 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495370 master-0 kubenswrapper[8606]: I1204 22:00:36.495291 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") 
pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495409 master-0 kubenswrapper[8606]: I1204 22:00:36.495396 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495453 master-0 kubenswrapper[8606]: I1204 22:00:36.495437 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:36.495495 master-0 kubenswrapper[8606]: I1204 22:00:36.495471 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.495671 master-0 kubenswrapper[8606]: I1204 22:00:36.495647 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:36.495728 master-0 kubenswrapper[8606]: I1204 22:00:36.495687 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:36.495728 master-0 kubenswrapper[8606]: I1204 22:00:36.495716 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:36.495811 master-0 kubenswrapper[8606]: I1204 22:00:36.495771 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.495811 master-0 kubenswrapper[8606]: E1204 22:00:36.495785 8606 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:36.495885 master-0 kubenswrapper[8606]: E1204 22:00:36.495817 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. 
No retries permitted until 2025-12-04 22:00:36.995807176 +0000 UTC m=+1.806109391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:36.495947 master-0 kubenswrapper[8606]: E1204 22:00:36.495921 8606 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:36.495994 master-0 kubenswrapper[8606]: I1204 22:00:36.495942 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.495994 master-0 kubenswrapper[8606]: E1204 22:00:36.495981 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.99596828 +0000 UTC m=+1.806270505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:36.496079 master-0 kubenswrapper[8606]: I1204 22:00:36.496013 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.496442 master-0 kubenswrapper[8606]: I1204 22:00:36.496376 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.496442 master-0 kubenswrapper[8606]: I1204 22:00:36.496429 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.496550 master-0 kubenswrapper[8606]: I1204 22:00:36.496469 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:36.496550 master-0 kubenswrapper[8606]: I1204 22:00:36.496519 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-w2ndk\" (UniqueName: \"kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:36.496656 master-0 kubenswrapper[8606]: I1204 22:00:36.496555 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:00:36.496656 master-0 kubenswrapper[8606]: E1204 22:00:36.496565 8606 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:36.496656 master-0 kubenswrapper[8606]: I1204 22:00:36.496558 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:36.496656 master-0 kubenswrapper[8606]: E1204 22:00:36.496613 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.996594447 +0000 UTC m=+1.806896662 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:36.496834 master-0 kubenswrapper[8606]: I1204 22:00:36.496710 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.496834 master-0 kubenswrapper[8606]: I1204 22:00:36.496743 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:36.496834 master-0 kubenswrapper[8606]: I1204 22:00:36.496810 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:00:36.496941 master-0 kubenswrapper[8606]: I1204 22:00:36.496839 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.496941 master-0 kubenswrapper[8606]: I1204 22:00:36.496865 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.496941 master-0 kubenswrapper[8606]: E1204 22:00:36.496878 8606 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:36.496941 master-0 kubenswrapper[8606]: I1204 22:00:36.496892 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.496941 master-0 kubenswrapper[8606]: I1204 22:00:36.496920 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.497118 master-0 kubenswrapper[8606]: I1204 22:00:36.496958 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.497118 master-0 kubenswrapper[8606]: E1204 22:00:36.496972 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.996947017 +0000 UTC m=+1.807249222 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:36.497118 master-0 kubenswrapper[8606]: I1204 22:00:36.496891 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.497118 master-0 kubenswrapper[8606]: I1204 22:00:36.497037 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.497118 master-0 kubenswrapper[8606]: I1204 22:00:36.497062 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.497118 master-0 kubenswrapper[8606]: I1204 22:00:36.497063 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.497335 master-0 kubenswrapper[8606]: E1204 22:00:36.497134 8606 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:36.497335 master-0 kubenswrapper[8606]: I1204 22:00:36.497149 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.497335 master-0 kubenswrapper[8606]: E1204 22:00:36.497177 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.997163202 +0000 UTC m=+1.807465417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:36.497335 master-0 kubenswrapper[8606]: I1204 22:00:36.497225 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.497335 master-0 kubenswrapper[8606]: E1204 22:00:36.497248 8606 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 22:00:36.497335 master-0 kubenswrapper[8606]: E1204 22:00:36.497298 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.997279825 +0000 UTC m=+1.807582040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 22:00:36.497335 master-0 kubenswrapper[8606]: I1204 22:00:36.497284 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.497623 master-0 kubenswrapper[8606]: I1204 22:00:36.497344 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.497623 master-0 kubenswrapper[8606]: I1204 22:00:36.497370 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.497623 master-0 kubenswrapper[8606]: I1204 22:00:36.497452 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.497623 master-0 kubenswrapper[8606]: I1204 22:00:36.497455 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.497623 master-0 kubenswrapper[8606]: I1204 22:00:36.497521 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.497828 master-0 kubenswrapper[8606]: I1204 22:00:36.497665 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.497828 master-0 kubenswrapper[8606]: I1204 22:00:36.497732 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:36.497828 master-0 kubenswrapper[8606]: I1204 22:00:36.497756 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.497951 master-0 kubenswrapper[8606]: E1204 22:00:36.497874 8606 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:36.497951 master-0 kubenswrapper[8606]: I1204 22:00:36.497880 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:36.498027 master-0 kubenswrapper[8606]: E1204 22:00:36.497974 8606 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:36.498027 master-0 kubenswrapper[8606]: I1204 22:00:36.497999 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7l9n\" (UniqueName: \"kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:00:36.498231 master-0 kubenswrapper[8606]: I1204 22:00:36.498184 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.498283 master-0 kubenswrapper[8606]: I1204 22:00:36.498167 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.498327 master-0 kubenswrapper[8606]: E1204 22:00:36.498297 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.998263861 +0000 UTC m=+1.808566236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:36.498374 master-0 kubenswrapper[8606]: E1204 22:00:36.498339 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.998327133 +0000 UTC m=+1.808629588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:36.498418 master-0 kubenswrapper[8606]: I1204 22:00:36.498386 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.498457 master-0 kubenswrapper[8606]: I1204 22:00:36.498427 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.498529 master-0 kubenswrapper[8606]: I1204 22:00:36.498460 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.498700 master-0 kubenswrapper[8606]: I1204 22:00:36.498663 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.498911 master-0 kubenswrapper[8606]: E1204 22:00:36.498686 8606 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:36.498979 master-0 kubenswrapper[8606]: E1204 22:00:36.498969 8606 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:36.998940719 +0000 UTC m=+1.809243014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:36.499039 master-0 kubenswrapper[8606]: I1204 22:00:36.498716 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.522495 master-0 kubenswrapper[8606]: I1204 22:00:36.522369 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.539750 master-0 kubenswrapper[8606]: I1204 22:00:36.539663 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wh6b\" (UniqueName: \"kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:00:36.560956 master-0 kubenswrapper[8606]: I1204 22:00:36.560884 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w592\" (UniqueName: \"kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:36.575419 master-0 kubenswrapper[8606]: I1204 22:00:36.575353 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:36.598938 master-0 kubenswrapper[8606]: I1204 22:00:36.598791 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lclkg\" (UniqueName: \"kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:00:36.599412 master-0 kubenswrapper[8606]: I1204 22:00:36.599364 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " 
pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:36.599628 master-0 kubenswrapper[8606]: I1204 22:00:36.599562 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:36.615052 master-0 kubenswrapper[8606]: I1204 22:00:36.614999 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:00:36.635392 master-0 kubenswrapper[8606]: I1204 22:00:36.635328 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scht6\" (UniqueName: \"kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:00:36.656562 master-0 kubenswrapper[8606]: I1204 22:00:36.656475 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr65l\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.658322 master-0 kubenswrapper[8606]: I1204 22:00:36.657825 8606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 22:00:36.677526 master-0 kubenswrapper[8606]: I1204 22:00:36.677465 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j8fr\" (UniqueName: \"kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:00:36.695979 master-0 kubenswrapper[8606]: I1204 22:00:36.695935 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxkk\" (UniqueName: \"kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:00:36.715877 master-0 kubenswrapper[8606]: I1204 22:00:36.715828 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgt75\" (UniqueName: \"kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:00:36.745492 master-0 kubenswrapper[8606]: I1204 22:00:36.745355 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:00:36.771116 master-0 kubenswrapper[8606]: I1204 22:00:36.771069 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfklr\" (UniqueName: \"kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:36.782214 master-0 kubenswrapper[8606]: I1204 22:00:36.781285 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r57bb\" (UniqueName: \"kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb\") pod \"csi-snapshot-controller-operator-6bc8656fdc-xhndk\" (UID: \"e37d318a-5bf8-46ed-b6de-494102738da7\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:00:36.806336 master-0 kubenswrapper[8606]: I1204 22:00:36.806246 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mttq\" (UniqueName: \"kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:00:36.824947 master-0 kubenswrapper[8606]: I1204 22:00:36.824885 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5t2f\" (UniqueName: \"kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:00:36.838451 master-0 kubenswrapper[8606]: I1204 22:00:36.838399 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cct\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:36.855398 master-0 kubenswrapper[8606]: I1204 22:00:36.855351 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgg9\" (UniqueName: \"kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:00:36.879131 master-0 kubenswrapper[8606]: I1204 22:00:36.879043 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:00:36.902035 master-0 kubenswrapper[8606]: I1204 22:00:36.901983 8606 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:36.922995 master-0 kubenswrapper[8606]: I1204 22:00:36.922919 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nkh\" (UniqueName: \"kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:36.935215 master-0 kubenswrapper[8606]: I1204 22:00:36.934875 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8d54\" (UniqueName: \"kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:00:36.968399 master-0 kubenswrapper[8606]: I1204 22:00:36.968347 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbpk\" (UniqueName: \"kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:36.972564 master-0 kubenswrapper[8606]: I1204 22:00:36.971724 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:36.995519 master-0 kubenswrapper[8606]: E1204 22:00:36.995449 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:00:37.004718 master-0 kubenswrapper[8606]: I1204 22:00:37.004664 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:37.004804 master-0 kubenswrapper[8606]: I1204 22:00:37.004760 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:37.004839 master-0 kubenswrapper[8606]: I1204 22:00:37.004820 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:37.004923 master-0 kubenswrapper[8606]: I1204 22:00:37.004890 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:37.004961 master-0 kubenswrapper[8606]: I1204 22:00:37.004938 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:37.005013 master-0 kubenswrapper[8606]: I1204 22:00:37.004987 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:37.005089 master-0 kubenswrapper[8606]: I1204 22:00:37.005054 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:37.005130 master-0 kubenswrapper[8606]: I1204 22:00:37.005110 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:37.005166 master-0 kubenswrapper[8606]: I1204 22:00:37.005151 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:37.005226 master-0 kubenswrapper[8606]: I1204 22:00:37.005197 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:37.005264 master-0 kubenswrapper[8606]: I1204 22:00:37.005243 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:37.005470 master-0 kubenswrapper[8606]: E1204 22:00:37.005431 8606 secret.go:189] Couldn't get secret 
openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:37.005590 master-0 kubenswrapper[8606]: E1204 22:00:37.005565 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.005498907 +0000 UTC m=+2.815801152 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:37.005769 master-0 kubenswrapper[8606]: E1204 22:00:37.005732 8606 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:37.005804 master-0 kubenswrapper[8606]: E1204 22:00:37.005793 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.005778334 +0000 UTC m=+2.816080579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:37.005837 master-0 kubenswrapper[8606]: E1204 22:00:37.005812 8606 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:37.005891 master-0 kubenswrapper[8606]: E1204 22:00:37.005869 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.005855696 +0000 UTC m=+2.816157911 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:37.005985 master-0 kubenswrapper[8606]: E1204 22:00:37.005951 8606 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 22:00:37.006040 master-0 kubenswrapper[8606]: E1204 22:00:37.006015 8606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 04 22:00:37.006079 master-0 kubenswrapper[8606]: E1204 22:00:37.006043 8606 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:37.006285 master-0 kubenswrapper[8606]: E1204 22:00:37.006145 8606 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:37.006285 master-0 kubenswrapper[8606]: E1204 22:00:37.006193 8606 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:37.006285 master-0 kubenswrapper[8606]: E1204 22:00:37.006067 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.006054431 +0000 UTC m=+2.816356646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : secret "metrics-daemon-secret" not found Dec 04 22:00:37.006285 master-0 kubenswrapper[8606]: E1204 22:00:37.006230 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.006214306 +0000 UTC m=+2.816516561 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 22:00:37.006285 master-0 kubenswrapper[8606]: E1204 22:00:37.006256 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.006242197 +0000 UTC m=+2.816544442 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:37.006285 master-0 kubenswrapper[8606]: E1204 22:00:37.006281 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.006267768 +0000 UTC m=+2.816570023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:37.006530 master-0 kubenswrapper[8606]: E1204 22:00:37.006304 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.006293809 +0000 UTC m=+2.816596054 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:37.006530 master-0 kubenswrapper[8606]: E1204 22:00:37.006317 8606 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:37.006530 master-0 kubenswrapper[8606]: E1204 22:00:37.006342 8606 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:37.006530 master-0 kubenswrapper[8606]: E1204 22:00:37.006378 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.00636672 +0000 UTC m=+2.816668935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:37.006530 master-0 kubenswrapper[8606]: E1204 22:00:37.006409 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.006387181 +0000 UTC m=+2.816689396 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:37.006530 master-0 kubenswrapper[8606]: E1204 22:00:37.006415 8606 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:37.006530 master-0 kubenswrapper[8606]: E1204 22:00:37.006526 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:38.006475863 +0000 UTC m=+2.816778078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:37.020212 master-0 kubenswrapper[8606]: E1204 22:00:37.020125 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:00:37.034570 master-0 kubenswrapper[8606]: E1204 22:00:37.034493 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:37.035264 master-0 kubenswrapper[8606]: I1204 22:00:37.035202 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:37.053914 master-0 kubenswrapper[8606]: E1204 22:00:37.053625 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:37.090545 master-0 kubenswrapper[8606]: W1204 22:00:37.090068 8606 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 04 22:00:37.090545 master-0 kubenswrapper[8606]: E1204 22:00:37.090160 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:00:37.101367 master-0 kubenswrapper[8606]: I1204 22:00:37.101323 8606 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 22:00:37.107575 master-0 kubenswrapper[8606]: I1204 22:00:37.107530 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:37.115824 master-0 kubenswrapper[8606]: I1204 22:00:37.115776 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ndk\" (UniqueName: \"kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:00:37.267931 master-0 kubenswrapper[8606]: I1204 22:00:37.266242 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:37.306708 master-0 kubenswrapper[8606]: I1204 22:00:37.306646 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:37.312902 master-0 kubenswrapper[8606]: I1204 22:00:37.312866 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:37.505844 master-0 kubenswrapper[8606]: I1204 22:00:37.504840 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" event={"ID":"e065179e-634a-4cbe-bb59-5b01c514e4de","Type":"ContainerStarted","Data":"aec30a53010adc6ee6176e40e860c2639cbdf974b27b2d24e1d71f75f8a5c427"} Dec 04 22:00:37.525401 master-0 kubenswrapper[8606]: I1204 22:00:37.525365 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6jkkl"] Dec 04 22:00:37.528703 master-0 kubenswrapper[8606]: I1204 22:00:37.528107 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerStarted","Data":"cb9981e4dfed9821dbae6b8b7a8e8e8f099f873bacacc6149961ccf58995e524"} Dec 04 22:00:37.534395 master-0 kubenswrapper[8606]: I1204 22:00:37.534363 8606 generic.go:334] "Generic (PLEG): container finished" podID="465637a4-42be-4a65-a859-7af699960138" containerID="8bd59644ccf9cb7c047ca7a95b61cb37f033530818fb51a36548a6089157cac2" exitCode=0 Dec 04 22:00:37.534482 master-0 kubenswrapper[8606]: I1204 22:00:37.534429 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerDied","Data":"8bd59644ccf9cb7c047ca7a95b61cb37f033530818fb51a36548a6089157cac2"} Dec 04 22:00:37.544425 master-0 kubenswrapper[8606]: I1204 22:00:37.543676 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerStarted","Data":"9820de7c24faf6bdc5aac51f81548f854bf3fa05b1f8fd46fe8346195ddc8ca4"} Dec 04 22:00:37.546789 
master-0 kubenswrapper[8606]: I1204 22:00:37.546750 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerStarted","Data":"c77537fc4f2900520f8e93c8fc7a9508c178081936170d16a0dcd4122f2c7777"} Dec 04 22:00:37.552321 master-0 kubenswrapper[8606]: I1204 22:00:37.552072 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerStarted","Data":"466a053aebc195d2f55d104f73cf9c35f09469c457c1576c051e6861f31f8a13"} Dec 04 22:00:37.565148 master-0 kubenswrapper[8606]: I1204 22:00:37.558176 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerStarted","Data":"f701b6e27b366f9b3e2d799e563c87e892e7b625684a50d11abda6232179d479"} Dec 04 22:00:37.565148 master-0 kubenswrapper[8606]: I1204 22:00:37.562852 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" event={"ID":"e37d318a-5bf8-46ed-b6de-494102738da7","Type":"ContainerStarted","Data":"79e1b635bb095edbd66388094922c6134f767f1a2efc7b3eca6e45abd8f571c6"} Dec 04 22:00:37.565148 master-0 kubenswrapper[8606]: I1204 22:00:37.564918 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerStarted","Data":"49e5b6467d42b24a4142a36b3091700faf9ab3af4e0dd62b2e3ca1fd3da47a30"} Dec 04 22:00:37.780197 master-0 kubenswrapper[8606]: I1204 22:00:37.779777 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:37.787574 master-0 kubenswrapper[8606]: I1204 22:00:37.787537 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:38.037345 master-0 kubenswrapper[8606]: I1204 22:00:38.037266 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:38.037345 master-0 kubenswrapper[8606]: I1204 22:00:38.037335 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:38.037644 master-0 kubenswrapper[8606]: I1204 22:00:38.037539 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " 
pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:38.037644 master-0 kubenswrapper[8606]: E1204 22:00:38.037546 8606 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:38.037644 master-0 kubenswrapper[8606]: I1204 22:00:38.037587 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:38.037644 master-0 kubenswrapper[8606]: I1204 22:00:38.037626 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:38.037768 master-0 kubenswrapper[8606]: E1204 22:00:38.037635 8606 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:38.037768 master-0 kubenswrapper[8606]: E1204 22:00:38.037655 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.037625418 +0000 UTC m=+4.847927633 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:38.037840 master-0 kubenswrapper[8606]: I1204 22:00:38.037765 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:38.037840 master-0 kubenswrapper[8606]: I1204 22:00:38.037822 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:38.037901 master-0 kubenswrapper[8606]: E1204 22:00:38.037848 8606 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:38.037901 master-0 kubenswrapper[8606]: E1204 22:00:38.037849 8606 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 22:00:38.037901 master-0 kubenswrapper[8606]: E1204 22:00:38.037900 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.037871574 +0000 UTC m=+4.848173789 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:38.038002 master-0 kubenswrapper[8606]: E1204 22:00:38.037947 8606 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:38.038002 master-0 kubenswrapper[8606]: I1204 22:00:38.037868 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:38.038002 master-0 kubenswrapper[8606]: E1204 22:00:38.037983 8606 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:38.038002 master-0 kubenswrapper[8606]: E1204 22:00:38.037950 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. 
No retries permitted until 2025-12-04 22:00:40.037924596 +0000 UTC m=+4.848226811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 22:00:38.038117 master-0 kubenswrapper[8606]: E1204 22:00:38.038011 8606 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:38.038117 master-0 kubenswrapper[8606]: E1204 22:00:38.038019 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.038009498 +0000 UTC m=+4.848311713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:38.038117 master-0 kubenswrapper[8606]: E1204 22:00:38.038035 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.038028299 +0000 UTC m=+4.848330504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:38.038117 master-0 kubenswrapper[8606]: E1204 22:00:38.038053 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.038046809 +0000 UTC m=+4.848349024 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:38.038117 master-0 kubenswrapper[8606]: E1204 22:00:38.038067 8606 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:38.038117 master-0 kubenswrapper[8606]: E1204 22:00:38.038096 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.03808886 +0000 UTC m=+4.848391075 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:38.038117 master-0 kubenswrapper[8606]: I1204 22:00:38.038093 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:38.038117 master-0 kubenswrapper[8606]: E1204 22:00:38.038118 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.038104191 +0000 UTC m=+4.848406406 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:38.038338 master-0 kubenswrapper[8606]: E1204 22:00:38.038137 8606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 04 22:00:38.038338 master-0 kubenswrapper[8606]: E1204 22:00:38.038187 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.038165332 +0000 UTC m=+4.848467767 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : secret "metrics-daemon-secret" not found Dec 04 22:00:38.038338 master-0 kubenswrapper[8606]: I1204 22:00:38.038212 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:38.038338 master-0 kubenswrapper[8606]: I1204 22:00:38.038244 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:38.038338 master-0 kubenswrapper[8606]: E1204 22:00:38.038300 8606 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:38.038338 master-0 kubenswrapper[8606]: E1204 22:00:38.038323 8606 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:38.038338 master-0 kubenswrapper[8606]: E1204 22:00:38.038332 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.038321296 +0000 UTC m=+4.848623501 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:38.038615 master-0 kubenswrapper[8606]: E1204 22:00:38.038354 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.038345037 +0000 UTC m=+4.848647492 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:38.097700 master-0 kubenswrapper[8606]: I1204 22:00:38.097640 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5"] Dec 04 22:00:38.098002 master-0 kubenswrapper[8606]: E1204 22:00:38.097835 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c492425-adf8-424f-ac19-f465071857f9" containerName="prober" Dec 04 22:00:38.098002 master-0 kubenswrapper[8606]: I1204 22:00:38.097852 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c492425-adf8-424f-ac19-f465071857f9" containerName="prober" Dec 04 22:00:38.098002 master-0 kubenswrapper[8606]: E1204 22:00:38.097865 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 22:00:38.098002 master-0 kubenswrapper[8606]: I1204 22:00:38.097874 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 22:00:38.098002 master-0 kubenswrapper[8606]: I1204 22:00:38.097945 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 22:00:38.098002 master-0 kubenswrapper[8606]: I1204 22:00:38.097964 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c492425-adf8-424f-ac19-f465071857f9" containerName="prober" Dec 04 22:00:38.098487 master-0 kubenswrapper[8606]: I1204 22:00:38.098436 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" Dec 04 22:00:38.112621 master-0 kubenswrapper[8606]: I1204 22:00:38.112574 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5"] Dec 04 22:00:38.123167 master-0 kubenswrapper[8606]: I1204 22:00:38.123111 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 22:00:38.145533 master-0 kubenswrapper[8606]: I1204 22:00:38.144758 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 22:00:38.242816 master-0 kubenswrapper[8606]: I1204 22:00:38.242558 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4cdh\" (UniqueName: \"kubernetes.io/projected/0173b8a7-07b4-407a-80b6-d86754072fd8-kube-api-access-z4cdh\") pod \"migrator-74b7b57c65-nzpb5\" (UID: \"0173b8a7-07b4-407a-80b6-d86754072fd8\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" Dec 04 22:00:38.343638 master-0 kubenswrapper[8606]: I1204 22:00:38.343383 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4cdh\" (UniqueName: \"kubernetes.io/projected/0173b8a7-07b4-407a-80b6-d86754072fd8-kube-api-access-z4cdh\") pod \"migrator-74b7b57c65-nzpb5\" (UID: \"0173b8a7-07b4-407a-80b6-d86754072fd8\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" Dec 04 22:00:38.405570 master-0 kubenswrapper[8606]: I1204 22:00:38.405485 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4cdh\" (UniqueName: \"kubernetes.io/projected/0173b8a7-07b4-407a-80b6-d86754072fd8-kube-api-access-z4cdh\") pod \"migrator-74b7b57c65-nzpb5\" (UID: \"0173b8a7-07b4-407a-80b6-d86754072fd8\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" Dec 04 22:00:38.417557 master-0 kubenswrapper[8606]: I1204 22:00:38.417235 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" Dec 04 22:00:38.603613 master-0 kubenswrapper[8606]: I1204 22:00:38.602979 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6jkkl" event={"ID":"510a595a-21bf-48fc-85cd-707bc8f5536f","Type":"ContainerStarted","Data":"07470ecb67c001251340fae3151b0ef12e1a2a108ad2fba4324431951a35b097"} Dec 04 22:00:38.603613 master-0 kubenswrapper[8606]: I1204 22:00:38.603374 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6jkkl" event={"ID":"510a595a-21bf-48fc-85cd-707bc8f5536f","Type":"ContainerStarted","Data":"4fc051e954a566d97cf4dcb3626713517bc5479301f571be1eec860a1f2d884c"} Dec 04 22:00:38.608522 master-0 kubenswrapper[8606]: I1204 22:00:38.606344 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerStarted","Data":"2da555718ea10aaf4197144683ccb4702237b92306aae894f469e5c551742616"} Dec 04 22:00:38.608522 master-0 kubenswrapper[8606]: I1204 22:00:38.607869 8606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:38.608522 master-0 kubenswrapper[8606]: I1204 22:00:38.607882 8606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:38.608522 master-0 kubenswrapper[8606]: I1204 22:00:38.608419 8606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:38.676519 master-0 kubenswrapper[8606]: I1204 22:00:38.674604 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc"] Dec 04 22:00:38.676519 master-0 kubenswrapper[8606]: I1204 22:00:38.675193 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" Dec 04 22:00:38.683077 master-0 kubenswrapper[8606]: I1204 22:00:38.680113 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc"] Dec 04 22:00:38.752543 master-0 kubenswrapper[8606]: I1204 22:00:38.752107 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6d8\" (UniqueName: \"kubernetes.io/projected/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026-kube-api-access-vd6d8\") pod \"csi-snapshot-controller-6b958b6f94-w7hnc\" (UID: \"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" Dec 04 22:00:38.814620 master-0 kubenswrapper[8606]: I1204 22:00:38.803107 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5"] Dec 04 22:00:38.853233 master-0 kubenswrapper[8606]: I1204 22:00:38.853159 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6d8\" (UniqueName: \"kubernetes.io/projected/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026-kube-api-access-vd6d8\") pod \"csi-snapshot-controller-6b958b6f94-w7hnc\" (UID: \"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" Dec 04 22:00:38.882652 master-0 kubenswrapper[8606]: I1204 22:00:38.879323 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6d8\" (UniqueName: \"kubernetes.io/projected/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026-kube-api-access-vd6d8\") pod \"csi-snapshot-controller-6b958b6f94-w7hnc\" (UID: \"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" Dec 04 22:00:38.989868 master-0 kubenswrapper[8606]: I1204 22:00:38.989786 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:38.994216 master-0 kubenswrapper[8606]: I1204 22:00:38.994177 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:00:39.074010 master-0 kubenswrapper[8606]: I1204 22:00:39.073923 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" Dec 04 22:00:39.282227 master-0 kubenswrapper[8606]: I1204 22:00:39.282093 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc"] Dec 04 22:00:39.293751 master-0 kubenswrapper[8606]: W1204 22:00:39.293675 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f22eee4_a42d_4d2b_bffa_6c3f29f1f026.slice/crio-df11c4a8f3347747aecb87e080a5126c781276285c92708ad28d5159ae4229dc WatchSource:0}: Error finding container df11c4a8f3347747aecb87e080a5126c781276285c92708ad28d5159ae4229dc: Status 404 returned error can't find the container with id df11c4a8f3347747aecb87e080a5126c781276285c92708ad28d5159ae4229dc Dec 04 22:00:39.613111 master-0 kubenswrapper[8606]: I1204 22:00:39.612563 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"df11c4a8f3347747aecb87e080a5126c781276285c92708ad28d5159ae4229dc"} Dec 04 22:00:39.615245 master-0 kubenswrapper[8606]: I1204 22:00:39.615203 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" event={"ID":"0173b8a7-07b4-407a-80b6-d86754072fd8","Type":"ContainerStarted","Data":"99da9d5b3d27d57501f5191969d7c3ca653c3d4bf3252f476bdc359e5ff9e271"} Dec 04 22:00:39.617645 master-0 kubenswrapper[8606]: I1204 22:00:39.617610 8606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:39.811721 master-0 kubenswrapper[8606]: I1204 22:00:39.811643 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6"] Dec 04 22:00:39.812338 master-0 kubenswrapper[8606]: I1204 22:00:39.812301 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.814614 master-0 kubenswrapper[8606]: I1204 22:00:39.814572 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 22:00:39.814690 master-0 kubenswrapper[8606]: I1204 22:00:39.814629 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 22:00:39.814729 master-0 kubenswrapper[8606]: I1204 22:00:39.814572 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 22:00:39.815547 master-0 kubenswrapper[8606]: I1204 22:00:39.815493 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 22:00:39.816265 master-0 kubenswrapper[8606]: I1204 22:00:39.816214 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 22:00:39.816396 master-0 kubenswrapper[8606]: I1204 22:00:39.816363 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 22:00:39.823350 master-0 kubenswrapper[8606]: I1204 22:00:39.823273 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6"] Dec 04 22:00:39.868593 master-0 kubenswrapper[8606]: I1204 22:00:39.868265 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.868593 master-0 kubenswrapper[8606]: I1204 22:00:39.868442 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.868593 master-0 kubenswrapper[8606]: I1204 22:00:39.868533 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66b7z\" (UniqueName: \"kubernetes.io/projected/b1101917-a902-451c-b099-2b03ab4b558d-kube-api-access-66b7z\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.868593 master-0 kubenswrapper[8606]: I1204 22:00:39.868585 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.868593 master-0 kubenswrapper[8606]: I1204 22:00:39.868604 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert\") pod 
\"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.969739 master-0 kubenswrapper[8606]: I1204 22:00:39.969637 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.969739 master-0 kubenswrapper[8606]: I1204 22:00:39.969705 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66b7z\" (UniqueName: \"kubernetes.io/projected/b1101917-a902-451c-b099-2b03ab4b558d-kube-api-access-66b7z\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.970043 master-0 kubenswrapper[8606]: I1204 22:00:39.969757 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.970043 master-0 kubenswrapper[8606]: I1204 22:00:39.969776 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.970043 master-0 kubenswrapper[8606]: I1204 22:00:39.969823 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:39.970278 master-0 kubenswrapper[8606]: E1204 22:00:39.970253 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Dec 04 22:00:39.970585 master-0 kubenswrapper[8606]: E1204 22:00:39.970561 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.47053272 +0000 UTC m=+5.280834935 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : configmap "config" not found Dec 04 22:00:39.970682 master-0 kubenswrapper[8606]: E1204 22:00:39.970663 8606 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:39.970719 master-0 kubenswrapper[8606]: E1204 22:00:39.970696 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.470689014 +0000 UTC m=+5.280991229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : secret "serving-cert" not found Dec 04 22:00:39.970755 master-0 kubenswrapper[8606]: E1204 22:00:39.970726 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Dec 04 22:00:39.970755 master-0 kubenswrapper[8606]: E1204 22:00:39.970746 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.470740275 +0000 UTC m=+5.281042490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : configmap "openshift-global-ca" not found Dec 04 22:00:39.970809 master-0 kubenswrapper[8606]: E1204 22:00:39.970769 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:39.970809 master-0 kubenswrapper[8606]: E1204 22:00:39.970790 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:40.470784926 +0000 UTC m=+5.281087141 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : configmap "client-ca" not found Dec 04 22:00:40.070778 master-0 kubenswrapper[8606]: I1204 22:00:40.070699 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:40.070778 master-0 kubenswrapper[8606]: I1204 22:00:40.070775 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:40.071079 master-0 kubenswrapper[8606]: I1204 22:00:40.070812 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:40.071079 master-0 kubenswrapper[8606]: I1204 22:00:40.070847 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:40.071079 master-0 kubenswrapper[8606]: I1204 22:00:40.070874 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:40.071079 master-0 kubenswrapper[8606]: I1204 22:00:40.070914 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:40.071079 master-0 kubenswrapper[8606]: I1204 22:00:40.070952 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:40.071079 master-0 kubenswrapper[8606]: I1204 22:00:40.070978 8606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:40.071079 master-0 kubenswrapper[8606]: I1204 22:00:40.071003 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:40.071079 master-0 kubenswrapper[8606]: I1204 22:00:40.071068 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:40.071363 master-0 kubenswrapper[8606]: I1204 22:00:40.071091 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:40.071363 master-0 kubenswrapper[8606]: E1204 22:00:40.071249 8606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 04 22:00:40.071363 master-0 kubenswrapper[8606]: E1204 22:00:40.071332 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.071308037 +0000 UTC m=+8.881610252 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : secret "metrics-daemon-secret" not found Dec 04 22:00:40.071793 master-0 kubenswrapper[8606]: E1204 22:00:40.071754 8606 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:40.071793 master-0 kubenswrapper[8606]: E1204 22:00:40.071785 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.071776919 +0000 UTC m=+8.882079134 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:40.071922 master-0 kubenswrapper[8606]: E1204 22:00:40.071822 8606 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:40.071922 master-0 kubenswrapper[8606]: E1204 22:00:40.071840 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.071833391 +0000 UTC m=+8.882135606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:40.071922 master-0 kubenswrapper[8606]: E1204 22:00:40.071871 8606 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:40.071922 master-0 kubenswrapper[8606]: E1204 22:00:40.071889 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.071883022 +0000 UTC m=+8.882185237 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:40.071922 master-0 kubenswrapper[8606]: E1204 22:00:40.071921 8606 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 04 22:00:40.072105 master-0 kubenswrapper[8606]: E1204 22:00:40.071938 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.071932814 +0000 UTC m=+8.882235029 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "node-tuning-operator-tls" not found Dec 04 22:00:40.072105 master-0 kubenswrapper[8606]: E1204 22:00:40.071972 8606 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:40.072105 master-0 kubenswrapper[8606]: E1204 22:00:40.071988 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.071982905 +0000 UTC m=+8.882285120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:40.072105 master-0 kubenswrapper[8606]: E1204 22:00:40.072020 8606 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 04 22:00:40.072105 master-0 kubenswrapper[8606]: E1204 22:00:40.072038 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert podName:74b7c644-ad97-4009-aac7-550edabc55ae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.072032076 +0000 UTC m=+8.882334291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2smgj" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae") : secret "cluster-version-operator-serving-cert" not found Dec 04 22:00:40.072105 master-0 kubenswrapper[8606]: E1204 22:00:40.072075 8606 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:40.072105 master-0 kubenswrapper[8606]: E1204 22:00:40.072096 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.072090218 +0000 UTC m=+8.882392433 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:40.072373 master-0 kubenswrapper[8606]: E1204 22:00:40.072128 8606 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:40.072373 master-0 kubenswrapper[8606]: E1204 22:00:40.072144 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. 
No retries permitted until 2025-12-04 22:00:44.072138539 +0000 UTC m=+8.882440754 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:40.072373 master-0 kubenswrapper[8606]: E1204 22:00:40.072177 8606 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:40.072373 master-0 kubenswrapper[8606]: E1204 22:00:40.072196 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.07218945 +0000 UTC m=+8.882491665 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:40.072373 master-0 kubenswrapper[8606]: E1204 22:00:40.072227 8606 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:40.072373 master-0 kubenswrapper[8606]: E1204 22:00:40.072245 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert podName:0beb871c-3bf1-471c-a028-746a650267bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.072237692 +0000 UTC m=+8.882539907 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-4dv2b" (UID: "0beb871c-3bf1-471c-a028-746a650267bf") : secret "performance-addon-operator-webhook-cert" not found Dec 04 22:00:40.078722 master-0 kubenswrapper[8606]: I1204 22:00:40.078676 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-77c99c46b8-fpnwr"] Dec 04 22:00:40.079259 master-0 kubenswrapper[8606]: I1204 22:00:40.079236 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.083689 master-0 kubenswrapper[8606]: I1204 22:00:40.083066 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 22:00:40.085619 master-0 kubenswrapper[8606]: I1204 22:00:40.085493 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-77c99c46b8-fpnwr"] Dec 04 22:00:40.087274 master-0 kubenswrapper[8606]: I1204 22:00:40.087212 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 22:00:40.087368 master-0 kubenswrapper[8606]: I1204 22:00:40.087325 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 22:00:40.087634 master-0 kubenswrapper[8606]: I1204 22:00:40.087557 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 22:00:40.172239 master-0 kubenswrapper[8606]: I1204 22:00:40.172078 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vjb\" (UniqueName: \"kubernetes.io/projected/4d68dcb1-efe4-425f-9b28-1e5575548a32-kube-api-access-r6vjb\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.172239 master-0 kubenswrapper[8606]: I1204 22:00:40.172185 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-key\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.172496 master-0 kubenswrapper[8606]: I1204 22:00:40.172437 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-cabundle\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.186218 master-0 kubenswrapper[8606]: I1204 22:00:40.186143 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66b7z\" (UniqueName: \"kubernetes.io/projected/b1101917-a902-451c-b099-2b03ab4b558d-kube-api-access-66b7z\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:40.273325 master-0 kubenswrapper[8606]: I1204 22:00:40.273245 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-key\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.273325 master-0 kubenswrapper[8606]: I1204 22:00:40.273324 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-cabundle\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " 
pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.273797 master-0 kubenswrapper[8606]: I1204 22:00:40.273761 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vjb\" (UniqueName: \"kubernetes.io/projected/4d68dcb1-efe4-425f-9b28-1e5575548a32-kube-api-access-r6vjb\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.274459 master-0 kubenswrapper[8606]: I1204 22:00:40.274420 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-cabundle\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.276409 master-0 kubenswrapper[8606]: I1204 22:00:40.276377 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-key\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.291828 master-0 kubenswrapper[8606]: I1204 22:00:40.291728 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vjb\" (UniqueName: \"kubernetes.io/projected/4d68dcb1-efe4-425f-9b28-1e5575548a32-kube-api-access-r6vjb\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.403177 master-0 kubenswrapper[8606]: I1204 22:00:40.403091 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:00:40.476296 master-0 kubenswrapper[8606]: I1204 22:00:40.476081 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:40.476648 master-0 kubenswrapper[8606]: E1204 22:00:40.476322 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Dec 04 22:00:40.476648 master-0 kubenswrapper[8606]: I1204 22:00:40.476487 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:40.476648 master-0 kubenswrapper[8606]: E1204 22:00:40.476574 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:41.476493831 +0000 UTC m=+6.286796046 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : configmap "config" not found Dec 04 22:00:40.478458 master-0 kubenswrapper[8606]: E1204 22:00:40.476713 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:40.478458 master-0 kubenswrapper[8606]: I1204 22:00:40.476806 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:40.478458 master-0 kubenswrapper[8606]: E1204 22:00:40.476832 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:41.47680902 +0000 UTC m=+6.287111235 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : configmap "client-ca" not found Dec 04 22:00:40.478458 master-0 kubenswrapper[8606]: E1204 22:00:40.476892 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Dec 04 22:00:40.478458 master-0 kubenswrapper[8606]: I1204 22:00:40.476927 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:40.478458 master-0 kubenswrapper[8606]: E1204 22:00:40.476949 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:41.476923202 +0000 UTC m=+6.287225428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : configmap "openshift-global-ca" not found Dec 04 22:00:40.478458 master-0 kubenswrapper[8606]: E1204 22:00:40.477290 8606 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:40.478458 master-0 kubenswrapper[8606]: E1204 22:00:40.477346 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:41.477338484 +0000 UTC m=+6.287640699 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : secret "serving-cert" not found Dec 04 22:00:41.102390 master-0 kubenswrapper[8606]: I1204 22:00:41.102345 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6"] Dec 04 22:00:41.105464 master-0 kubenswrapper[8606]: E1204 22:00:41.105370 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" podUID="b1101917-a902-451c-b099-2b03ab4b558d" Dec 04 22:00:41.129668 master-0 kubenswrapper[8606]: I1204 22:00:41.127810 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-77c99c46b8-fpnwr"] Dec 04 22:00:41.146589 master-0 kubenswrapper[8606]: I1204 22:00:41.144585 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl"] Dec 04 22:00:41.149989 master-0 kubenswrapper[8606]: I1204 22:00:41.149706 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.154972 master-0 kubenswrapper[8606]: I1204 22:00:41.154481 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 22:00:41.154972 master-0 kubenswrapper[8606]: I1204 22:00:41.154789 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 22:00:41.155055 master-0 kubenswrapper[8606]: I1204 22:00:41.155009 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 22:00:41.155306 master-0 kubenswrapper[8606]: I1204 22:00:41.155281 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 22:00:41.155477 master-0 kubenswrapper[8606]: I1204 22:00:41.155455 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 22:00:41.156258 master-0 kubenswrapper[8606]: I1204 22:00:41.156215 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl"] Dec 04 22:00:41.190316 master-0 kubenswrapper[8606]: I1204 22:00:41.190253 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.190316 master-0 kubenswrapper[8606]: I1204 22:00:41.190319 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " 
pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.190448 master-0 kubenswrapper[8606]: I1204 22:00:41.190342 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-config\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.190448 master-0 kubenswrapper[8606]: I1204 22:00:41.190382 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8lls\" (UniqueName: \"kubernetes.io/projected/de7f9a47-eab7-49ef-8479-2ee6953a4de9-kube-api-access-b8lls\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.291734 master-0 kubenswrapper[8606]: I1204 22:00:41.291648 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.291734 master-0 kubenswrapper[8606]: I1204 22:00:41.291730 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.292036 master-0 kubenswrapper[8606]: I1204 22:00:41.291772 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-config\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.292036 master-0 kubenswrapper[8606]: E1204 22:00:41.291961 8606 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:41.292036 master-0 kubenswrapper[8606]: E1204 22:00:41.291995 8606 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:41.292128 master-0 kubenswrapper[8606]: I1204 22:00:41.292048 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8lls\" (UniqueName: \"kubernetes.io/projected/de7f9a47-eab7-49ef-8479-2ee6953a4de9-kube-api-access-b8lls\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.292128 master-0 kubenswrapper[8606]: E1204 22:00:41.292081 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. 
No retries permitted until 2025-12-04 22:00:41.792049268 +0000 UTC m=+6.602351683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : configmap "client-ca" not found Dec 04 22:00:41.292196 master-0 kubenswrapper[8606]: E1204 22:00:41.292176 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:41.79214083 +0000 UTC m=+6.602443045 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : secret "serving-cert" not found Dec 04 22:00:41.293686 master-0 kubenswrapper[8606]: I1204 22:00:41.293636 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-config\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.312215 master-0 kubenswrapper[8606]: I1204 22:00:41.312168 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8lls\" (UniqueName: \"kubernetes.io/projected/de7f9a47-eab7-49ef-8479-2ee6953a4de9-kube-api-access-b8lls\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.501398 master-0 kubenswrapper[8606]: I1204 22:00:41.501320 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:41.501398 master-0 kubenswrapper[8606]: I1204 22:00:41.501398 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:41.501718 master-0 kubenswrapper[8606]: I1204 22:00:41.501432 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:41.501718 master-0 kubenswrapper[8606]: I1204 22:00:41.501552 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " 
pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:41.501718 master-0 kubenswrapper[8606]: E1204 22:00:41.501667 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:41.501823 master-0 kubenswrapper[8606]: E1204 22:00:41.501732 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:43.501711578 +0000 UTC m=+8.312013793 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : configmap "client-ca" not found Dec 04 22:00:41.502372 master-0 kubenswrapper[8606]: E1204 22:00:41.502343 8606 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:41.502419 master-0 kubenswrapper[8606]: E1204 22:00:41.502388 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert podName:b1101917-a902-451c-b099-2b03ab4b558d nodeName:}" failed. No retries permitted until 2025-12-04 22:00:43.502376527 +0000 UTC m=+8.312678742 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert") pod "controller-manager-77f4fc6d5d-5g4n6" (UID: "b1101917-a902-451c-b099-2b03ab4b558d") : secret "serving-cert" not found Dec 04 22:00:41.502839 master-0 kubenswrapper[8606]: I1204 22:00:41.502751 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:41.503085 master-0 kubenswrapper[8606]: I1204 22:00:41.503058 8606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:41.503357 master-0 kubenswrapper[8606]: I1204 22:00:41.503316 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:41.504268 master-0 kubenswrapper[8606]: I1204 22:00:41.504236 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles\") pod \"controller-manager-77f4fc6d5d-5g4n6\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:41.509806 master-0 kubenswrapper[8606]: I1204 22:00:41.509760 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:41.628234 master-0 kubenswrapper[8606]: I1204 22:00:41.628087 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" event={"ID":"4d68dcb1-efe4-425f-9b28-1e5575548a32","Type":"ContainerStarted","Data":"2f8f422694aa4bc57d4ecc64211f7f799287be11acc20da48bbd2da04f761575"} Dec 04 22:00:41.628234 master-0 kubenswrapper[8606]: 
I1204 22:00:41.628158 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" event={"ID":"4d68dcb1-efe4-425f-9b28-1e5575548a32","Type":"ContainerStarted","Data":"caf36cfc8384a756669c5effc9f040f914b8e0fafbb77841a2ef74350bfc51bf"} Dec 04 22:00:41.632146 master-0 kubenswrapper[8606]: I1204 22:00:41.632079 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" event={"ID":"0173b8a7-07b4-407a-80b6-d86754072fd8","Type":"ContainerStarted","Data":"fef3967d923683ebb718e4e4b14da4f44280d20b91b49309470c2a559a417975"} Dec 04 22:00:41.632146 master-0 kubenswrapper[8606]: I1204 22:00:41.632126 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" event={"ID":"0173b8a7-07b4-407a-80b6-d86754072fd8","Type":"ContainerStarted","Data":"fa56f8a9b20a66ea35b68dc55286c46ebdc98bcc65664051a0ce154f588cd501"} Dec 04 22:00:41.637231 master-0 kubenswrapper[8606]: I1204 22:00:41.637151 8606 generic.go:334] "Generic (PLEG): container finished" podID="465637a4-42be-4a65-a859-7af699960138" containerID="f11190eeabf32ca439cc6dbf2e5f945ac6892b6b5bf3d933639699117a6a4cbd" exitCode=0 Dec 04 22:00:41.637769 master-0 kubenswrapper[8606]: I1204 22:00:41.637700 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerDied","Data":"f11190eeabf32ca439cc6dbf2e5f945ac6892b6b5bf3d933639699117a6a4cbd"} Dec 04 22:00:41.637856 master-0 kubenswrapper[8606]: I1204 22:00:41.637815 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:41.645839 master-0 kubenswrapper[8606]: I1204 22:00:41.645744 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" podStartSLOduration=1.645722505 podStartE2EDuration="1.645722505s" podCreationTimestamp="2025-12-04 22:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:00:41.645359685 +0000 UTC m=+6.455661930" watchObservedRunningTime="2025-12-04 22:00:41.645722505 +0000 UTC m=+6.456024730" Dec 04 22:00:41.655537 master-0 kubenswrapper[8606]: I1204 22:00:41.655459 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:41.691460 master-0 kubenswrapper[8606]: I1204 22:00:41.691322 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" podStartSLOduration=1.68223711 podStartE2EDuration="3.691256554s" podCreationTimestamp="2025-12-04 22:00:38 +0000 UTC" firstStartedPulling="2025-12-04 22:00:38.818916015 +0000 UTC m=+3.629218230" lastFinishedPulling="2025-12-04 22:00:40.827935459 +0000 UTC m=+5.638237674" observedRunningTime="2025-12-04 22:00:41.689945259 +0000 UTC m=+6.500247514" watchObservedRunningTime="2025-12-04 22:00:41.691256554 +0000 UTC m=+6.501558799" Dec 04 22:00:41.703689 master-0 kubenswrapper[8606]: I1204 22:00:41.703638 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66b7z\" (UniqueName: \"kubernetes.io/projected/b1101917-a902-451c-b099-2b03ab4b558d-kube-api-access-66b7z\") pod \"b1101917-a902-451c-b099-2b03ab4b558d\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " Dec 04 22:00:41.703925 master-0 kubenswrapper[8606]: I1204 22:00:41.703736 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles\") pod \"b1101917-a902-451c-b099-2b03ab4b558d\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " Dec 04 22:00:41.703925 master-0 kubenswrapper[8606]: I1204 22:00:41.703767 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config\") pod \"b1101917-a902-451c-b099-2b03ab4b558d\" (UID: \"b1101917-a902-451c-b099-2b03ab4b558d\") " Dec 04 22:00:41.704332 master-0 kubenswrapper[8606]: I1204 22:00:41.704236 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config" (OuterVolumeSpecName: "config") pod "b1101917-a902-451c-b099-2b03ab4b558d" (UID: "b1101917-a902-451c-b099-2b03ab4b558d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:41.704609 master-0 kubenswrapper[8606]: I1204 22:00:41.704542 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:41.704901 master-0 kubenswrapper[8606]: I1204 22:00:41.704878 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b1101917-a902-451c-b099-2b03ab4b558d" (UID: "b1101917-a902-451c-b099-2b03ab4b558d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:41.710860 master-0 kubenswrapper[8606]: I1204 22:00:41.710775 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1101917-a902-451c-b099-2b03ab4b558d-kube-api-access-66b7z" (OuterVolumeSpecName: "kube-api-access-66b7z") pod "b1101917-a902-451c-b099-2b03ab4b558d" (UID: "b1101917-a902-451c-b099-2b03ab4b558d"). InnerVolumeSpecName "kube-api-access-66b7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:00:41.805946 master-0 kubenswrapper[8606]: I1204 22:00:41.805864 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.805946 master-0 kubenswrapper[8606]: I1204 22:00:41.805934 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:41.806298 master-0 kubenswrapper[8606]: I1204 22:00:41.806009 8606 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:41.806298 master-0 kubenswrapper[8606]: I1204 22:00:41.806027 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66b7z\" (UniqueName: \"kubernetes.io/projected/b1101917-a902-451c-b099-2b03ab4b558d-kube-api-access-66b7z\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:41.806298 master-0 kubenswrapper[8606]: E1204 22:00:41.806261 8606 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:41.806470 master-0 kubenswrapper[8606]: E1204 22:00:41.806294 8606 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:41.806588 master-0 kubenswrapper[8606]: E1204 22:00:41.806433 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.806393003 +0000 UTC m=+7.616695248 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : secret "serving-cert" not found Dec 04 22:00:41.806658 master-0 kubenswrapper[8606]: E1204 22:00:41.806603 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:42.806581188 +0000 UTC m=+7.616883393 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : configmap "client-ca" not found Dec 04 22:00:42.651520 master-0 kubenswrapper[8606]: I1204 22:00:42.649311 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6" Dec 04 22:00:42.651520 master-0 kubenswrapper[8606]: I1204 22:00:42.649489 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c747h" event={"ID":"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c","Type":"ContainerStarted","Data":"445f62d39aa04dcf1c8ebad8cd7e2899244dd127c8c97b181ddff4af36c8b535"} Dec 04 22:00:42.748525 master-0 kubenswrapper[8606]: I1204 22:00:42.744780 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6"] Dec 04 22:00:42.755025 master-0 kubenswrapper[8606]: I1204 22:00:42.754988 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-5g4n6"] Dec 04 22:00:42.822529 master-0 kubenswrapper[8606]: I1204 22:00:42.819608 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:42.822529 master-0 kubenswrapper[8606]: I1204 22:00:42.819667 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:42.822529 master-0 kubenswrapper[8606]: I1204 22:00:42.819737 8606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1101917-a902-451c-b099-2b03ab4b558d-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:42.822529 master-0 kubenswrapper[8606]: I1204 22:00:42.819749 8606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b1101917-a902-451c-b099-2b03ab4b558d-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:42.822529 master-0 kubenswrapper[8606]: E1204 22:00:42.819835 8606 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:42.822529 master-0 kubenswrapper[8606]: E1204 22:00:42.819893 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.819873649 +0000 UTC m=+9.630175864 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : configmap "client-ca" not found Dec 04 22:00:42.822529 master-0 kubenswrapper[8606]: E1204 22:00:42.820128 8606 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:42.822529 master-0 kubenswrapper[8606]: E1204 22:00:42.820151 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.820144876 +0000 UTC m=+9.630447081 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : secret "serving-cert" not found Dec 04 22:00:42.921967 master-0 kubenswrapper[8606]: I1204 22:00:42.921609 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:42.927868 master-0 kubenswrapper[8606]: I1204 22:00:42.927849 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:43.404929 master-0 kubenswrapper[8606]: I1204 22:00:43.404824 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1101917-a902-451c-b099-2b03ab4b558d" path="/var/lib/kubelet/pods/b1101917-a902-451c-b099-2b03ab4b558d/volumes" Dec 04 22:00:43.658077 master-0 kubenswrapper[8606]: I1204 22:00:43.657898 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb"} Dec 04 22:00:43.676824 master-0 kubenswrapper[8606]: I1204 22:00:43.675497 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podStartSLOduration=2.485997003 podStartE2EDuration="5.675470289s" podCreationTimestamp="2025-12-04 22:00:38 +0000 UTC" firstStartedPulling="2025-12-04 22:00:39.296954514 +0000 UTC m=+4.107256739" lastFinishedPulling="2025-12-04 22:00:42.48642781 +0000 UTC m=+7.296730025" observedRunningTime="2025-12-04 22:00:43.675427298 +0000 UTC m=+8.485729543" watchObservedRunningTime="2025-12-04 22:00:43.675470289 +0000 UTC m=+8.485772514" Dec 04 22:00:44.137892 master-0 kubenswrapper[8606]: I1204 22:00:44.137685 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:44.137892 master-0 kubenswrapper[8606]: I1204 22:00:44.137756 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:44.137892 master-0 kubenswrapper[8606]: I1204 22:00:44.137782 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:44.137892 master-0 kubenswrapper[8606]: I1204 22:00:44.137810 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:44.138483 master-0 kubenswrapper[8606]: I1204 22:00:44.138034 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:44.138483 master-0 kubenswrapper[8606]: E1204 22:00:44.138111 8606 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 04 22:00:44.138483 master-0 kubenswrapper[8606]: E1204 22:00:44.138311 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs podName:5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.138283124 +0000 UTC m=+16.948585349 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs") pod "multus-admission-controller-7dfc5b745f-nk4gb" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf") : secret "multus-admission-controller-secret" not found Dec 04 22:00:44.138483 master-0 kubenswrapper[8606]: E1204 22:00:44.138401 8606 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:44.138666 master-0 kubenswrapper[8606]: E1204 22:00:44.138528 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls podName:512ba6af-11ad-4217-a1ce-a2ab3ef67ec5 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.13847355 +0000 UTC m=+16.948775795 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-rn6cz" (UID: "512ba6af-11ad-4217-a1ce-a2ab3ef67ec5") : secret "cluster-monitoring-operator-tls" not found Dec 04 22:00:44.138842 master-0 kubenswrapper[8606]: I1204 22:00:44.138728 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:44.139194 master-0 kubenswrapper[8606]: E1204 22:00:44.138951 8606 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:44.139194 master-0 kubenswrapper[8606]: I1204 22:00:44.139135 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:44.139194 master-0 kubenswrapper[8606]: I1204 22:00:44.139174 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:44.140402 master-0 kubenswrapper[8606]: E1204 22:00:44.139307 8606 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 04 22:00:44.140402 master-0 kubenswrapper[8606]: E1204 22:00:44.139453 8606 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 04 22:00:44.140402 master-0 kubenswrapper[8606]: E1204 22:00:44.139495 8606 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: E1204 22:00:44.142651 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls podName:addddaac-a31a-4dbf-b78f-87225b11b463 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.13923348 +0000 UTC m=+16.949535715 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls") pod "ingress-operator-8649c48786-qlkgh" (UID: "addddaac-a31a-4dbf-b78f-87225b11b463") : secret "metrics-tls" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: I1204 22:00:44.142754 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: I1204 22:00:44.142855 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: E1204 22:00:44.142862 8606 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: E1204 22:00:44.143004 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls podName:ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.142854466 +0000 UTC m=+16.953156681 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls") pod "dns-operator-7c56cf9b74-sshsd" (UID: "ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e") : secret "metrics-tls" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: E1204 22:00:44.143083 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics podName:c6a5d14d-0409-4024-b0a8-200fa2594185 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.143070822 +0000 UTC m=+16.953373037 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-m9m4h" (UID: "c6a5d14d-0409-4024-b0a8-200fa2594185") : secret "marketplace-operator-metrics" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: E1204 22:00:44.143104 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.143094213 +0000 UTC m=+16.953396428 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: I1204 22:00:44.143275 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: E1204 22:00:44.143445 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls podName:35821f48-b000-4915-847f-a739b6efc5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.143435081 +0000 UTC m=+16.953737296 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-r7wcq" (UID: "35821f48-b000-4915-847f-a739b6efc5ee") : secret "image-registry-operator-tls" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: E1204 22:00:44.143542 8606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: E1204 22:00:44.144141 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs podName:ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.144126861 +0000 UTC m=+16.954429076 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs") pod "network-metrics-daemon-9pfhj" (UID: "ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa") : secret "metrics-daemon-secret" not found Dec 04 22:00:44.144643 master-0 kubenswrapper[8606]: I1204 22:00:44.144401 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2smgj\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:44.150560 master-0 kubenswrapper[8606]: I1204 22:00:44.146663 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:44.150560 master-0 kubenswrapper[8606]: I1204 22:00:44.147294 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:44.166619 master-0 kubenswrapper[8606]: I1204 22:00:44.166552 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:00:44.171885 master-0 kubenswrapper[8606]: I1204 22:00:44.171846 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:00:44.196950 master-0 kubenswrapper[8606]: W1204 22:00:44.196900 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74b7c644_ad97_4009_aac7_550edabc55ae.slice/crio-9ba3114789d579136ad84d27a499346b11ba547f0a0d6e213113615a12220157 WatchSource:0}: Error finding container 9ba3114789d579136ad84d27a499346b11ba547f0a0d6e213113615a12220157: Status 404 returned error can't find the container with id 9ba3114789d579136ad84d27a499346b11ba547f0a0d6e213113615a12220157 Dec 04 22:00:44.317630 master-0 kubenswrapper[8606]: I1204 22:00:44.317026 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs"] Dec 04 22:00:44.318176 master-0 kubenswrapper[8606]: I1204 22:00:44.318104 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.321559 master-0 kubenswrapper[8606]: I1204 22:00:44.321516 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 22:00:44.321659 master-0 kubenswrapper[8606]: I1204 22:00:44.321575 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 22:00:44.321736 master-0 kubenswrapper[8606]: I1204 22:00:44.321532 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 22:00:44.322021 master-0 kubenswrapper[8606]: I1204 22:00:44.321993 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 22:00:44.336217 master-0 kubenswrapper[8606]: I1204 22:00:44.334511 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 22:00:44.338723 master-0 kubenswrapper[8606]: I1204 22:00:44.338678 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs"] Dec 04 22:00:44.340797 master-0 kubenswrapper[8606]: I1204 22:00:44.340752 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 22:00:44.379678 master-0 kubenswrapper[8606]: I1204 22:00:44.375295 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.379678 master-0 kubenswrapper[8606]: I1204 22:00:44.375406 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-proxy-ca-bundles\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.379678 master-0 kubenswrapper[8606]: I1204 22:00:44.375433 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.379678 master-0 kubenswrapper[8606]: I1204 22:00:44.375477 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r9r6\" (UniqueName: \"kubernetes.io/projected/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-kube-api-access-9r9r6\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.379678 master-0 kubenswrapper[8606]: I1204 22:00:44.375508 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-config\") pod 
\"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.436382 master-0 kubenswrapper[8606]: I1204 22:00:44.436029 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b"] Dec 04 22:00:44.476467 master-0 kubenswrapper[8606]: I1204 22:00:44.476406 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-config\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.476741 master-0 kubenswrapper[8606]: I1204 22:00:44.476488 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.476741 master-0 kubenswrapper[8606]: I1204 22:00:44.476580 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-proxy-ca-bundles\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.476741 master-0 kubenswrapper[8606]: I1204 22:00:44.476600 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.476741 master-0 kubenswrapper[8606]: I1204 22:00:44.476648 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r9r6\" (UniqueName: \"kubernetes.io/projected/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-kube-api-access-9r9r6\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.477022 master-0 kubenswrapper[8606]: E1204 22:00:44.476997 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:44.477087 master-0 kubenswrapper[8606]: E1204 22:00:44.477047 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca podName:83e7e3bd-0f10-4b2c-91e5-191a5b21be4f nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.977034034 +0000 UTC m=+9.787336249 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca") pod "controller-manager-5686ff9f7d-xxnvs" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f") : configmap "client-ca" not found Dec 04 22:00:44.479445 master-0 kubenswrapper[8606]: I1204 22:00:44.477531 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-config\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.479445 master-0 kubenswrapper[8606]: E1204 22:00:44.477612 8606 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:44.479445 master-0 kubenswrapper[8606]: E1204 22:00:44.477641 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert podName:83e7e3bd-0f10-4b2c-91e5-191a5b21be4f nodeName:}" failed. No retries permitted until 2025-12-04 22:00:44.977631391 +0000 UTC m=+9.787933606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert") pod "controller-manager-5686ff9f7d-xxnvs" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f") : secret "serving-cert" not found Dec 04 22:00:44.479445 master-0 kubenswrapper[8606]: I1204 22:00:44.478652 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-proxy-ca-bundles\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.499687 master-0 kubenswrapper[8606]: I1204 22:00:44.499615 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r9r6\" (UniqueName: \"kubernetes.io/projected/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-kube-api-access-9r9r6\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.673827 master-0 kubenswrapper[8606]: I1204 22:00:44.670461 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" event={"ID":"0beb871c-3bf1-471c-a028-746a650267bf","Type":"ContainerStarted","Data":"c83e316239457de6d2cf065ee11c69192c6233457017b9e9bdae1e03d84ad9fc"} Dec 04 22:00:44.673827 master-0 kubenswrapper[8606]: I1204 22:00:44.673682 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" event={"ID":"74b7c644-ad97-4009-aac7-550edabc55ae","Type":"ContainerStarted","Data":"9ba3114789d579136ad84d27a499346b11ba547f0a0d6e213113615a12220157"} Dec 04 22:00:44.737469 master-0 kubenswrapper[8606]: I1204 22:00:44.737288 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:44.737740 master-0 kubenswrapper[8606]: I1204 22:00:44.737488 8606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:44.737740 master-0 kubenswrapper[8606]: I1204 22:00:44.737530 8606 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:44.778918 master-0 kubenswrapper[8606]: I1204 22:00:44.778838 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:44.883413 master-0 kubenswrapper[8606]: I1204 22:00:44.882583 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:44.883413 master-0 kubenswrapper[8606]: I1204 22:00:44.882668 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:44.883413 master-0 kubenswrapper[8606]: E1204 22:00:44.882996 8606 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:44.883413 master-0 kubenswrapper[8606]: E1204 22:00:44.883078 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:48.883058311 +0000 UTC m=+13.693360526 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : configmap "client-ca" not found Dec 04 22:00:44.883413 master-0 kubenswrapper[8606]: E1204 22:00:44.883277 8606 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:44.883413 master-0 kubenswrapper[8606]: E1204 22:00:44.883413 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:48.883375849 +0000 UTC m=+13.693678234 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : secret "serving-cert" not found Dec 04 22:00:44.985194 master-0 kubenswrapper[8606]: I1204 22:00:44.985047 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.985432 master-0 kubenswrapper[8606]: E1204 22:00:44.985255 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:44.985432 master-0 kubenswrapper[8606]: E1204 22:00:44.985351 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca podName:83e7e3bd-0f10-4b2c-91e5-191a5b21be4f nodeName:}" failed. No retries permitted until 2025-12-04 22:00:45.985326239 +0000 UTC m=+10.795628444 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca") pod "controller-manager-5686ff9f7d-xxnvs" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f") : configmap "client-ca" not found Dec 04 22:00:44.985928 master-0 kubenswrapper[8606]: I1204 22:00:44.985660 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:44.985928 master-0 kubenswrapper[8606]: E1204 22:00:44.985840 8606 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:44.985928 master-0 kubenswrapper[8606]: E1204 22:00:44.985901 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert podName:83e7e3bd-0f10-4b2c-91e5-191a5b21be4f nodeName:}" failed. No retries permitted until 2025-12-04 22:00:45.985882663 +0000 UTC m=+10.796184878 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert") pod "controller-manager-5686ff9f7d-xxnvs" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f") : secret "serving-cert" not found Dec 04 22:00:45.034785 master-0 kubenswrapper[8606]: I1204 22:00:45.033492 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:45.034785 master-0 kubenswrapper[8606]: I1204 22:00:45.033821 8606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:45.040990 master-0 kubenswrapper[8606]: I1204 22:00:45.040932 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:00:45.677405 master-0 kubenswrapper[8606]: I1204 22:00:45.676770 8606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:00:45.998876 master-0 kubenswrapper[8606]: I1204 22:00:45.998711 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:45.999068 master-0 kubenswrapper[8606]: E1204 22:00:45.998891 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:45.999068 master-0 kubenswrapper[8606]: I1204 22:00:45.998931 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:45.999068 master-0 kubenswrapper[8606]: E1204 22:00:45.998986 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca podName:83e7e3bd-0f10-4b2c-91e5-191a5b21be4f nodeName:}" failed. No retries permitted until 2025-12-04 22:00:47.998959748 +0000 UTC m=+12.809261963 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca") pod "controller-manager-5686ff9f7d-xxnvs" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f") : configmap "client-ca" not found Dec 04 22:00:46.007344 master-0 kubenswrapper[8606]: I1204 22:00:46.007307 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:46.595125 master-0 kubenswrapper[8606]: I1204 22:00:46.595055 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:46.637478 master-0 kubenswrapper[8606]: I1204 22:00:46.636234 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:00:46.684145 master-0 kubenswrapper[8606]: I1204 22:00:46.683981 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerStarted","Data":"55ff2217087c08bbb5a594e4d764f860ce087d9b5069af52b9f8daf47ea1941f"} Dec 04 22:00:46.835535 master-0 kubenswrapper[8606]: I1204 22:00:46.834645 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:00:47.692403 master-0 kubenswrapper[8606]: I1204 22:00:47.690900 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" event={"ID":"74b7c644-ad97-4009-aac7-550edabc55ae","Type":"ContainerStarted","Data":"df603d1634cf022b1fd00682ac83f2febe676517e6d4c121582f932bc1ae578d"} Dec 04 22:00:48.030828 master-0 kubenswrapper[8606]: I1204 22:00:48.030752 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:48.031106 master-0 kubenswrapper[8606]: E1204 22:00:48.030911 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:48.031106 master-0 kubenswrapper[8606]: E1204 22:00:48.030970 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca podName:83e7e3bd-0f10-4b2c-91e5-191a5b21be4f nodeName:}" failed. No retries permitted until 2025-12-04 22:00:52.030951522 +0000 UTC m=+16.841253727 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca") pod "controller-manager-5686ff9f7d-xxnvs" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f") : configmap "client-ca" not found Dec 04 22:00:48.947372 master-0 kubenswrapper[8606]: I1204 22:00:48.941313 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:48.947372 master-0 kubenswrapper[8606]: I1204 22:00:48.941425 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:48.947372 master-0 kubenswrapper[8606]: E1204 22:00:48.941584 8606 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:48.947372 master-0 kubenswrapper[8606]: E1204 22:00:48.941640 8606 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 04 22:00:48.947372 master-0 kubenswrapper[8606]: E1204 22:00:48.941674 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:56.941644157 +0000 UTC m=+21.751946402 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : configmap "client-ca" not found Dec 04 22:00:48.947372 master-0 kubenswrapper[8606]: E1204 22:00:48.941792 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:56.941757189 +0000 UTC m=+21.752059434 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : secret "serving-cert" not found Dec 04 22:00:49.338533 master-0 kubenswrapper[8606]: I1204 22:00:49.335635 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-5f8855d67b-mzflg"] Dec 04 22:00:49.338533 master-0 kubenswrapper[8606]: I1204 22:00:49.338324 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.343228 master-0 kubenswrapper[8606]: I1204 22:00:49.342677 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 22:00:49.343347 master-0 kubenswrapper[8606]: I1204 22:00:49.343294 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 22:00:49.344726 master-0 kubenswrapper[8606]: I1204 22:00:49.343460 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 22:00:49.344726 master-0 kubenswrapper[8606]: I1204 22:00:49.343717 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 22:00:49.344726 master-0 kubenswrapper[8606]: I1204 22:00:49.343803 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Dec 04 22:00:49.344726 master-0 kubenswrapper[8606]: I1204 22:00:49.343844 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 22:00:49.344726 master-0 kubenswrapper[8606]: I1204 22:00:49.343953 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 22:00:49.344726 master-0 kubenswrapper[8606]: I1204 22:00:49.344186 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Dec 04 22:00:49.344726 master-0 kubenswrapper[8606]: I1204 22:00:49.344410 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 22:00:49.355224 master-0 kubenswrapper[8606]: I1204 22:00:49.355181 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 22:00:49.364094 master-0 kubenswrapper[8606]: I1204 22:00:49.364034 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-5f8855d67b-mzflg"] Dec 04 22:00:49.448448 master-0 kubenswrapper[8606]: I1204 22:00:49.448357 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-node-pullsecrets\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.448712 master-0 kubenswrapper[8606]: I1204 22:00:49.448479 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.448712 master-0 kubenswrapper[8606]: I1204 22:00:49.448521 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-trusted-ca-bundle\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.448712 master-0 kubenswrapper[8606]: I1204 22:00:49.448603 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-client\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.448712 master-0 kubenswrapper[8606]: I1204 22:00:49.448626 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-encryption-config\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.448860 master-0 kubenswrapper[8606]: I1204 22:00:49.448792 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvwcn\" (UniqueName: \"kubernetes.io/projected/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-kube-api-access-jvwcn\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.448900 master-0 kubenswrapper[8606]: I1204 22:00:49.448880 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.448979 master-0 kubenswrapper[8606]: I1204 22:00:49.448944 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-serving-ca\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.448979 master-0 kubenswrapper[8606]: I1204 22:00:49.448976 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit-dir\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.449062 master-0 kubenswrapper[8606]: I1204 22:00:49.449048 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-config\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.449112 master-0 kubenswrapper[8606]: I1204 22:00:49.449070 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-image-import-ca\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.550786 master-0 kubenswrapper[8606]: I1204 22:00:49.550719 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-node-pullsecrets\") pod \"apiserver-5f8855d67b-mzflg\" (UID: 
\"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.550786 master-0 kubenswrapper[8606]: I1204 22:00:49.550781 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551088 master-0 kubenswrapper[8606]: I1204 22:00:49.550811 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-trusted-ca-bundle\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551088 master-0 kubenswrapper[8606]: I1204 22:00:49.550837 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-client\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551088 master-0 kubenswrapper[8606]: I1204 22:00:49.550858 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-encryption-config\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551088 master-0 kubenswrapper[8606]: I1204 22:00:49.550887 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvwcn\" (UniqueName: \"kubernetes.io/projected/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-kube-api-access-jvwcn\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551588 master-0 kubenswrapper[8606]: E1204 22:00:49.551384 8606 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 04 22:00:49.551588 master-0 kubenswrapper[8606]: E1204 22:00:49.551477 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit podName:8b994b55-a93f-4a80-b72a-d6c3ee139ef8 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:50.051453128 +0000 UTC m=+14.861755343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit") pod "apiserver-5f8855d67b-mzflg" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8") : configmap "audit-0" not found Dec 04 22:00:49.551588 master-0 kubenswrapper[8606]: I1204 22:00:49.551553 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551709 master-0 kubenswrapper[8606]: I1204 22:00:49.551601 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-serving-ca\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551709 master-0 kubenswrapper[8606]: I1204 22:00:49.551639 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit-dir\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551778 master-0 kubenswrapper[8606]: I1204 22:00:49.551747 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-config\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.551828 master-0 kubenswrapper[8606]: I1204 22:00:49.551790 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-image-import-ca\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.552494 master-0 kubenswrapper[8606]: E1204 22:00:49.552427 8606 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 04 22:00:49.552666 master-0 kubenswrapper[8606]: E1204 22:00:49.552630 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert podName:8b994b55-a93f-4a80-b72a-d6c3ee139ef8 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:50.052571257 +0000 UTC m=+14.862873522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert") pod "apiserver-5f8855d67b-mzflg" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8") : secret "serving-cert" not found Dec 04 22:00:49.552804 master-0 kubenswrapper[8606]: I1204 22:00:49.552741 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit-dir\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.553053 master-0 kubenswrapper[8606]: I1204 22:00:49.553033 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-trusted-ca-bundle\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.559161 master-0 kubenswrapper[8606]: I1204 22:00:49.553038 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-serving-ca\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.559161 master-0 kubenswrapper[8606]: I1204 22:00:49.553165 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-node-pullsecrets\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.559161 master-0 kubenswrapper[8606]: I1204 22:00:49.554204 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-config\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.559161 master-0 kubenswrapper[8606]: I1204 22:00:49.556308 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-image-import-ca\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.565315 master-0 kubenswrapper[8606]: I1204 22:00:49.565275 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-encryption-config\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.565569 master-0 kubenswrapper[8606]: I1204 22:00:49.565480 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-client\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:49.572877 master-0 kubenswrapper[8606]: I1204 22:00:49.572843 8606 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jvwcn\" (UniqueName: \"kubernetes.io/projected/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-kube-api-access-jvwcn\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:50.060157 master-0 kubenswrapper[8606]: I1204 22:00:50.060067 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:50.061196 master-0 kubenswrapper[8606]: I1204 22:00:50.060218 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:50.061196 master-0 kubenswrapper[8606]: E1204 22:00:50.060333 8606 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 04 22:00:50.061196 master-0 kubenswrapper[8606]: E1204 22:00:50.060402 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit podName:8b994b55-a93f-4a80-b72a-d6c3ee139ef8 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:51.060383018 +0000 UTC m=+15.870685233 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit") pod "apiserver-5f8855d67b-mzflg" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8") : configmap "audit-0" not found Dec 04 22:00:50.061196 master-0 kubenswrapper[8606]: E1204 22:00:50.060493 8606 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 04 22:00:50.061196 master-0 kubenswrapper[8606]: E1204 22:00:50.060547 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert podName:8b994b55-a93f-4a80-b72a-d6c3ee139ef8 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:51.060537953 +0000 UTC m=+15.870840168 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert") pod "apiserver-5f8855d67b-mzflg" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8") : secret "serving-cert" not found Dec 04 22:00:51.084946 master-0 kubenswrapper[8606]: I1204 22:00:51.084865 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:51.084946 master-0 kubenswrapper[8606]: I1204 22:00:51.084958 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:51.085611 master-0 kubenswrapper[8606]: E1204 22:00:51.085261 8606 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 04 22:00:51.085611 master-0 kubenswrapper[8606]: E1204 22:00:51.085409 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit podName:8b994b55-a93f-4a80-b72a-d6c3ee139ef8 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:53.085371949 +0000 UTC m=+17.895674204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit") pod "apiserver-5f8855d67b-mzflg" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8") : configmap "audit-0" not found Dec 04 22:00:51.085611 master-0 kubenswrapper[8606]: E1204 22:00:51.085402 8606 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 04 22:00:51.085611 master-0 kubenswrapper[8606]: E1204 22:00:51.085479 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert podName:8b994b55-a93f-4a80-b72a-d6c3ee139ef8 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:53.085462471 +0000 UTC m=+17.895764726 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert") pod "apiserver-5f8855d67b-mzflg" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8") : secret "serving-cert" not found Dec 04 22:00:51.674796 master-0 kubenswrapper[8606]: I1204 22:00:51.674062 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-jn88h"] Dec 04 22:00:51.675378 master-0 kubenswrapper[8606]: I1204 22:00:51.675325 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.712434 master-0 kubenswrapper[8606]: I1204 22:00:51.712364 8606 generic.go:334] "Generic (PLEG): container finished" podID="465637a4-42be-4a65-a859-7af699960138" containerID="55ff2217087c08bbb5a594e4d764f860ce087d9b5069af52b9f8daf47ea1941f" exitCode=0 Dec 04 22:00:51.712688 master-0 kubenswrapper[8606]: I1204 22:00:51.712463 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerDied","Data":"55ff2217087c08bbb5a594e4d764f860ce087d9b5069af52b9f8daf47ea1941f"} Dec 04 22:00:51.713118 master-0 kubenswrapper[8606]: I1204 22:00:51.713076 8606 scope.go:117] "RemoveContainer" containerID="55ff2217087c08bbb5a594e4d764f860ce087d9b5069af52b9f8daf47ea1941f" Dec 04 22:00:51.714471 master-0 kubenswrapper[8606]: I1204 22:00:51.714425 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" event={"ID":"0beb871c-3bf1-471c-a028-746a650267bf","Type":"ContainerStarted","Data":"3319d82050d7163f2a7f96d02081f2908fac76018c64e251ebfd0f4e73ccabfd"} Dec 04 22:00:51.793369 master-0 kubenswrapper[8606]: I1204 22:00:51.793301 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-lib-modules\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793588 master-0 kubenswrapper[8606]: I1204 22:00:51.793380 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793588 master-0 kubenswrapper[8606]: I1204 22:00:51.793453 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-run\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793588 master-0 kubenswrapper[8606]: I1204 22:00:51.793495 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-kubernetes\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793588 master-0 kubenswrapper[8606]: I1204 22:00:51.793541 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-sys\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793588 master-0 kubenswrapper[8606]: I1204 22:00:51.793580 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-modprobe-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793800 master-0 kubenswrapper[8606]: I1204 22:00:51.793606 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-systemd\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793800 master-0 kubenswrapper[8606]: I1204 22:00:51.793643 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6b4p\" (UniqueName: \"kubernetes.io/projected/fbb8e73f-7e50-451b-b400-e88a86b51e09-kube-api-access-c6b4p\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793800 master-0 kubenswrapper[8606]: I1204 22:00:51.793712 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-var-lib-kubelet\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793800 master-0 kubenswrapper[8606]: I1204 22:00:51.793736 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-tmp\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793800 master-0 kubenswrapper[8606]: I1204 22:00:51.793758 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-conf\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793800 master-0 kubenswrapper[8606]: I1204 22:00:51.793782 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysconfig\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793800 master-0 kubenswrapper[8606]: I1204 22:00:51.793804 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-tuned\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.793990 master-0 kubenswrapper[8606]: I1204 22:00:51.793832 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-host\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.895382 master-0 kubenswrapper[8606]: I1204 
22:00:51.895292 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-host\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.896435 master-0 kubenswrapper[8606]: I1204 22:00:51.896310 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-host\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.896435 master-0 kubenswrapper[8606]: I1204 22:00:51.896387 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-lib-modules\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.896596 master-0 kubenswrapper[8606]: I1204 22:00:51.896568 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.896911 master-0 kubenswrapper[8606]: I1204 22:00:51.896798 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-lib-modules\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.896911 master-0 kubenswrapper[8606]: I1204 22:00:51.896806 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.896994 master-0 kubenswrapper[8606]: I1204 22:00:51.896972 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-run\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897086 master-0 kubenswrapper[8606]: I1204 22:00:51.897061 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-kubernetes\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897130 master-0 kubenswrapper[8606]: I1204 22:00:51.897100 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-sys\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897208 master-0 kubenswrapper[8606]: I1204 22:00:51.897184 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-modprobe-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897254 master-0 kubenswrapper[8606]: I1204 22:00:51.897200 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-run\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897254 master-0 kubenswrapper[8606]: I1204 22:00:51.897218 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-systemd\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897382 master-0 kubenswrapper[8606]: I1204 22:00:51.897349 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-kubernetes\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897551 master-0 kubenswrapper[8606]: I1204 22:00:51.897476 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6b4p\" (UniqueName: \"kubernetes.io/projected/fbb8e73f-7e50-451b-b400-e88a86b51e09-kube-api-access-c6b4p\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897684 master-0 kubenswrapper[8606]: I1204 22:00:51.897647 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-systemd\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897775 master-0 kubenswrapper[8606]: I1204 22:00:51.897690 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-var-lib-kubelet\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897872 master-0 kubenswrapper[8606]: I1204 22:00:51.897826 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-var-lib-kubelet\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897913 master-0 kubenswrapper[8606]: I1204 22:00:51.897881 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-sys\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.897950 master-0 kubenswrapper[8606]: I1204 22:00:51.897832 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-modprobe-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.898012 master-0 kubenswrapper[8606]: I1204 22:00:51.897995 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-tmp\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.898132 master-0 kubenswrapper[8606]: I1204 22:00:51.898119 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-conf\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.898210 master-0 kubenswrapper[8606]: I1204 22:00:51.898198 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysconfig\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.898284 master-0 kubenswrapper[8606]: I1204 22:00:51.898273 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-tuned\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.898389 master-0 kubenswrapper[8606]: I1204 22:00:51.898343 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysconfig\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.899465 master-0 kubenswrapper[8606]: I1204 22:00:51.898686 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-conf\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.902894 master-0 kubenswrapper[8606]: I1204 22:00:51.902820 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-tmp\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.904979 master-0 kubenswrapper[8606]: I1204 22:00:51.904540 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-tuned\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.918799 master-0 kubenswrapper[8606]: I1204 22:00:51.918666 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6b4p\" (UniqueName: \"kubernetes.io/projected/fbb8e73f-7e50-451b-b400-e88a86b51e09-kube-api-access-c6b4p\") 
pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:51.994551 master-0 kubenswrapper[8606]: I1204 22:00:51.994389 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:00:52.010441 master-0 kubenswrapper[8606]: W1204 22:00:52.010390 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb8e73f_7e50_451b_b400_e88a86b51e09.slice/crio-93503d425e63d7ea9a2401b9ba117e4c428dd61716b8edbd280c970e0f14741d WatchSource:0}: Error finding container 93503d425e63d7ea9a2401b9ba117e4c428dd61716b8edbd280c970e0f14741d: Status 404 returned error can't find the container with id 93503d425e63d7ea9a2401b9ba117e4c428dd61716b8edbd280c970e0f14741d Dec 04 22:00:52.101131 master-0 kubenswrapper[8606]: I1204 22:00:52.101060 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:00:52.119990 master-0 kubenswrapper[8606]: E1204 22:00:52.101369 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:52.119990 master-0 kubenswrapper[8606]: E1204 22:00:52.101479 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca podName:83e7e3bd-0f10-4b2c-91e5-191a5b21be4f nodeName:}" failed. No retries permitted until 2025-12-04 22:01:00.101450904 +0000 UTC m=+24.911753329 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca") pod "controller-manager-5686ff9f7d-xxnvs" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f") : configmap "client-ca" not found Dec 04 22:00:52.202692 master-0 kubenswrapper[8606]: I1204 22:00:52.202619 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:52.202692 master-0 kubenswrapper[8606]: I1204 22:00:52.202690 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:00:52.203236 master-0 kubenswrapper[8606]: I1204 22:00:52.203154 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:52.203343 master-0 kubenswrapper[8606]: I1204 22:00:52.203264 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:52.203474 master-0 kubenswrapper[8606]: I1204 22:00:52.203412 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:52.203748 master-0 kubenswrapper[8606]: E1204 22:00:52.203667 8606 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 04 22:00:52.203855 master-0 kubenswrapper[8606]: E1204 22:00:52.203827 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert podName:813f3ee7-35b5-4ee8-b453-00d16d910eae nodeName:}" failed. No retries permitted until 2025-12-04 22:01:08.203783883 +0000 UTC m=+33.014086138 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-bslb5" (UID: "813f3ee7-35b5-4ee8-b453-00d16d910eae") : secret "package-server-manager-serving-cert" not found Dec 04 22:00:52.204093 master-0 kubenswrapper[8606]: I1204 22:00:52.204045 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:52.204182 master-0 kubenswrapper[8606]: I1204 22:00:52.204150 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:52.204246 master-0 kubenswrapper[8606]: I1204 22:00:52.204221 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:52.208737 master-0 kubenswrapper[8606]: I1204 22:00:52.208300 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:52.208737 master-0 kubenswrapper[8606]: I1204 22:00:52.208720 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:52.208737 master-0 kubenswrapper[8606]: I1204 22:00:52.208413 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-nk4gb\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:52.209011 master-0 kubenswrapper[8606]: I1204 22:00:52.208801 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:52.209011 master-0 kubenswrapper[8606]: I1204 22:00:52.208951 8606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:52.209215 master-0 kubenswrapper[8606]: I1204 22:00:52.209170 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:52.209490 master-0 kubenswrapper[8606]: I1204 22:00:52.209450 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:52.267385 master-0 kubenswrapper[8606]: I1204 22:00:52.267309 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:00:52.267710 master-0 kubenswrapper[8606]: I1204 22:00:52.267371 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:00:52.267930 master-0 kubenswrapper[8606]: I1204 22:00:52.267887 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:00:52.268199 master-0 kubenswrapper[8606]: I1204 22:00:52.268162 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:00:52.268199 master-0 kubenswrapper[8606]: I1204 22:00:52.268187 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:00:52.268449 master-0 kubenswrapper[8606]: I1204 22:00:52.268414 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:00:52.268837 master-0 kubenswrapper[8606]: I1204 22:00:52.268786 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:00:52.645033 master-0 kubenswrapper[8606]: I1204 22:00:52.644543 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9pfhj"] Dec 04 22:00:52.662809 master-0 kubenswrapper[8606]: I1204 22:00:52.661682 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq"] Dec 04 22:00:52.679100 master-0 kubenswrapper[8606]: W1204 22:00:52.678362 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35821f48_b000_4915_847f_a739b6efc5ee.slice/crio-b5aeee4aff8bc78bf6e36ba938b9609781a2d34417a21d03fdc2c6a101065131 WatchSource:0}: Error finding container b5aeee4aff8bc78bf6e36ba938b9609781a2d34417a21d03fdc2c6a101065131: Status 404 returned error can't find the container with id b5aeee4aff8bc78bf6e36ba938b9609781a2d34417a21d03fdc2c6a101065131 Dec 04 22:00:52.726880 master-0 kubenswrapper[8606]: I1204 22:00:52.726792 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-f797b99b6-m9m4h"] Dec 04 22:00:52.740873 master-0 kubenswrapper[8606]: I1204 22:00:52.740776 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9pfhj" event={"ID":"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa","Type":"ContainerStarted","Data":"fc5560b4f5b75417c091bf8b734a41efa2795ce2d8cceb8a89a66960f1ba3320"} Dec 04 22:00:52.744703 master-0 kubenswrapper[8606]: I1204 22:00:52.744606 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-8649c48786-qlkgh"] Dec 04 22:00:52.746747 master-0 kubenswrapper[8606]: W1204 22:00:52.746702 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6a5d14d_0409_4024_b0a8_200fa2594185.slice/crio-1f66bf17e6a9a3d1673b91cfae48275a45a440300946a8bf5061bac66c63db97 WatchSource:0}: Error finding container 1f66bf17e6a9a3d1673b91cfae48275a45a440300946a8bf5061bac66c63db97: Status 404 returned error can't find the container with id 1f66bf17e6a9a3d1673b91cfae48275a45a440300946a8bf5061bac66c63db97 Dec 04 22:00:52.758059 master-0 kubenswrapper[8606]: I1204 22:00:52.749983 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-7c56cf9b74-sshsd"] Dec 04 22:00:52.758059 master-0 kubenswrapper[8606]: I1204 22:00:52.755654 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerStarted","Data":"206992e5a976be25c0ca246941e52ef047963087cb7fb3d7fae48784f22c1968"} Dec 04 22:00:52.760217 master-0 kubenswrapper[8606]: I1204 22:00:52.759803 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz"] Dec 04 22:00:52.760217 master-0 kubenswrapper[8606]: I1204 22:00:52.760056 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" event={"ID":"35821f48-b000-4915-847f-a739b6efc5ee","Type":"ContainerStarted","Data":"b5aeee4aff8bc78bf6e36ba938b9609781a2d34417a21d03fdc2c6a101065131"} Dec 04 22:00:52.775925 master-0 kubenswrapper[8606]: I1204 22:00:52.774888 8606 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jn88h" event={"ID":"fbb8e73f-7e50-451b-b400-e88a86b51e09","Type":"ContainerStarted","Data":"bfa4071d7b4f3516f069aeaba27743542e3344c32b39bb74e634ff273d539b31"} Dec 04 22:00:52.775925 master-0 kubenswrapper[8606]: I1204 22:00:52.774956 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jn88h" event={"ID":"fbb8e73f-7e50-451b-b400-e88a86b51e09","Type":"ContainerStarted","Data":"93503d425e63d7ea9a2401b9ba117e4c428dd61716b8edbd280c970e0f14741d"} Dec 04 22:00:52.789269 master-0 kubenswrapper[8606]: I1204 22:00:52.789157 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb"] Dec 04 22:00:52.796390 master-0 kubenswrapper[8606]: W1204 22:00:52.796311 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice/crio-a96deeb0726472a631562696723fe7dacd1bdfdf5107c1d25582d7805e92f14c WatchSource:0}: Error finding container a96deeb0726472a631562696723fe7dacd1bdfdf5107c1d25582d7805e92f14c: Status 404 returned error can't find the container with id a96deeb0726472a631562696723fe7dacd1bdfdf5107c1d25582d7805e92f14c Dec 04 22:00:52.802215 master-0 kubenswrapper[8606]: I1204 22:00:52.801131 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jn88h" podStartSLOduration=1.801109892 podStartE2EDuration="1.801109892s" podCreationTimestamp="2025-12-04 22:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:00:52.79993661 +0000 UTC m=+17.610238855" watchObservedRunningTime="2025-12-04 22:00:52.801109892 +0000 UTC m=+17.611412107" Dec 04 22:00:52.986257 master-0 kubenswrapper[8606]: I1204 22:00:52.986202 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5f8855d67b-mzflg"] Dec 04 22:00:52.986915 master-0 kubenswrapper[8606]: E1204 22:00:52.986883 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" podUID="8b994b55-a93f-4a80-b72a-d6c3ee139ef8" Dec 04 22:00:53.119669 master-0 kubenswrapper[8606]: I1204 22:00:53.119483 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:53.119669 master-0 kubenswrapper[8606]: I1204 22:00:53.119597 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert\") pod \"apiserver-5f8855d67b-mzflg\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:53.121078 master-0 kubenswrapper[8606]: E1204 22:00:53.119737 8606 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 04 22:00:53.121078 master-0 kubenswrapper[8606]: E1204 22:00:53.119767 8606 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: 
secret "serving-cert" not found Dec 04 22:00:53.121078 master-0 kubenswrapper[8606]: E1204 22:00:53.119837 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert podName:8b994b55-a93f-4a80-b72a-d6c3ee139ef8 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:57.119819299 +0000 UTC m=+21.930121514 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert") pod "apiserver-5f8855d67b-mzflg" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8") : secret "serving-cert" not found Dec 04 22:00:53.121078 master-0 kubenswrapper[8606]: E1204 22:00:53.119857 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit podName:8b994b55-a93f-4a80-b72a-d6c3ee139ef8 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:57.11984917 +0000 UTC m=+21.930151385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit") pod "apiserver-5f8855d67b-mzflg" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8") : configmap "audit-0" not found Dec 04 22:00:53.783165 master-0 kubenswrapper[8606]: I1204 22:00:53.783109 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"479597d06e399852cde3f4983981240e3b9a935772d2dd22d716d20e734ab158"} Dec 04 22:00:53.784611 master-0 kubenswrapper[8606]: I1204 22:00:53.784575 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" event={"ID":"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e","Type":"ContainerStarted","Data":"650ca7f20c2d1cb1f57ba5643ad53b21f17eea7d93316d18d3c9ccbd27770c35"} Dec 04 22:00:53.785695 master-0 kubenswrapper[8606]: I1204 22:00:53.785639 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerStarted","Data":"1f66bf17e6a9a3d1673b91cfae48275a45a440300946a8bf5061bac66c63db97"} Dec 04 22:00:53.786726 master-0 kubenswrapper[8606]: I1204 22:00:53.786642 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" event={"ID":"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5","Type":"ContainerStarted","Data":"b9ac4ee53782e9fd4b340ed2b43fd3025db3cb82bd0881252f116248836951ce"} Dec 04 22:00:53.788356 master-0 kubenswrapper[8606]: I1204 22:00:53.788303 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" event={"ID":"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf","Type":"ContainerStarted","Data":"a96deeb0726472a631562696723fe7dacd1bdfdf5107c1d25582d7805e92f14c"} Dec 04 22:00:53.788356 master-0 kubenswrapper[8606]: I1204 22:00:53.788353 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:53.801577 master-0 kubenswrapper[8606]: I1204 22:00:53.801547 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:53.931167 master-0 kubenswrapper[8606]: I1204 22:00:53.931108 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-image-import-ca\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.932434 master-0 kubenswrapper[8606]: I1204 22:00:53.931164 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-config\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.932434 master-0 kubenswrapper[8606]: I1204 22:00:53.932398 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvwcn\" (UniqueName: \"kubernetes.io/projected/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-kube-api-access-jvwcn\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.932569 master-0 kubenswrapper[8606]: I1204 22:00:53.932443 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-node-pullsecrets\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.932569 master-0 kubenswrapper[8606]: I1204 22:00:53.932479 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit-dir\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.932569 master-0 kubenswrapper[8606]: I1204 22:00:53.932520 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-serving-ca\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.932569 master-0 kubenswrapper[8606]: I1204 22:00:53.932555 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-trusted-ca-bundle\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.932681 master-0 kubenswrapper[8606]: I1204 22:00:53.932578 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-encryption-config\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.932681 master-0 kubenswrapper[8606]: I1204 22:00:53.932605 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-client\") pod \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\" (UID: \"8b994b55-a93f-4a80-b72a-d6c3ee139ef8\") " Dec 04 22:00:53.933198 master-0 kubenswrapper[8606]: I1204 22:00:53.933039 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-config" (OuterVolumeSpecName: "config") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:53.933879 master-0 kubenswrapper[8606]: I1204 22:00:53.933674 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:53.933879 master-0 kubenswrapper[8606]: I1204 22:00:53.933734 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:53.933879 master-0 kubenswrapper[8606]: I1204 22:00:53.933824 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:00:53.936635 master-0 kubenswrapper[8606]: I1204 22:00:53.934208 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:53.937098 master-0 kubenswrapper[8606]: I1204 22:00:53.937050 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:00:53.943713 master-0 kubenswrapper[8606]: I1204 22:00:53.943649 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:00:53.948452 master-0 kubenswrapper[8606]: I1204 22:00:53.947823 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:00:53.958418 master-0 kubenswrapper[8606]: I1204 22:00:53.958346 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-kube-api-access-jvwcn" (OuterVolumeSpecName: "kube-api-access-jvwcn") pod "8b994b55-a93f-4a80-b72a-d6c3ee139ef8" (UID: "8b994b55-a93f-4a80-b72a-d6c3ee139ef8"). InnerVolumeSpecName "kube-api-access-jvwcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049268 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvwcn\" (UniqueName: \"kubernetes.io/projected/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-kube-api-access-jvwcn\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049306 8606 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049319 8606 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049327 8606 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049336 8606 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049345 8606 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-encryption-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049353 8606 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-etcd-client\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049361 8606 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-image-import-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.049459 master-0 kubenswrapper[8606]: I1204 22:00:54.049369 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:54.792602 master-0 kubenswrapper[8606]: I1204 22:00:54.792480 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5f8855d67b-mzflg" Dec 04 22:00:54.854626 master-0 kubenswrapper[8606]: I1204 22:00:54.852562 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-8db7f8d79-rlqbz"] Dec 04 22:00:54.854626 master-0 kubenswrapper[8606]: I1204 22:00:54.854281 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.860945 master-0 kubenswrapper[8606]: I1204 22:00:54.858556 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 22:00:54.860945 master-0 kubenswrapper[8606]: I1204 22:00:54.859720 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5f8855d67b-mzflg"] Dec 04 22:00:54.860945 master-0 kubenswrapper[8606]: I1204 22:00:54.859952 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 22:00:54.862630 master-0 kubenswrapper[8606]: I1204 22:00:54.862596 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 22:00:54.862630 master-0 kubenswrapper[8606]: I1204 22:00:54.862613 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 22:00:54.863439 master-0 kubenswrapper[8606]: I1204 22:00:54.862633 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 22:00:54.863439 master-0 kubenswrapper[8606]: I1204 22:00:54.863228 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-8db7f8d79-rlqbz"] Dec 04 22:00:54.865449 master-0 kubenswrapper[8606]: I1204 22:00:54.865380 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 22:00:54.865601 master-0 kubenswrapper[8606]: I1204 22:00:54.865470 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 22:00:54.875262 master-0 kubenswrapper[8606]: I1204 22:00:54.874937 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 22:00:54.875262 master-0 kubenswrapper[8606]: I1204 22:00:54.875238 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875472 master-0 kubenswrapper[8606]: I1204 22:00:54.875272 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-trusted-ca-bundle\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875472 master-0 kubenswrapper[8606]: I1204 22:00:54.875304 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-node-pullsecrets\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " 
pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875472 master-0 kubenswrapper[8606]: I1204 22:00:54.875369 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-encryption-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875472 master-0 kubenswrapper[8606]: I1204 22:00:54.875398 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875472 master-0 kubenswrapper[8606]: I1204 22:00:54.875424 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfcv9\" (UniqueName: \"kubernetes.io/projected/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-kube-api-access-bfcv9\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875472 master-0 kubenswrapper[8606]: I1204 22:00:54.875464 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit-dir\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875887 master-0 kubenswrapper[8606]: I1204 22:00:54.875529 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-serving-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875887 master-0 kubenswrapper[8606]: I1204 22:00:54.875555 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-client\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875887 master-0 kubenswrapper[8606]: I1204 22:00:54.875581 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.875887 master-0 kubenswrapper[8606]: I1204 22:00:54.875631 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-image-import-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.878719 master-0 kubenswrapper[8606]: I1204 22:00:54.878662 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-apiserver/apiserver-5f8855d67b-mzflg"] Dec 04 22:00:54.879111 master-0 kubenswrapper[8606]: I1204 22:00:54.879039 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 22:00:54.880979 master-0 kubenswrapper[8606]: I1204 22:00:54.880914 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 22:00:54.981590 master-0 kubenswrapper[8606]: I1204 22:00:54.981479 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-serving-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.982221 master-0 kubenswrapper[8606]: I1204 22:00:54.982165 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-client\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.982337 master-0 kubenswrapper[8606]: I1204 22:00:54.982257 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.982717 master-0 kubenswrapper[8606]: I1204 22:00:54.982657 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-image-import-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.982832 master-0 kubenswrapper[8606]: I1204 22:00:54.982752 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.982832 master-0 kubenswrapper[8606]: I1204 22:00:54.982805 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-trusted-ca-bundle\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.982832 master-0 kubenswrapper[8606]: I1204 22:00:54.982664 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-serving-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983024 master-0 kubenswrapper[8606]: I1204 22:00:54.982840 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-node-pullsecrets\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: 
\"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983024 master-0 kubenswrapper[8606]: I1204 22:00:54.982924 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-node-pullsecrets\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983024 master-0 kubenswrapper[8606]: I1204 22:00:54.982932 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-encryption-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983024 master-0 kubenswrapper[8606]: I1204 22:00:54.983022 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983332 master-0 kubenswrapper[8606]: E1204 22:00:54.983051 8606 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 04 22:00:54.983332 master-0 kubenswrapper[8606]: I1204 22:00:54.983077 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfcv9\" (UniqueName: \"kubernetes.io/projected/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-kube-api-access-bfcv9\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983332 master-0 kubenswrapper[8606]: I1204 22:00:54.983141 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit-dir\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983332 master-0 kubenswrapper[8606]: E1204 22:00:54.983274 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert podName:5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:55.483249505 +0000 UTC m=+20.293551720 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert") pod "apiserver-8db7f8d79-rlqbz" (UID: "5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141") : secret "serving-cert" not found Dec 04 22:00:54.983620 master-0 kubenswrapper[8606]: I1204 22:00:54.983393 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit-dir\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983620 master-0 kubenswrapper[8606]: I1204 22:00:54.983556 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-image-import-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983868 master-0 kubenswrapper[8606]: I1204 22:00:54.983813 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.983957 master-0 kubenswrapper[8606]: I1204 22:00:54.983893 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-trusted-ca-bundle\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.984080 master-0 kubenswrapper[8606]: I1204 22:00:54.984048 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.986197 master-0 kubenswrapper[8606]: I1204 22:00:54.986173 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-encryption-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:54.987754 master-0 kubenswrapper[8606]: I1204 22:00:54.987721 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-client\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:55.004636 master-0 kubenswrapper[8606]: I1204 22:00:55.004592 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfcv9\" (UniqueName: \"kubernetes.io/projected/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-kube-api-access-bfcv9\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:55.084418 master-0 kubenswrapper[8606]: I1204 22:00:55.084251 8606 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:55.084418 master-0 kubenswrapper[8606]: I1204 22:00:55.084312 8606 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8b994b55-a93f-4a80-b72a-d6c3ee139ef8-audit\") on node \"master-0\" DevicePath \"\"" Dec 04 22:00:55.406429 master-0 kubenswrapper[8606]: I1204 22:00:55.406366 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b994b55-a93f-4a80-b72a-d6c3ee139ef8" path="/var/lib/kubelet/pods/8b994b55-a93f-4a80-b72a-d6c3ee139ef8/volumes" Dec 04 22:00:55.489233 master-0 kubenswrapper[8606]: I1204 22:00:55.489165 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:55.489701 master-0 kubenswrapper[8606]: E1204 22:00:55.489392 8606 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 04 22:00:55.489833 master-0 kubenswrapper[8606]: E1204 22:00:55.489772 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert podName:5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141 nodeName:}" failed. No retries permitted until 2025-12-04 22:00:56.489744552 +0000 UTC m=+21.300046777 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert") pod "apiserver-8db7f8d79-rlqbz" (UID: "5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141") : secret "serving-cert" not found Dec 04 22:00:56.504743 master-0 kubenswrapper[8606]: I1204 22:00:56.504665 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:56.513644 master-0 kubenswrapper[8606]: I1204 22:00:56.509797 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:56.695315 master-0 kubenswrapper[8606]: I1204 22:00:56.695249 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:00:57.027545 master-0 kubenswrapper[8606]: I1204 22:00:57.027472 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:57.027545 master-0 kubenswrapper[8606]: I1204 22:00:57.027536 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:57.027811 master-0 kubenswrapper[8606]: E1204 22:00:57.027723 8606 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:00:57.027811 master-0 kubenswrapper[8606]: E1204 22:00:57.027791 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca podName:de7f9a47-eab7-49ef-8479-2ee6953a4de9 nodeName:}" failed. No retries permitted until 2025-12-04 22:01:13.027768792 +0000 UTC m=+37.838071007 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca") pod "route-controller-manager-bf9b6cb7-nzhsl" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9") : configmap "client-ca" not found Dec 04 22:00:57.032820 master-0 kubenswrapper[8606]: I1204 22:00:57.032783 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"route-controller-manager-bf9b6cb7-nzhsl\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:00:57.974452 master-0 kubenswrapper[8606]: I1204 22:00:57.974358 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 04 22:00:57.975590 master-0 kubenswrapper[8606]: I1204 22:00:57.975269 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:00:57.979491 master-0 kubenswrapper[8606]: I1204 22:00:57.979396 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 04 22:00:58.042247 master-0 kubenswrapper[8606]: I1204 22:00:58.042189 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 04 22:00:58.142903 master-0 kubenswrapper[8606]: I1204 22:00:58.142847 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-var-lock\") pod \"installer-1-master-0\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:00:58.143192 master-0 kubenswrapper[8606]: I1204 22:00:58.143174 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kube-api-access\") pod \"installer-1-master-0\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:00:58.143729 master-0 kubenswrapper[8606]: I1204 22:00:58.143653 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:00:58.244950 master-0 kubenswrapper[8606]: I1204 22:00:58.244777 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:00:58.244950 master-0 kubenswrapper[8606]: I1204 22:00:58.244915 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-var-lock\") pod \"installer-1-master-0\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:00:58.245197 master-0 kubenswrapper[8606]: I1204 22:00:58.244961 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kube-api-access\") pod \"installer-1-master-0\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:00:58.245197 master-0 kubenswrapper[8606]: I1204 22:00:58.244956 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:00:58.245197 master-0 kubenswrapper[8606]: I1204 22:00:58.245068 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-var-lock\") pod \"installer-1-master-0\" (UID: 
\"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:01:00.055717 master-0 kubenswrapper[8606]: I1204 22:01:00.055660 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 04 22:01:00.057117 master-0 kubenswrapper[8606]: I1204 22:01:00.057095 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.061832 master-0 kubenswrapper[8606]: I1204 22:01:00.061801 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Dec 04 22:01:00.176209 master-0 kubenswrapper[8606]: I1204 22:01:00.176131 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.176604 master-0 kubenswrapper[8606]: I1204 22:01:00.176538 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca\") pod \"controller-manager-5686ff9f7d-xxnvs\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:01:00.176709 master-0 kubenswrapper[8606]: E1204 22:01:00.176613 8606 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 04 22:01:00.176709 master-0 kubenswrapper[8606]: E1204 22:01:00.176700 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca podName:83e7e3bd-0f10-4b2c-91e5-191a5b21be4f nodeName:}" failed. No retries permitted until 2025-12-04 22:01:16.17667769 +0000 UTC m=+40.986979915 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca") pod "controller-manager-5686ff9f7d-xxnvs" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f") : configmap "client-ca" not found Dec 04 22:01:00.176904 master-0 kubenswrapper[8606]: I1204 22:01:00.176786 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.176904 master-0 kubenswrapper[8606]: I1204 22:01:00.176883 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-var-lock\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.277964 master-0 kubenswrapper[8606]: I1204 22:01:00.277911 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.278345 master-0 kubenswrapper[8606]: I1204 22:01:00.278313 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-var-lock\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.278703 master-0 kubenswrapper[8606]: I1204 22:01:00.278668 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.278949 master-0 kubenswrapper[8606]: I1204 22:01:00.278521 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-var-lock\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.279116 master-0 kubenswrapper[8606]: I1204 22:01:00.278761 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:00.593017 master-0 kubenswrapper[8606]: I1204 22:01:00.592668 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 04 22:01:00.617482 master-0 kubenswrapper[8606]: I1204 22:01:00.617400 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kube-api-access\") pod \"installer-1-master-0\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 
22:01:00.734947 master-0 kubenswrapper[8606]: I1204 22:01:00.734879 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:01:03.644555 master-0 kubenswrapper[8606]: I1204 22:01:03.643594 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kube-api-access\") pod \"installer-1-master-0\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:03.681169 master-0 kubenswrapper[8606]: I1204 22:01:03.681092 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:06.282532 master-0 kubenswrapper[8606]: I1204 22:01:06.277726 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 04 22:01:06.336536 master-0 kubenswrapper[8606]: I1204 22:01:06.332559 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj"] Dec 04 22:01:06.336536 master-0 kubenswrapper[8606]: I1204 22:01:06.332801 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" podUID="74b7c644-ad97-4009-aac7-550edabc55ae" containerName="cluster-version-operator" containerID="cri-o://df603d1634cf022b1fd00682ac83f2febe676517e6d4c121582f932bc1ae578d" gracePeriod=130 Dec 04 22:01:06.399102 master-0 kubenswrapper[8606]: I1204 22:01:06.399049 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw"] Dec 04 22:01:06.403247 master-0 kubenswrapper[8606]: I1204 22:01:06.399757 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.409030 master-0 kubenswrapper[8606]: I1204 22:01:06.405555 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 04 22:01:06.423454 master-0 kubenswrapper[8606]: I1204 22:01:06.421290 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 04 22:01:06.423454 master-0 kubenswrapper[8606]: I1204 22:01:06.421478 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 04 22:01:06.423454 master-0 kubenswrapper[8606]: I1204 22:01:06.421728 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 04 22:01:06.431348 master-0 kubenswrapper[8606]: I1204 22:01:06.424325 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw"] Dec 04 22:01:06.537296 master-0 kubenswrapper[8606]: I1204 22:01:06.537164 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x"] Dec 04 22:01:06.538712 master-0 kubenswrapper[8606]: I1204 22:01:06.537842 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.547135 master-0 kubenswrapper[8606]: I1204 22:01:06.547093 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 22:01:06.547671 master-0 kubenswrapper[8606]: I1204 22:01:06.547628 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 22:01:06.547787 master-0 kubenswrapper[8606]: I1204 22:01:06.547755 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 22:01:06.547901 master-0 kubenswrapper[8606]: I1204 22:01:06.547884 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 22:01:06.548013 master-0 kubenswrapper[8606]: I1204 22:01:06.547987 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 22:01:06.548013 master-0 kubenswrapper[8606]: I1204 22:01:06.547994 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 22:01:06.548109 master-0 kubenswrapper[8606]: I1204 22:01:06.547952 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 22:01:06.548185 master-0 kubenswrapper[8606]: I1204 22:01:06.548163 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 22:01:06.558690 master-0 kubenswrapper[8606]: I1204 22:01:06.558652 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw"] Dec 04 22:01:06.559660 master-0 kubenswrapper[8606]: I1204 22:01:06.559643 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.566984 master-0 kubenswrapper[8606]: I1204 22:01:06.566946 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 04 22:01:06.577284 master-0 kubenswrapper[8606]: I1204 22:01:06.577244 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 04 22:01:06.621575 master-0 kubenswrapper[8606]: I1204 22:01:06.620944 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 04 22:01:06.622822 master-0 kubenswrapper[8606]: I1204 22:01:06.622763 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x"] Dec 04 22:01:06.623778 master-0 kubenswrapper[8606]: I1204 22:01:06.623714 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cfhv\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-kube-api-access-2cfhv\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.623845 master-0 kubenswrapper[8606]: I1204 22:01:06.623788 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fmp4\" (UniqueName: \"kubernetes.io/projected/989a73ce-3898-4f65-a437-2c7061f9375f-kube-api-access-7fmp4\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.623912 master-0 kubenswrapper[8606]: I1204 22:01:06.623884 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4czl\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-kube-api-access-r4czl\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.623959 master-0 kubenswrapper[8606]: I1204 22:01:06.623921 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-audit-policies\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.623959 master-0 kubenswrapper[8606]: I1204 22:01:06.623945 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.624022 master-0 kubenswrapper[8606]: I1204 22:01:06.623970 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb0274dc-fac1-41f9-b3e5-77253d851fdf-cache\") pod 
\"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.624022 master-0 kubenswrapper[8606]: I1204 22:01:06.624004 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.624096 master-0 kubenswrapper[8606]: I1204 22:01:06.624026 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.624096 master-0 kubenswrapper[8606]: I1204 22:01:06.624060 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.624163 master-0 kubenswrapper[8606]: I1204 22:01:06.624108 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-encryption-config\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.624163 master-0 kubenswrapper[8606]: I1204 22:01:06.624137 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/989a73ce-3898-4f65-a437-2c7061f9375f-audit-dir\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.624163 master-0 kubenswrapper[8606]: I1204 22:01:06.624161 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.624248 master-0 kubenswrapper[8606]: I1204 22:01:06.624188 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-serving-cert\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.624248 master-0 kubenswrapper[8606]: I1204 22:01:06.624213 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-trusted-ca-bundle\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.624248 master-0 kubenswrapper[8606]: I1204 22:01:06.624236 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.624338 master-0 kubenswrapper[8606]: I1204 22:01:06.624259 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.624338 master-0 kubenswrapper[8606]: I1204 22:01:06.624282 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-client\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.624338 master-0 kubenswrapper[8606]: I1204 22:01:06.624307 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/fb0274dc-fac1-41f9-b3e5-77253d851fdf-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.624338 master-0 kubenswrapper[8606]: I1204 22:01:06.624330 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-serving-ca\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.626483 master-0 kubenswrapper[8606]: I1204 22:01:06.626439 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw"] Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725453 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4czl\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-kube-api-access-r4czl\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725609 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-audit-policies\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") 
" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725668 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725703 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb0274dc-fac1-41f9-b3e5-77253d851fdf-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725761 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725785 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725837 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725938 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-encryption-config\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.725964 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/989a73ce-3898-4f65-a437-2c7061f9375f-audit-dir\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726019 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 
22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726082 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-serving-cert\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726134 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-trusted-ca-bundle\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726159 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726185 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726228 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-client\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726254 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/fb0274dc-fac1-41f9-b3e5-77253d851fdf-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726337 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-serving-ca\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726393 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cfhv\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-kube-api-access-2cfhv\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.729836 master-0 
kubenswrapper[8606]: I1204 22:01:06.726460 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fmp4\" (UniqueName: \"kubernetes.io/projected/989a73ce-3898-4f65-a437-2c7061f9375f-kube-api-access-7fmp4\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.726782 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/989a73ce-3898-4f65-a437-2c7061f9375f-audit-dir\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.727280 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.727988 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-audit-policies\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.728259 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.728323 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-trusted-ca-bundle\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.728801 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.729253 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb0274dc-fac1-41f9-b3e5-77253d851fdf-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.729401 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-serving-ca\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.729569 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.729836 master-0 kubenswrapper[8606]: I1204 22:01:06.729705 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.731856 master-0 kubenswrapper[8606]: I1204 22:01:06.731813 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-client\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.732920 master-0 kubenswrapper[8606]: I1204 22:01:06.732874 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-encryption-config\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.733003 master-0 kubenswrapper[8606]: I1204 22:01:06.732922 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/fb0274dc-fac1-41f9-b3e5-77253d851fdf-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.733515 master-0 kubenswrapper[8606]: I1204 22:01:06.733456 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-serving-cert\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.739213 master-0 kubenswrapper[8606]: I1204 22:01:06.739063 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.746579 master-0 kubenswrapper[8606]: I1204 22:01:06.741974 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs"] Dec 04 22:01:06.746579 master-0 kubenswrapper[8606]: E1204 22:01:06.742425 8606 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" podUID="83e7e3bd-0f10-4b2c-91e5-191a5b21be4f" Dec 04 22:01:06.746579 master-0 kubenswrapper[8606]: I1204 22:01:06.743926 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.755207 master-0 kubenswrapper[8606]: I1204 22:01:06.755169 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4czl\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-kube-api-access-r4czl\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.755489 master-0 kubenswrapper[8606]: I1204 22:01:06.755453 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:06.756003 master-0 kubenswrapper[8606]: I1204 22:01:06.755967 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmp4\" (UniqueName: \"kubernetes.io/projected/989a73ce-3898-4f65-a437-2c7061f9375f-kube-api-access-7fmp4\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.756742 master-0 kubenswrapper[8606]: I1204 22:01:06.756694 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cfhv\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-kube-api-access-2cfhv\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:06.757058 master-0 kubenswrapper[8606]: I1204 22:01:06.756985 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl"] Dec 04 22:01:06.757732 master-0 kubenswrapper[8606]: E1204 22:01:06.757413 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" podUID="de7f9a47-eab7-49ef-8479-2ee6953a4de9" Dec 04 22:01:06.853908 master-0 kubenswrapper[8606]: I1204 22:01:06.853734 8606 generic.go:334] "Generic (PLEG): container finished" podID="74b7c644-ad97-4009-aac7-550edabc55ae" containerID="df603d1634cf022b1fd00682ac83f2febe676517e6d4c121582f932bc1ae578d" exitCode=0 Dec 04 22:01:06.853908 master-0 kubenswrapper[8606]: I1204 22:01:06.853848 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:01:06.853908 master-0 kubenswrapper[8606]: I1204 22:01:06.853862 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" event={"ID":"74b7c644-ad97-4009-aac7-550edabc55ae","Type":"ContainerDied","Data":"df603d1634cf022b1fd00682ac83f2febe676517e6d4c121582f932bc1ae578d"} Dec 04 22:01:06.854245 master-0 kubenswrapper[8606]: I1204 22:01:06.853928 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:01:06.862844 master-0 kubenswrapper[8606]: I1204 22:01:06.862810 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:01:06.868454 master-0 kubenswrapper[8606]: I1204 22:01:06.868419 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:01:06.927910 master-0 kubenswrapper[8606]: I1204 22:01:06.927859 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert\") pod \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " Dec 04 22:01:06.927910 master-0 kubenswrapper[8606]: I1204 22:01:06.927917 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8lls\" (UniqueName: \"kubernetes.io/projected/de7f9a47-eab7-49ef-8479-2ee6953a4de9-kube-api-access-b8lls\") pod \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " Dec 04 22:01:06.928169 master-0 kubenswrapper[8606]: I1204 22:01:06.927996 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-config\") pod \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " Dec 04 22:01:06.928169 master-0 kubenswrapper[8606]: I1204 22:01:06.928044 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-config\") pod \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " Dec 04 22:01:06.928169 master-0 kubenswrapper[8606]: I1204 22:01:06.928078 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r9r6\" (UniqueName: \"kubernetes.io/projected/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-kube-api-access-9r9r6\") pod \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " Dec 04 22:01:06.928169 master-0 kubenswrapper[8606]: I1204 22:01:06.928107 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") pod \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\" (UID: \"de7f9a47-eab7-49ef-8479-2ee6953a4de9\") " Dec 04 22:01:06.928169 master-0 kubenswrapper[8606]: I1204 22:01:06.928137 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-proxy-ca-bundles\") pod \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\" (UID: \"83e7e3bd-0f10-4b2c-91e5-191a5b21be4f\") " Dec 04 22:01:06.928626 master-0 kubenswrapper[8606]: I1204 22:01:06.928579 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-config" (OuterVolumeSpecName: "config") pod "de7f9a47-eab7-49ef-8479-2ee6953a4de9" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:06.928752 master-0 kubenswrapper[8606]: I1204 22:01:06.928703 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-config" (OuterVolumeSpecName: "config") pod "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:06.929056 master-0 kubenswrapper[8606]: I1204 22:01:06.929015 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:06.931848 master-0 kubenswrapper[8606]: I1204 22:01:06.931787 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7f9a47-eab7-49ef-8479-2ee6953a4de9-kube-api-access-b8lls" (OuterVolumeSpecName: "kube-api-access-b8lls") pod "de7f9a47-eab7-49ef-8479-2ee6953a4de9" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9"). InnerVolumeSpecName "kube-api-access-b8lls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:06.932436 master-0 kubenswrapper[8606]: I1204 22:01:06.932401 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-kube-api-access-9r9r6" (OuterVolumeSpecName: "kube-api-access-9r9r6") pod "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f"). InnerVolumeSpecName "kube-api-access-9r9r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:06.932522 master-0 kubenswrapper[8606]: I1204 22:01:06.932448 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f" (UID: "83e7e3bd-0f10-4b2c-91e5-191a5b21be4f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:01:06.942543 master-0 kubenswrapper[8606]: I1204 22:01:06.942445 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "de7f9a47-eab7-49ef-8479-2ee6953a4de9" (UID: "de7f9a47-eab7-49ef-8479-2ee6953a4de9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:01:06.944810 master-0 kubenswrapper[8606]: I1204 22:01:06.944764 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:06.955400 master-0 kubenswrapper[8606]: I1204 22:01:06.955338 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:07.029915 master-0 kubenswrapper[8606]: I1204 22:01:07.029851 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:07.029915 master-0 kubenswrapper[8606]: I1204 22:01:07.029901 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r9r6\" (UniqueName: \"kubernetes.io/projected/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-kube-api-access-9r9r6\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:07.029915 master-0 kubenswrapper[8606]: I1204 22:01:07.029937 8606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de7f9a47-eab7-49ef-8479-2ee6953a4de9-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:07.029915 master-0 kubenswrapper[8606]: I1204 22:01:07.029948 8606 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:07.030675 master-0 kubenswrapper[8606]: I1204 22:01:07.029958 8606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:07.030675 master-0 kubenswrapper[8606]: I1204 22:01:07.029970 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8lls\" (UniqueName: \"kubernetes.io/projected/de7f9a47-eab7-49ef-8479-2ee6953a4de9-kube-api-access-b8lls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:07.030675 master-0 kubenswrapper[8606]: I1204 22:01:07.029983 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:07.857857 master-0 kubenswrapper[8606]: I1204 22:01:07.857781 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl" Dec 04 22:01:07.858754 master-0 kubenswrapper[8606]: I1204 22:01:07.858387 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs" Dec 04 22:01:07.901835 master-0 kubenswrapper[8606]: I1204 22:01:07.901734 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw"] Dec 04 22:01:07.902737 master-0 kubenswrapper[8606]: I1204 22:01:07.902692 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:07.911544 master-0 kubenswrapper[8606]: I1204 22:01:07.905679 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 22:01:07.911544 master-0 kubenswrapper[8606]: I1204 22:01:07.905979 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 22:01:07.911544 master-0 kubenswrapper[8606]: I1204 22:01:07.906120 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 22:01:07.911544 master-0 kubenswrapper[8606]: I1204 22:01:07.906257 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 22:01:07.911544 master-0 kubenswrapper[8606]: I1204 22:01:07.906394 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 22:01:07.911544 master-0 kubenswrapper[8606]: I1204 22:01:07.910153 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl"] Dec 04 22:01:07.919679 master-0 kubenswrapper[8606]: I1204 22:01:07.916287 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bf9b6cb7-nzhsl"] Dec 04 22:01:07.919679 master-0 kubenswrapper[8606]: I1204 22:01:07.919012 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw"] Dec 04 22:01:07.922923 master-0 kubenswrapper[8606]: I1204 22:01:07.922856 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs"] Dec 04 22:01:07.930247 master-0 kubenswrapper[8606]: I1204 22:01:07.930195 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5686ff9f7d-xxnvs"] Dec 04 22:01:07.942334 master-0 kubenswrapper[8606]: I1204 22:01:07.942299 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmt9d\" (UniqueName: \"kubernetes.io/projected/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-kube-api-access-qmt9d\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:07.942574 master-0 kubenswrapper[8606]: I1204 22:01:07.942551 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-serving-cert\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:07.942766 master-0 kubenswrapper[8606]: I1204 22:01:07.942744 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-config\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 
22:01:07.943113 master-0 kubenswrapper[8606]: I1204 22:01:07.943075 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-client-ca\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.044828 master-0 kubenswrapper[8606]: I1204 22:01:08.044758 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-config\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.045070 master-0 kubenswrapper[8606]: I1204 22:01:08.044871 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-client-ca\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.048083 master-0 kubenswrapper[8606]: I1204 22:01:08.047023 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-config\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.048083 master-0 kubenswrapper[8606]: I1204 22:01:08.047713 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-client-ca\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.048083 master-0 kubenswrapper[8606]: I1204 22:01:08.047759 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmt9d\" (UniqueName: \"kubernetes.io/projected/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-kube-api-access-qmt9d\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.048083 master-0 kubenswrapper[8606]: I1204 22:01:08.047794 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-serving-cert\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.048665 master-0 kubenswrapper[8606]: I1204 22:01:08.048452 8606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:08.048665 master-0 kubenswrapper[8606]: I1204 22:01:08.048484 8606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/de7f9a47-eab7-49ef-8479-2ee6953a4de9-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:08.053540 master-0 kubenswrapper[8606]: I1204 22:01:08.053362 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-serving-cert\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.068183 master-0 kubenswrapper[8606]: I1204 22:01:08.068129 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmt9d\" (UniqueName: \"kubernetes.io/projected/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-kube-api-access-qmt9d\") pod \"route-controller-manager-85f9d6bb6-vswnw\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.129292 master-0 kubenswrapper[8606]: I1204 22:01:08.129243 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:01:08.223923 master-0 kubenswrapper[8606]: I1204 22:01:08.223341 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:08.251953 master-0 kubenswrapper[8606]: I1204 22:01:08.251897 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access\") pod \"74b7c644-ad97-4009-aac7-550edabc55ae\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " Dec 04 22:01:08.252134 master-0 kubenswrapper[8606]: I1204 22:01:08.251966 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads\") pod \"74b7c644-ad97-4009-aac7-550edabc55ae\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " Dec 04 22:01:08.252134 master-0 kubenswrapper[8606]: I1204 22:01:08.252012 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca\") pod \"74b7c644-ad97-4009-aac7-550edabc55ae\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " Dec 04 22:01:08.252134 master-0 kubenswrapper[8606]: I1204 22:01:08.252040 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs\") pod \"74b7c644-ad97-4009-aac7-550edabc55ae\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " Dec 04 22:01:08.252134 master-0 kubenswrapper[8606]: I1204 22:01:08.252111 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") pod \"74b7c644-ad97-4009-aac7-550edabc55ae\" (UID: \"74b7c644-ad97-4009-aac7-550edabc55ae\") " Dec 04 22:01:08.252394 master-0 kubenswrapper[8606]: I1204 22:01:08.252279 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:01:08.253370 master-0 kubenswrapper[8606]: I1204 22:01:08.253315 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca" (OuterVolumeSpecName: "service-ca") pod "74b7c644-ad97-4009-aac7-550edabc55ae" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:08.259139 master-0 kubenswrapper[8606]: I1204 22:01:08.257668 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "74b7c644-ad97-4009-aac7-550edabc55ae" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:08.259139 master-0 kubenswrapper[8606]: I1204 22:01:08.257729 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "74b7c644-ad97-4009-aac7-550edabc55ae" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:08.262069 master-0 kubenswrapper[8606]: I1204 22:01:08.262032 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:01:08.262745 master-0 kubenswrapper[8606]: I1204 22:01:08.262680 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "74b7c644-ad97-4009-aac7-550edabc55ae" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:01:08.282305 master-0 kubenswrapper[8606]: I1204 22:01:08.281569 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "74b7c644-ad97-4009-aac7-550edabc55ae" (UID: "74b7c644-ad97-4009-aac7-550edabc55ae"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:08.353405 master-0 kubenswrapper[8606]: I1204 22:01:08.353353 8606 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:08.353405 master-0 kubenswrapper[8606]: I1204 22:01:08.353390 8606 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74b7c644-ad97-4009-aac7-550edabc55ae-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:08.353405 master-0 kubenswrapper[8606]: I1204 22:01:08.353400 8606 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/74b7c644-ad97-4009-aac7-550edabc55ae-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:08.353405 master-0 kubenswrapper[8606]: I1204 22:01:08.353409 8606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74b7c644-ad97-4009-aac7-550edabc55ae-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:08.353405 master-0 kubenswrapper[8606]: I1204 22:01:08.353419 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74b7c644-ad97-4009-aac7-550edabc55ae-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:08.466817 master-0 kubenswrapper[8606]: I1204 22:01:08.466712 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:01:08.586467 master-0 kubenswrapper[8606]: I1204 22:01:08.586421 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 04 22:01:08.586713 master-0 kubenswrapper[8606]: E1204 22:01:08.586650 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74b7c644-ad97-4009-aac7-550edabc55ae" containerName="cluster-version-operator" Dec 04 22:01:08.586713 master-0 kubenswrapper[8606]: I1204 22:01:08.586669 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="74b7c644-ad97-4009-aac7-550edabc55ae" containerName="cluster-version-operator" Dec 04 22:01:08.586860 master-0 kubenswrapper[8606]: I1204 22:01:08.586754 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="74b7c644-ad97-4009-aac7-550edabc55ae" containerName="cluster-version-operator" Dec 04 22:01:08.588086 master-0 kubenswrapper[8606]: I1204 22:01:08.587139 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.599867 master-0 kubenswrapper[8606]: I1204 22:01:08.599822 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 04 22:01:08.623490 master-0 kubenswrapper[8606]: I1204 22:01:08.623405 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-8db7f8d79-rlqbz"] Dec 04 22:01:08.665169 master-0 kubenswrapper[8606]: I1204 22:01:08.664919 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-var-lock\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.665169 master-0 kubenswrapper[8606]: I1204 22:01:08.665004 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.665169 master-0 kubenswrapper[8606]: I1204 22:01:08.665024 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.715545 master-0 kubenswrapper[8606]: I1204 22:01:08.712293 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw"] Dec 04 22:01:08.715545 master-0 kubenswrapper[8606]: I1204 22:01:08.713654 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 04 22:01:08.743190 master-0 kubenswrapper[8606]: I1204 22:01:08.727014 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 04 22:01:08.743190 master-0 kubenswrapper[8606]: I1204 22:01:08.728261 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw"] Dec 04 22:01:08.743190 master-0 kubenswrapper[8606]: W1204 22:01:08.733287 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0b9d1530_9fd8_4c69_8ed7_62b7af1f4eab.slice/crio-93283789117b0576a3fa7e5b0b96c59c1e9ecb75d010ca6d14ffe858c88069c2 WatchSource:0}: Error finding container 93283789117b0576a3fa7e5b0b96c59c1e9ecb75d010ca6d14ffe858c88069c2: Status 404 returned error can't find the container with id 93283789117b0576a3fa7e5b0b96c59c1e9ecb75d010ca6d14ffe858c88069c2 Dec 04 22:01:08.765914 master-0 kubenswrapper[8606]: I1204 22:01:08.765875 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-var-lock\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.766032 master-0 kubenswrapper[8606]: I1204 22:01:08.765936 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.766032 master-0 kubenswrapper[8606]: I1204 22:01:08.765962 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.766143 master-0 kubenswrapper[8606]: I1204 22:01:08.766053 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.766143 master-0 kubenswrapper[8606]: I1204 22:01:08.766089 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-var-lock\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.797459 master-0 kubenswrapper[8606]: I1204 22:01:08.796518 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.935529 master-0 kubenswrapper[8606]: I1204 22:01:08.930101 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:08.950257 master-0 kubenswrapper[8606]: I1204 22:01:08.948365 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" event={"ID":"35821f48-b000-4915-847f-a739b6efc5ee","Type":"ContainerStarted","Data":"50388f7f0e80879ab901f5d4bca3a6dc9ea1a39437e9578943365b4294d8b25d"} Dec 04 22:01:08.951045 master-0 kubenswrapper[8606]: I1204 22:01:08.951015 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" event={"ID":"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e","Type":"ContainerStarted","Data":"c66dd95c0f8de04a122a0868d06a2262d196a460f011d7a83f8a847ec862a494"} Dec 04 22:01:08.967305 master-0 kubenswrapper[8606]: I1204 22:01:08.963314 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw"] Dec 04 22:01:08.967305 master-0 kubenswrapper[8606]: I1204 22:01:08.963615 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x"] Dec 04 22:01:09.021616 master-0 kubenswrapper[8606]: I1204 22:01:09.021494 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab","Type":"ContainerStarted","Data":"93283789117b0576a3fa7e5b0b96c59c1e9ecb75d010ca6d14ffe858c88069c2"} Dec 04 22:01:09.028611 master-0 kubenswrapper[8606]: I1204 22:01:09.028573 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5"] Dec 04 22:01:09.028872 master-0 kubenswrapper[8606]: I1204 22:01:09.028835 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerStarted","Data":"bad146c5ce315f7f5070081135587df7a077e864def57e2c38a773560069cf17"} Dec 04 22:01:09.031978 master-0 kubenswrapper[8606]: I1204 22:01:09.031944 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:01:09.039395 master-0 kubenswrapper[8606]: I1204 22:01:09.037205 8606 patch_prober.go:28] interesting pod/marketplace-operator-f797b99b6-m9m4h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 04 22:01:09.039395 master-0 kubenswrapper[8606]: I1204 22:01:09.037252 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" podUID="c6a5d14d-0409-4024-b0a8-200fa2594185" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 04 22:01:09.039809 master-0 kubenswrapper[8606]: I1204 22:01:09.039624 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" event={"ID":"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141","Type":"ContainerStarted","Data":"7d02c679c1b193ea195c44f77ce5059c11b500930cda814d106399c1a88668f1"} Dec 04 22:01:09.041924 master-0 kubenswrapper[8606]: I1204 22:01:09.041897 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerStarted","Data":"fe41c35d4fc12b10f7c0380ded0175f838a7cb9e3aad0aa5a08446be17e65126"} Dec 04 22:01:09.049029 master-0 kubenswrapper[8606]: I1204 22:01:09.048920 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" event={"ID":"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf","Type":"ContainerStarted","Data":"7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587"} Dec 04 22:01:09.073979 master-0 kubenswrapper[8606]: I1204 22:01:09.073922 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8","Type":"ContainerStarted","Data":"2a1fefd29f05b97e4c5ee27b34767e0d80a32d26202bf1b9a739ea0fdd39de79"} Dec 04 22:01:09.084425 master-0 kubenswrapper[8606]: I1204 22:01:09.084366 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerStarted","Data":"5af5cfe128eaa351f012440567883f2b0f5ad3e1b0e50ea2b67166561450dd28"} Dec 04 22:01:09.129563 master-0 kubenswrapper[8606]: I1204 22:01:09.129435 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"7d50a2402c9a263e61e9e85f0a1f6b2e94c325a730cd34ce03c35d10609073b3"} Dec 04 22:01:09.129563 master-0 kubenswrapper[8606]: I1204 22:01:09.129535 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"f32a0325771ce40043e2990b6e044b2e673986f92037baf7df71e61135c7bd82"} Dec 04 22:01:09.162866 master-0 kubenswrapper[8606]: I1204 22:01:09.162580 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" event={"ID":"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5","Type":"ContainerStarted","Data":"8633710bdfee80e425263a29ac50ccbbe837adb739d1f5f7dd89215987ff9bbf"} Dec 04 22:01:09.167404 master-0 kubenswrapper[8606]: I1204 22:01:09.166181 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" event={"ID":"74b7c644-ad97-4009-aac7-550edabc55ae","Type":"ContainerDied","Data":"9ba3114789d579136ad84d27a499346b11ba547f0a0d6e213113615a12220157"} Dec 04 22:01:09.167404 master-0 kubenswrapper[8606]: I1204 22:01:09.166224 8606 scope.go:117] "RemoveContainer" containerID="df603d1634cf022b1fd00682ac83f2febe676517e6d4c121582f932bc1ae578d" Dec 04 22:01:09.167404 master-0 kubenswrapper[8606]: I1204 22:01:09.166348 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj" Dec 04 22:01:09.207719 master-0 kubenswrapper[8606]: I1204 22:01:09.206428 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9pfhj" event={"ID":"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa","Type":"ContainerStarted","Data":"5c975667384a07097695d15e2e30aab1bd6d4d9f872c5d99e129896563421798"} Dec 04 22:01:09.276532 master-0 kubenswrapper[8606]: I1204 22:01:09.276412 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj"] Dec 04 22:01:09.278797 master-0 kubenswrapper[8606]: I1204 22:01:09.278384 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-77dfcc565f-2smgj"] Dec 04 22:01:09.382617 master-0 kubenswrapper[8606]: I1204 22:01:09.382006 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5"] Dec 04 22:01:09.383150 master-0 kubenswrapper[8606]: I1204 22:01:09.383112 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.387313 master-0 kubenswrapper[8606]: I1204 22:01:09.387032 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 22:01:09.387478 master-0 kubenswrapper[8606]: I1204 22:01:09.387345 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 22:01:09.391316 master-0 kubenswrapper[8606]: I1204 22:01:09.387703 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 22:01:09.405577 master-0 kubenswrapper[8606]: I1204 22:01:09.405482 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74b7c644-ad97-4009-aac7-550edabc55ae" path="/var/lib/kubelet/pods/74b7c644-ad97-4009-aac7-550edabc55ae/volumes" Dec 04 22:01:09.407761 master-0 kubenswrapper[8606]: I1204 22:01:09.407707 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e7e3bd-0f10-4b2c-91e5-191a5b21be4f" path="/var/lib/kubelet/pods/83e7e3bd-0f10-4b2c-91e5-191a5b21be4f/volumes" Dec 04 22:01:09.408445 master-0 kubenswrapper[8606]: I1204 22:01:09.408411 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de7f9a47-eab7-49ef-8479-2ee6953a4de9" path="/var/lib/kubelet/pods/de7f9a47-eab7-49ef-8479-2ee6953a4de9/volumes" Dec 04 22:01:09.483732 master-0 kubenswrapper[8606]: I1204 22:01:09.483359 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vvs9c"] Dec 04 22:01:09.484440 master-0 kubenswrapper[8606]: I1204 22:01:09.484405 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.487862 master-0 kubenswrapper[8606]: I1204 22:01:09.487816 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 22:01:09.490462 master-0 kubenswrapper[8606]: I1204 22:01:09.490417 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vvs9c"] Dec 04 22:01:09.493158 master-0 kubenswrapper[8606]: I1204 22:01:09.493130 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 22:01:09.494438 master-0 kubenswrapper[8606]: I1204 22:01:09.494419 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 22:01:09.495827 master-0 kubenswrapper[8606]: I1204 22:01:09.495786 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 22:01:09.508124 master-0 kubenswrapper[8606]: I1204 22:01:09.508084 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8636bd7-fa9e-44b9-82df-9d37b398736d-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.508253 master-0 kubenswrapper[8606]: I1204 22:01:09.508173 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.508253 master-0 kubenswrapper[8606]: I1204 22:01:09.508228 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.508517 master-0 kubenswrapper[8606]: I1204 22:01:09.508435 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8636bd7-fa9e-44b9-82df-9d37b398736d-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.508564 master-0 kubenswrapper[8606]: I1204 22:01:09.508551 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8636bd7-fa9e-44b9-82df-9d37b398736d-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.613192 master-0 kubenswrapper[8606]: I1204 22:01:09.613114 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls\") pod 
\"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.613192 master-0 kubenswrapper[8606]: I1204 22:01:09.613196 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8636bd7-fa9e-44b9-82df-9d37b398736d-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.613454 master-0 kubenswrapper[8606]: I1204 22:01:09.613246 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8636bd7-fa9e-44b9-82df-9d37b398736d-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.613454 master-0 kubenswrapper[8606]: I1204 22:01:09.613290 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c2d3b8-41c0-4531-b770-57b7c567fe30-config-volume\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.613454 master-0 kubenswrapper[8606]: I1204 22:01:09.613331 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.613454 master-0 kubenswrapper[8606]: I1204 22:01:09.613363 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vpbl\" (UniqueName: \"kubernetes.io/projected/a5c2d3b8-41c0-4531-b770-57b7c567fe30-kube-api-access-5vpbl\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.613454 master-0 kubenswrapper[8606]: I1204 22:01:09.613393 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8636bd7-fa9e-44b9-82df-9d37b398736d-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.613618 master-0 kubenswrapper[8606]: I1204 22:01:09.613470 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.613618 master-0 kubenswrapper[8606]: I1204 22:01:09.613580 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " 
pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.615626 master-0 kubenswrapper[8606]: I1204 22:01:09.614821 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.617314 master-0 kubenswrapper[8606]: I1204 22:01:09.615911 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8636bd7-fa9e-44b9-82df-9d37b398736d-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.617482 master-0 kubenswrapper[8606]: I1204 22:01:09.617441 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 04 22:01:09.618646 master-0 kubenswrapper[8606]: I1204 22:01:09.618598 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8636bd7-fa9e-44b9-82df-9d37b398736d-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.639734 master-0 kubenswrapper[8606]: I1204 22:01:09.639656 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8636bd7-fa9e-44b9-82df-9d37b398736d-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.711199 master-0 kubenswrapper[8606]: I1204 22:01:09.711155 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:01:09.714299 master-0 kubenswrapper[8606]: I1204 22:01:09.714257 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c2d3b8-41c0-4531-b770-57b7c567fe30-config-volume\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.714370 master-0 kubenswrapper[8606]: I1204 22:01:09.714319 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vpbl\" (UniqueName: \"kubernetes.io/projected/a5c2d3b8-41c0-4531-b770-57b7c567fe30-kube-api-access-5vpbl\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.714430 master-0 kubenswrapper[8606]: I1204 22:01:09.714406 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.714733 master-0 kubenswrapper[8606]: E1204 22:01:09.714688 8606 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Dec 04 22:01:09.714811 master-0 kubenswrapper[8606]: E1204 22:01:09.714779 8606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls podName:a5c2d3b8-41c0-4531-b770-57b7c567fe30 nodeName:}" failed. No retries permitted until 2025-12-04 22:01:10.214751116 +0000 UTC m=+35.025053331 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls") pod "dns-default-vvs9c" (UID: "a5c2d3b8-41c0-4531-b770-57b7c567fe30") : secret "dns-default-metrics-tls" not found Dec 04 22:01:09.715123 master-0 kubenswrapper[8606]: I1204 22:01:09.715080 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c2d3b8-41c0-4531-b770-57b7c567fe30-config-volume\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.735944 master-0 kubenswrapper[8606]: I1204 22:01:09.735904 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vpbl\" (UniqueName: \"kubernetes.io/projected/a5c2d3b8-41c0-4531-b770-57b7c567fe30-kube-api-access-5vpbl\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:09.747227 master-0 kubenswrapper[8606]: W1204 22:01:09.747171 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8636bd7_fa9e_44b9_82df_9d37b398736d.slice/crio-cf7bfab376e6cd33db16a88818ab6aaf3d6cee23a9706c34952502952f7ad2f6 WatchSource:0}: Error finding container cf7bfab376e6cd33db16a88818ab6aaf3d6cee23a9706c34952502952f7ad2f6: Status 404 returned error can't find the container with id cf7bfab376e6cd33db16a88818ab6aaf3d6cee23a9706c34952502952f7ad2f6 Dec 04 22:01:09.853129 master-0 kubenswrapper[8606]: I1204 22:01:09.851298 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6mgn6"] Dec 04 22:01:09.853129 master-0 kubenswrapper[8606]: I1204 22:01:09.852118 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:01:10.020434 master-0 kubenswrapper[8606]: I1204 22:01:10.020376 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd5zx\" (UniqueName: \"kubernetes.io/projected/c2279404-fa75-4de2-a302-d7b15ead5232-kube-api-access-dd5zx\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:01:10.020434 master-0 kubenswrapper[8606]: I1204 22:01:10.020449 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2279404-fa75-4de2-a302-d7b15ead5232-hosts-file\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:01:10.126522 master-0 kubenswrapper[8606]: I1204 22:01:10.122018 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5zx\" (UniqueName: \"kubernetes.io/projected/c2279404-fa75-4de2-a302-d7b15ead5232-kube-api-access-dd5zx\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:01:10.126522 master-0 kubenswrapper[8606]: I1204 22:01:10.122103 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2279404-fa75-4de2-a302-d7b15ead5232-hosts-file\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:01:10.126522 master-0 kubenswrapper[8606]: I1204 22:01:10.122216 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2279404-fa75-4de2-a302-d7b15ead5232-hosts-file\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:01:10.153786 master-0 kubenswrapper[8606]: I1204 22:01:10.151643 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5zx\" (UniqueName: \"kubernetes.io/projected/c2279404-fa75-4de2-a302-d7b15ead5232-kube-api-access-dd5zx\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:01:10.204424 master-0 kubenswrapper[8606]: I1204 22:01:10.201754 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:01:10.216480 master-0 kubenswrapper[8606]: I1204 22:01:10.216286 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" event={"ID":"e81bd90d-2cb1-4f15-b85c-e141f2595f4e","Type":"ContainerStarted","Data":"1e2196dfe7da86556386896e68c289757b5874df82ea7b1910a060ddbec879ba"} Dec 04 22:01:10.218800 master-0 kubenswrapper[8606]: I1204 22:01:10.218766 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" event={"ID":"989a73ce-3898-4f65-a437-2c7061f9375f","Type":"ContainerStarted","Data":"cf87cc00ba78c6e3cc8680200b1afa8d433e342bc7744db35e1f64b4a3e5a078"} Dec 04 22:01:10.223074 master-0 kubenswrapper[8606]: I1204 22:01:10.223045 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:10.229705 master-0 kubenswrapper[8606]: I1204 22:01:10.229662 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" event={"ID":"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf","Type":"ContainerStarted","Data":"815be3bb78086065271ecb4d4b9b7c7f847598761d2c9ee58e7b745732e5f4f4"} Dec 04 22:01:10.229831 master-0 kubenswrapper[8606]: I1204 22:01:10.229804 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:10.237042 master-0 kubenswrapper[8606]: I1204 22:01:10.236754 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"b977a2cf-4e95-4456-957d-b1ba05c0d1ff","Type":"ContainerStarted","Data":"8772a454066677ab94d735965a9fe39b0c5a1577a466faea547f6edb43aab31c"} Dec 04 22:01:10.237042 master-0 kubenswrapper[8606]: I1204 22:01:10.236818 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"b977a2cf-4e95-4456-957d-b1ba05c0d1ff","Type":"ContainerStarted","Data":"6285458f93d5be2507cf69523363385cf66539c4aef333534749d36e6229ace7"} Dec 04 22:01:10.237711 master-0 kubenswrapper[8606]: W1204 22:01:10.237677 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2279404_fa75_4de2_a302_d7b15ead5232.slice/crio-21f00834ca375d484385e21696e2d8aa4483b916d5393969f0246cfd9dc0471a WatchSource:0}: Error finding container 21f00834ca375d484385e21696e2d8aa4483b916d5393969f0246cfd9dc0471a: Status 404 returned error can't find the container with id 21f00834ca375d484385e21696e2d8aa4483b916d5393969f0246cfd9dc0471a Dec 04 22:01:10.242425 master-0 kubenswrapper[8606]: I1204 22:01:10.239802 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8/installer/0.log" Dec 04 22:01:10.242425 master-0 kubenswrapper[8606]: I1204 22:01:10.239887 8606 generic.go:334] "Generic (PLEG): container finished" podID="eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8" 
containerID="1a998c31642e8c9a57019706713574a958fedef13633c495742d1e893d4d1bc3" exitCode=1 Dec 04 22:01:10.242425 master-0 kubenswrapper[8606]: I1204 22:01:10.240130 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8","Type":"ContainerDied","Data":"1a998c31642e8c9a57019706713574a958fedef13633c495742d1e893d4d1bc3"} Dec 04 22:01:10.262728 master-0 kubenswrapper[8606]: I1204 22:01:10.260151 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" event={"ID":"a8636bd7-fa9e-44b9-82df-9d37b398736d","Type":"ContainerStarted","Data":"74597696ddcec56d10e68ca1d29cef5d1bfa40762646f7e9dc8729a8c66636fd"} Dec 04 22:01:10.262728 master-0 kubenswrapper[8606]: I1204 22:01:10.260238 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" event={"ID":"a8636bd7-fa9e-44b9-82df-9d37b398736d","Type":"ContainerStarted","Data":"cf7bfab376e6cd33db16a88818ab6aaf3d6cee23a9706c34952502952f7ad2f6"} Dec 04 22:01:10.269051 master-0 kubenswrapper[8606]: I1204 22:01:10.267019 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" event={"ID":"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e","Type":"ContainerStarted","Data":"62384a8d67b4be58bf2e7f4ec5f4cd98b0f7dbf3c8121990f105d429db9c0a66"} Dec 04 22:01:10.293529 master-0 kubenswrapper[8606]: I1204 22:01:10.293293 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=2.293270799 podStartE2EDuration="2.293270799s" podCreationTimestamp="2025-12-04 22:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:10.290756838 +0000 UTC m=+35.101059073" watchObservedRunningTime="2025-12-04 22:01:10.293270799 +0000 UTC m=+35.103573014" Dec 04 22:01:10.329539 master-0 kubenswrapper[8606]: I1204 22:01:10.322189 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab","Type":"ContainerStarted","Data":"4c4fa6995a939a53e102917b86fbd0f10791e85887df9e375f44a27329f6b171"} Dec 04 22:01:10.329539 master-0 kubenswrapper[8606]: I1204 22:01:10.328181 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerStarted","Data":"0744bc69885cb8b27025aac2761602ed1dd53a9e628a283b5e3ef1e171da58fa"} Dec 04 22:01:10.329539 master-0 kubenswrapper[8606]: I1204 22:01:10.328260 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerStarted","Data":"4c52307b147fc1f96631f9272147cbdbb3ffe8d871369692fc386dc96586c86f"} Dec 04 22:01:10.338539 master-0 kubenswrapper[8606]: I1204 22:01:10.338240 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9pfhj" event={"ID":"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa","Type":"ContainerStarted","Data":"b75befb683524bef4216c78e58648138696c5c0ab8c9682dcb3f075c7c87b206"} Dec 04 22:01:10.347857 master-0 kubenswrapper[8606]: I1204 22:01:10.340050 8606 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" podStartSLOduration=1.3399944590000001 podStartE2EDuration="1.339994459s" podCreationTimestamp="2025-12-04 22:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:10.334623947 +0000 UTC m=+35.144926162" watchObservedRunningTime="2025-12-04 22:01:10.339994459 +0000 UTC m=+35.150296674" Dec 04 22:01:10.347857 master-0 kubenswrapper[8606]: I1204 22:01:10.340960 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw"] Dec 04 22:01:10.347857 master-0 kubenswrapper[8606]: I1204 22:01:10.341744 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.347857 master-0 kubenswrapper[8606]: I1204 22:01:10.344760 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 22:01:10.347857 master-0 kubenswrapper[8606]: I1204 22:01:10.345459 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 22:01:10.347857 master-0 kubenswrapper[8606]: I1204 22:01:10.345795 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 22:01:10.347857 master-0 kubenswrapper[8606]: I1204 22:01:10.346427 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 22:01:10.348369 master-0 kubenswrapper[8606]: I1204 22:01:10.347976 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 22:01:10.358434 master-0 kubenswrapper[8606]: I1204 22:01:10.350660 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw"] Dec 04 22:01:10.366552 master-0 kubenswrapper[8606]: I1204 22:01:10.363645 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 22:01:10.367835 master-0 kubenswrapper[8606]: I1204 22:01:10.367155 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerStarted","Data":"66b2e3479fe940e234e57684065e0fce7888af45d6710422bc86d256ccfc2307"} Dec 04 22:01:10.367835 master-0 kubenswrapper[8606]: I1204 22:01:10.367213 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerStarted","Data":"037f05faa0b4635e20f5127ded6c5b63a2893aa9715387918fd80e11092dcfbb"} Dec 04 22:01:10.367835 master-0 kubenswrapper[8606]: I1204 22:01:10.367294 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:10.403696 master-0 kubenswrapper[8606]: I1204 22:01:10.372730 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" 
event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerStarted","Data":"5b7837bb8d893076191e798bbe6f7756d536495c527346610e4cc8ec29e29fe5"} Dec 04 22:01:10.403696 master-0 kubenswrapper[8606]: I1204 22:01:10.372768 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerStarted","Data":"6944a3b8b194a602e8e8da3a2a8db2470b7b88d403a9cce77f1224cd0d653cf1"} Dec 04 22:01:10.403696 master-0 kubenswrapper[8606]: I1204 22:01:10.372786 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:10.403696 master-0 kubenswrapper[8606]: I1204 22:01:10.379608 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:01:10.403696 master-0 kubenswrapper[8606]: I1204 22:01:10.390272 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=12.390247669 podStartE2EDuration="12.390247669s" podCreationTimestamp="2025-12-04 22:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:10.389718453 +0000 UTC m=+35.200020668" watchObservedRunningTime="2025-12-04 22:01:10.390247669 +0000 UTC m=+35.200549884" Dec 04 22:01:10.418455 master-0 kubenswrapper[8606]: I1204 22:01:10.414738 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" podStartSLOduration=4.413796593 podStartE2EDuration="4.413796593s" podCreationTimestamp="2025-12-04 22:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:10.413068553 +0000 UTC m=+35.223370788" watchObservedRunningTime="2025-12-04 22:01:10.413796593 +0000 UTC m=+35.224098808" Dec 04 22:01:10.418455 master-0 kubenswrapper[8606]: I1204 22:01:10.418321 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:10.426287 master-0 kubenswrapper[8606]: I1204 22:01:10.425709 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7xcp\" (UniqueName: \"kubernetes.io/projected/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-kube-api-access-x7xcp\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.426287 master-0 kubenswrapper[8606]: I1204 22:01:10.425760 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-proxy-ca-bundles\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.426287 master-0 kubenswrapper[8606]: I1204 22:01:10.425854 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-serving-cert\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.426287 master-0 kubenswrapper[8606]: I1204 22:01:10.425893 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-client-ca\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.426287 master-0 kubenswrapper[8606]: I1204 22:01:10.425929 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-config\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.485036 master-0 kubenswrapper[8606]: I1204 22:01:10.484678 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" podStartSLOduration=4.484651605 podStartE2EDuration="4.484651605s" podCreationTimestamp="2025-12-04 22:01:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:10.483345198 +0000 UTC m=+35.293647413" watchObservedRunningTime="2025-12-04 22:01:10.484651605 +0000 UTC m=+35.294953820" Dec 04 22:01:10.538064 master-0 kubenswrapper[8606]: I1204 22:01:10.528084 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-proxy-ca-bundles\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.538064 master-0 kubenswrapper[8606]: I1204 22:01:10.528213 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-serving-cert\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.538064 master-0 kubenswrapper[8606]: I1204 22:01:10.528259 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-client-ca\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.538064 master-0 kubenswrapper[8606]: I1204 22:01:10.528295 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-config\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.538064 master-0 kubenswrapper[8606]: I1204 22:01:10.528421 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7xcp\" (UniqueName: \"kubernetes.io/projected/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-kube-api-access-x7xcp\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.541118 master-0 kubenswrapper[8606]: I1204 22:01:10.539495 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-proxy-ca-bundles\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.541118 master-0 kubenswrapper[8606]: I1204 22:01:10.540749 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-config\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.542726 master-0 kubenswrapper[8606]: I1204 22:01:10.542525 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-serving-cert\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.544284 master-0 kubenswrapper[8606]: I1204 22:01:10.542999 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-client-ca\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.603888 master-0 kubenswrapper[8606]: I1204 22:01:10.603807 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7xcp\" (UniqueName: \"kubernetes.io/projected/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-kube-api-access-x7xcp\") pod \"controller-manager-5fcd8fbcb8-dhxmw\" (UID: 
\"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.677280 master-0 kubenswrapper[8606]: I1204 22:01:10.677222 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:10.687866 master-0 kubenswrapper[8606]: I1204 22:01:10.687205 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8/installer/0.log" Dec 04 22:01:10.688061 master-0 kubenswrapper[8606]: I1204 22:01:10.687982 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:01:10.832942 master-0 kubenswrapper[8606]: I1204 22:01:10.832867 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kubelet-dir\") pod \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " Dec 04 22:01:10.833262 master-0 kubenswrapper[8606]: I1204 22:01:10.833033 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kube-api-access\") pod \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " Dec 04 22:01:10.833262 master-0 kubenswrapper[8606]: I1204 22:01:10.833073 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-var-lock\") pod \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\" (UID: \"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8\") " Dec 04 22:01:10.833262 master-0 kubenswrapper[8606]: I1204 22:01:10.833052 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8" (UID: "eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:10.833414 master-0 kubenswrapper[8606]: I1204 22:01:10.833266 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:10.833414 master-0 kubenswrapper[8606]: I1204 22:01:10.833324 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-var-lock" (OuterVolumeSpecName: "var-lock") pod "eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8" (UID: "eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:10.856528 master-0 kubenswrapper[8606]: I1204 22:01:10.841712 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8" (UID: "eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:10.934470 master-0 kubenswrapper[8606]: I1204 22:01:10.934419 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:10.934470 master-0 kubenswrapper[8606]: I1204 22:01:10.934459 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:10.968308 master-0 kubenswrapper[8606]: I1204 22:01:10.968246 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vvs9c"] Dec 04 22:01:10.973072 master-0 kubenswrapper[8606]: W1204 22:01:10.973027 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5c2d3b8_41c0_4531_b770_57b7c567fe30.slice/crio-9c65ccceea882cd7d898803d924bbc08d4f3c8fef9c388b0db802fee0be2d9fc WatchSource:0}: Error finding container 9c65ccceea882cd7d898803d924bbc08d4f3c8fef9c388b0db802fee0be2d9fc: Status 404 returned error can't find the container with id 9c65ccceea882cd7d898803d924bbc08d4f3c8fef9c388b0db802fee0be2d9fc Dec 04 22:01:11.186656 master-0 kubenswrapper[8606]: I1204 22:01:11.186588 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw"] Dec 04 22:01:11.386546 master-0 kubenswrapper[8606]: I1204 22:01:11.381542 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8/installer/0.log" Dec 04 22:01:11.386546 master-0 kubenswrapper[8606]: I1204 22:01:11.381945 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 04 22:01:11.386546 master-0 kubenswrapper[8606]: I1204 22:01:11.385060 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8","Type":"ContainerDied","Data":"2a1fefd29f05b97e4c5ee27b34767e0d80a32d26202bf1b9a739ea0fdd39de79"} Dec 04 22:01:11.386546 master-0 kubenswrapper[8606]: I1204 22:01:11.385133 8606 scope.go:117] "RemoveContainer" containerID="1a998c31642e8c9a57019706713574a958fedef13633c495742d1e893d4d1bc3" Dec 04 22:01:11.390888 master-0 kubenswrapper[8606]: I1204 22:01:11.390466 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vvs9c" event={"ID":"a5c2d3b8-41c0-4531-b770-57b7c567fe30","Type":"ContainerStarted","Data":"9c65ccceea882cd7d898803d924bbc08d4f3c8fef9c388b0db802fee0be2d9fc"} Dec 04 22:01:11.409756 master-0 kubenswrapper[8606]: I1204 22:01:11.409684 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6mgn6" event={"ID":"c2279404-fa75-4de2-a302-d7b15ead5232","Type":"ContainerStarted","Data":"5b4129a1c8cb6bbffa14cdb3068fee5202673442596555d1d242e01740dd7ca6"} Dec 04 22:01:11.409756 master-0 kubenswrapper[8606]: I1204 22:01:11.409727 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6mgn6" event={"ID":"c2279404-fa75-4de2-a302-d7b15ead5232","Type":"ContainerStarted","Data":"21f00834ca375d484385e21696e2d8aa4483b916d5393969f0246cfd9dc0471a"} Dec 04 22:01:11.409756 master-0 kubenswrapper[8606]: I1204 22:01:11.409740 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" event={"ID":"92572d6d-d7fc-4986-8b8e-df534cfa6ebb","Type":"ContainerStarted","Data":"482cd59136431bdb5a93c083c18b3f1c52f50b62465cb16d1a188f38f7a152a1"} Dec 04 22:01:11.421484 master-0 kubenswrapper[8606]: I1204 22:01:11.419684 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 04 22:01:11.422654 master-0 kubenswrapper[8606]: I1204 22:01:11.422193 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 04 22:01:11.436977 master-0 kubenswrapper[8606]: I1204 22:01:11.436891 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6mgn6" podStartSLOduration=2.436863723 podStartE2EDuration="2.436863723s" podCreationTimestamp="2025-12-04 22:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:11.435709061 +0000 UTC m=+36.246011266" watchObservedRunningTime="2025-12-04 22:01:11.436863723 +0000 UTC m=+36.247165928" Dec 04 22:01:13.404716 master-0 kubenswrapper[8606]: I1204 22:01:13.404650 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8" path="/var/lib/kubelet/pods/eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8/volumes" Dec 04 22:01:13.407025 master-0 kubenswrapper[8606]: I1204 22:01:13.407008 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 04 22:01:13.408017 master-0 kubenswrapper[8606]: E1204 22:01:13.408001 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8" containerName="installer" Dec 04 22:01:13.408109 master-0 
kubenswrapper[8606]: I1204 22:01:13.408090 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8" containerName="installer" Dec 04 22:01:13.408277 master-0 kubenswrapper[8606]: I1204 22:01:13.408266 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb280caa-f1ba-4f2e-ac9e-3ecd94f627f8" containerName="installer" Dec 04 22:01:13.408909 master-0 kubenswrapper[8606]: I1204 22:01:13.408896 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 04 22:01:13.409109 master-0 kubenswrapper[8606]: I1204 22:01:13.409097 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.412422 master-0 kubenswrapper[8606]: I1204 22:01:13.411282 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 22:01:13.580097 master-0 kubenswrapper[8606]: I1204 22:01:13.580001 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.580097 master-0 kubenswrapper[8606]: I1204 22:01:13.580098 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-var-lock\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.580565 master-0 kubenswrapper[8606]: I1204 22:01:13.580148 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.681643 master-0 kubenswrapper[8606]: I1204 22:01:13.681474 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.681643 master-0 kubenswrapper[8606]: I1204 22:01:13.681562 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-var-lock\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.681643 master-0 kubenswrapper[8606]: I1204 22:01:13.681595 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.682204 master-0 kubenswrapper[8606]: I1204 22:01:13.682012 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.682204 master-0 kubenswrapper[8606]: I1204 22:01:13.682117 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-var-lock\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.702214 master-0 kubenswrapper[8606]: I1204 22:01:13.702166 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:13.730766 master-0 kubenswrapper[8606]: I1204 22:01:13.730271 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:01:15.044116 master-0 kubenswrapper[8606]: I1204 22:01:15.044008 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 04 22:01:15.444197 master-0 kubenswrapper[8606]: I1204 22:01:15.444121 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" event={"ID":"e81bd90d-2cb1-4f15-b85c-e141f2595f4e","Type":"ContainerStarted","Data":"13b020beb274eb5d9a2c536487e04c6e58c1dec75ff79002129442fa1045ca3e"} Dec 04 22:01:15.444197 master-0 kubenswrapper[8606]: I1204 22:01:15.444198 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:15.450156 master-0 kubenswrapper[8606]: I1204 22:01:15.450098 8606 generic.go:334] "Generic (PLEG): container finished" podID="989a73ce-3898-4f65-a437-2c7061f9375f" containerID="e1ab85fa23f372e6c12039f42a8215b4ecb7099a306302bdcd4c1624786fb3f7" exitCode=0 Dec 04 22:01:15.450325 master-0 kubenswrapper[8606]: I1204 22:01:15.450204 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" event={"ID":"989a73ce-3898-4f65-a437-2c7061f9375f","Type":"ContainerDied","Data":"e1ab85fa23f372e6c12039f42a8215b4ecb7099a306302bdcd4c1624786fb3f7"} Dec 04 22:01:15.453187 master-0 kubenswrapper[8606]: I1204 22:01:15.453151 8606 generic.go:334] "Generic (PLEG): container finished" podID="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" containerID="2e48ce38bedd0ef286f4eb2d0319a994f1a61d767e4cb03996c6448334d88c07" exitCode=0 Dec 04 22:01:15.453270 master-0 kubenswrapper[8606]: I1204 22:01:15.453158 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:15.453307 master-0 kubenswrapper[8606]: I1204 22:01:15.453273 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" event={"ID":"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141","Type":"ContainerDied","Data":"2e48ce38bedd0ef286f4eb2d0319a994f1a61d767e4cb03996c6448334d88c07"} Dec 04 22:01:15.455083 master-0 kubenswrapper[8606]: I1204 22:01:15.454921 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-vvs9c" event={"ID":"a5c2d3b8-41c0-4531-b770-57b7c567fe30","Type":"ContainerStarted","Data":"45450357cf840130f50887c8a0378cc1abdb04813e3dcc85b0c07540beaa459f"} Dec 04 22:01:15.592538 master-0 kubenswrapper[8606]: I1204 22:01:15.592436 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" podStartSLOduration=4.024353853 podStartE2EDuration="9.59241004s" podCreationTimestamp="2025-12-04 22:01:06 +0000 UTC" firstStartedPulling="2025-12-04 22:01:09.053013734 +0000 UTC m=+33.863315949" lastFinishedPulling="2025-12-04 22:01:14.621069921 +0000 UTC m=+39.431372136" observedRunningTime="2025-12-04 22:01:15.567707252 +0000 UTC m=+40.378009487" watchObservedRunningTime="2025-12-04 22:01:15.59241004 +0000 UTC m=+40.402712245" Dec 04 22:01:16.113982 master-0 kubenswrapper[8606]: W1204 22:01:16.113817 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4b9fbd90_66d5_4637_9821_22242aa6f6d7.slice/crio-5aea2f5066e056e4a369b8871e8461c5b3fa8918d7cce8402f38ffd4c90c32d6 WatchSource:0}: Error finding container 5aea2f5066e056e4a369b8871e8461c5b3fa8918d7cce8402f38ffd4c90c32d6: Status 404 returned error can't find the container with id 5aea2f5066e056e4a369b8871e8461c5b3fa8918d7cce8402f38ffd4c90c32d6 Dec 04 22:01:16.400016 master-0 kubenswrapper[8606]: I1204 22:01:16.399955 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 04 22:01:16.400325 master-0 kubenswrapper[8606]: I1204 22:01:16.400259 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="b977a2cf-4e95-4456-957d-b1ba05c0d1ff" containerName="installer" containerID="cri-o://8772a454066677ab94d735965a9fe39b0c5a1577a466faea547f6edb43aab31c" gracePeriod=30 Dec 04 22:01:16.462702 master-0 kubenswrapper[8606]: I1204 22:01:16.462645 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4b9fbd90-66d5-4637-9821-22242aa6f6d7","Type":"ContainerStarted","Data":"5aea2f5066e056e4a369b8871e8461c5b3fa8918d7cce8402f38ffd4c90c32d6"} Dec 04 22:01:16.473000 master-0 kubenswrapper[8606]: I1204 22:01:16.472940 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vvs9c" event={"ID":"a5c2d3b8-41c0-4531-b770-57b7c567fe30","Type":"ContainerStarted","Data":"493fc52c68401fa5964bb3eeaf4d67a35d8bc2236f565e9f9553b7ddae6747d8"} Dec 04 22:01:16.473167 master-0 kubenswrapper[8606]: I1204 22:01:16.473033 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:16.479348 master-0 kubenswrapper[8606]: I1204 22:01:16.479280 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" event={"ID":"92572d6d-d7fc-4986-8b8e-df534cfa6ebb","Type":"ContainerStarted","Data":"c70e8f850d7b8f9319325ca8a026a557fc6f36d8d9fa77ddab42d7f2a5c30f93"} Dec 04 22:01:16.479530 master-0 kubenswrapper[8606]: I1204 22:01:16.479457 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:16.486334 master-0 kubenswrapper[8606]: I1204 22:01:16.486289 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" 
Dec 04 22:01:16.580395 master-0 kubenswrapper[8606]: I1204 22:01:16.578018 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vvs9c" podStartSLOduration=3.894009585 podStartE2EDuration="7.57797854s" podCreationTimestamp="2025-12-04 22:01:09 +0000 UTC" firstStartedPulling="2025-12-04 22:01:10.976039646 +0000 UTC m=+35.786341861" lastFinishedPulling="2025-12-04 22:01:14.660008601 +0000 UTC m=+39.470310816" observedRunningTime="2025-12-04 22:01:16.497999181 +0000 UTC m=+41.308301396" watchObservedRunningTime="2025-12-04 22:01:16.57797854 +0000 UTC m=+41.388280755" Dec 04 22:01:16.580395 master-0 kubenswrapper[8606]: I1204 22:01:16.579989 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" podStartSLOduration=5.062381324 podStartE2EDuration="10.579977126s" podCreationTimestamp="2025-12-04 22:01:06 +0000 UTC" firstStartedPulling="2025-12-04 22:01:09.103827599 +0000 UTC m=+33.914129814" lastFinishedPulling="2025-12-04 22:01:14.621423361 +0000 UTC m=+39.431725616" observedRunningTime="2025-12-04 22:01:16.574119871 +0000 UTC m=+41.384422086" watchObservedRunningTime="2025-12-04 22:01:16.579977126 +0000 UTC m=+41.390279341" Dec 04 22:01:16.602171 master-0 kubenswrapper[8606]: I1204 22:01:16.601231 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" podStartSLOduration=5.627600451 podStartE2EDuration="10.601204686s" podCreationTimestamp="2025-12-04 22:01:06 +0000 UTC" firstStartedPulling="2025-12-04 22:01:11.228386464 +0000 UTC m=+36.038688699" lastFinishedPulling="2025-12-04 22:01:16.201990709 +0000 UTC m=+41.012292934" observedRunningTime="2025-12-04 22:01:16.600595378 +0000 UTC m=+41.410897593" watchObservedRunningTime="2025-12-04 22:01:16.601204686 +0000 UTC m=+41.411506901" Dec 04 22:01:16.762601 master-0 kubenswrapper[8606]: I1204 22:01:16.761646 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:01:16.851598 master-0 kubenswrapper[8606]: I1204 22:01:16.840869 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:01:16.946164 master-0 kubenswrapper[8606]: I1204 22:01:16.946077 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:16.946164 master-0 kubenswrapper[8606]: I1204 22:01:16.946163 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:16.959466 master-0 kubenswrapper[8606]: I1204 22:01:16.959400 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:01:17.243019 master-0 kubenswrapper[8606]: I1204 22:01:17.240124 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:17.496447 master-0 kubenswrapper[8606]: I1204 22:01:17.496229 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4b9fbd90-66d5-4637-9821-22242aa6f6d7","Type":"ContainerStarted","Data":"05ebda65d53028c7345257866dac633a27c8894eb475430d761e1c0a053ea020"} Dec 
04 22:01:17.499519 master-0 kubenswrapper[8606]: I1204 22:01:17.499331 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" event={"ID":"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141","Type":"ContainerStarted","Data":"0a1af3e7058502a8428a136432bedb480886bf3096e6470d4e79d520cc6f00b5"} Dec 04 22:01:17.502863 master-0 kubenswrapper[8606]: I1204 22:01:17.502799 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" event={"ID":"989a73ce-3898-4f65-a437-2c7061f9375f","Type":"ContainerStarted","Data":"28fef5f99f6f6e677e593fa649674c67bdc15138dbeae6953397e00648b6d669"} Dec 04 22:01:17.510193 master-0 kubenswrapper[8606]: I1204 22:01:17.509900 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:01:17.514791 master-0 kubenswrapper[8606]: I1204 22:01:17.512821 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=4.512805627 podStartE2EDuration="4.512805627s" podCreationTimestamp="2025-12-04 22:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:17.512787107 +0000 UTC m=+42.323089342" watchObservedRunningTime="2025-12-04 22:01:17.512805627 +0000 UTC m=+42.323107852" Dec 04 22:01:18.786367 master-0 kubenswrapper[8606]: I1204 22:01:18.786304 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 04 22:01:18.787288 master-0 kubenswrapper[8606]: I1204 22:01:18.787263 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:18.799223 master-0 kubenswrapper[8606]: I1204 22:01:18.797085 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 04 22:01:18.820698 master-0 kubenswrapper[8606]: I1204 22:01:18.820329 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw"] Dec 04 22:01:18.843785 master-0 kubenswrapper[8606]: I1204 22:01:18.843696 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw"] Dec 04 22:01:18.844196 master-0 kubenswrapper[8606]: I1204 22:01:18.843992 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" podUID="e81bd90d-2cb1-4f15-b85c-e141f2595f4e" containerName="route-controller-manager" containerID="cri-o://13b020beb274eb5d9a2c536487e04c6e58c1dec75ff79002129442fa1045ca3e" gracePeriod=30 Dec 04 22:01:18.922765 master-0 kubenswrapper[8606]: I1204 22:01:18.922719 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-var-lock\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:18.923021 master-0 kubenswrapper[8606]: I1204 22:01:18.922775 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1ef2ace4-469d-437b-97c7-d31bf075a107-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:18.923021 master-0 kubenswrapper[8606]: I1204 22:01:18.922838 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:19.040601 master-0 kubenswrapper[8606]: I1204 22:01:19.039289 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:19.040601 master-0 kubenswrapper[8606]: I1204 22:01:19.039381 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-var-lock\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:19.040601 master-0 kubenswrapper[8606]: I1204 22:01:19.039400 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ef2ace4-469d-437b-97c7-d31bf075a107-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:19.040601 master-0 kubenswrapper[8606]: I1204 22:01:19.039846 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:19.040601 master-0 kubenswrapper[8606]: I1204 22:01:19.039885 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-var-lock\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:19.075019 master-0 kubenswrapper[8606]: I1204 22:01:19.072192 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ef2ace4-469d-437b-97c7-d31bf075a107-kube-api-access\") pod \"installer-3-master-0\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:19.116642 master-0 kubenswrapper[8606]: I1204 22:01:19.116581 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:19.515554 master-0 kubenswrapper[8606]: I1204 22:01:19.515384 8606 generic.go:334] "Generic (PLEG): container finished" podID="e81bd90d-2cb1-4f15-b85c-e141f2595f4e" containerID="13b020beb274eb5d9a2c536487e04c6e58c1dec75ff79002129442fa1045ca3e" exitCode=0 Dec 04 22:01:19.515554 master-0 kubenswrapper[8606]: I1204 22:01:19.515434 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" event={"ID":"e81bd90d-2cb1-4f15-b85c-e141f2595f4e","Type":"ContainerDied","Data":"13b020beb274eb5d9a2c536487e04c6e58c1dec75ff79002129442fa1045ca3e"} Dec 04 22:01:19.515822 master-0 kubenswrapper[8606]: I1204 22:01:19.515612 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" podUID="92572d6d-d7fc-4986-8b8e-df534cfa6ebb" containerName="controller-manager" containerID="cri-o://c70e8f850d7b8f9319325ca8a026a557fc6f36d8d9fa77ddab42d7f2a5c30f93" gracePeriod=30 Dec 04 22:01:20.521652 master-0 kubenswrapper[8606]: I1204 22:01:20.521561 8606 generic.go:334] "Generic (PLEG): container finished" podID="92572d6d-d7fc-4986-8b8e-df534cfa6ebb" containerID="c70e8f850d7b8f9319325ca8a026a557fc6f36d8d9fa77ddab42d7f2a5c30f93" exitCode=0 Dec 04 22:01:20.521652 master-0 kubenswrapper[8606]: I1204 22:01:20.521642 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" event={"ID":"92572d6d-d7fc-4986-8b8e-df534cfa6ebb","Type":"ContainerDied","Data":"c70e8f850d7b8f9319325ca8a026a557fc6f36d8d9fa77ddab42d7f2a5c30f93"} Dec 04 22:01:20.679122 master-0 kubenswrapper[8606]: I1204 22:01:20.679026 8606 patch_prober.go:28] interesting pod/controller-manager-5fcd8fbcb8-dhxmw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.40:8443/healthz\": dial tcp 10.128.0.40:8443: connect: connection refused" start-of-body= Dec 04 22:01:20.679122 master-0 kubenswrapper[8606]: I1204 22:01:20.679108 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" podUID="92572d6d-d7fc-4986-8b8e-df534cfa6ebb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.40:8443/healthz\": dial tcp 10.128.0.40:8443: connect: connection refused" Dec 04 22:01:20.813208 master-0 kubenswrapper[8606]: I1204 22:01:20.813050 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 04 22:01:20.820488 master-0 kubenswrapper[8606]: I1204 22:01:20.817724 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:20.820488 master-0 kubenswrapper[8606]: I1204 22:01:20.820166 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 22:01:20.856901 master-0 kubenswrapper[8606]: I1204 22:01:20.856675 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 04 22:01:20.966602 master-0 kubenswrapper[8606]: I1204 22:01:20.965416 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kube-api-access\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:20.966602 master-0 kubenswrapper[8606]: I1204 22:01:20.965468 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:20.966602 master-0 kubenswrapper[8606]: I1204 22:01:20.965517 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-var-lock\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:21.066906 master-0 kubenswrapper[8606]: I1204 22:01:21.066823 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-var-lock\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:21.067009 master-0 kubenswrapper[8606]: I1204 22:01:21.066987 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kube-api-access\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:21.067065 master-0 kubenswrapper[8606]: I1204 22:01:21.067030 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:21.067111 master-0 kubenswrapper[8606]: I1204 22:01:21.067069 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-var-lock\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:21.067156 master-0 kubenswrapper[8606]: I1204 22:01:21.067130 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:21.092285 master-0 kubenswrapper[8606]: I1204 22:01:21.092237 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:21.184322 master-0 kubenswrapper[8606]: I1204 22:01:21.184262 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kube-api-access\") pod \"installer-1-master-0\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:21.194900 master-0 kubenswrapper[8606]: I1204 22:01:21.194861 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:21.269702 master-0 kubenswrapper[8606]: I1204 22:01:21.269646 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-proxy-ca-bundles\") pod \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " Dec 04 22:01:21.269702 master-0 kubenswrapper[8606]: I1204 22:01:21.269699 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-client-ca\") pod \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " Dec 04 22:01:21.269993 master-0 kubenswrapper[8606]: I1204 22:01:21.269758 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7xcp\" (UniqueName: \"kubernetes.io/projected/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-kube-api-access-x7xcp\") pod \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " Dec 04 22:01:21.269993 master-0 kubenswrapper[8606]: I1204 22:01:21.269886 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-serving-cert\") pod \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " Dec 04 22:01:21.269993 master-0 kubenswrapper[8606]: I1204 22:01:21.269955 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-config\") pod \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\" (UID: \"92572d6d-d7fc-4986-8b8e-df534cfa6ebb\") " Dec 04 22:01:21.271062 master-0 kubenswrapper[8606]: I1204 22:01:21.270428 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-client-ca" (OuterVolumeSpecName: "client-ca") pod "92572d6d-d7fc-4986-8b8e-df534cfa6ebb" (UID: "92572d6d-d7fc-4986-8b8e-df534cfa6ebb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:21.271062 master-0 kubenswrapper[8606]: I1204 22:01:21.270530 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "92572d6d-d7fc-4986-8b8e-df534cfa6ebb" (UID: "92572d6d-d7fc-4986-8b8e-df534cfa6ebb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:21.271062 master-0 kubenswrapper[8606]: I1204 22:01:21.270697 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-config" (OuterVolumeSpecName: "config") pod "92572d6d-d7fc-4986-8b8e-df534cfa6ebb" (UID: "92572d6d-d7fc-4986-8b8e-df534cfa6ebb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:21.275292 master-0 kubenswrapper[8606]: I1204 22:01:21.275219 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86785576d9-t7jrz"] Dec 04 22:01:21.276554 master-0 kubenswrapper[8606]: E1204 22:01:21.275545 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92572d6d-d7fc-4986-8b8e-df534cfa6ebb" containerName="controller-manager" Dec 04 22:01:21.276554 master-0 kubenswrapper[8606]: I1204 22:01:21.275581 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="92572d6d-d7fc-4986-8b8e-df534cfa6ebb" containerName="controller-manager" Dec 04 22:01:21.276554 master-0 kubenswrapper[8606]: E1204 22:01:21.275602 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81bd90d-2cb1-4f15-b85c-e141f2595f4e" containerName="route-controller-manager" Dec 04 22:01:21.276554 master-0 kubenswrapper[8606]: I1204 22:01:21.275614 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81bd90d-2cb1-4f15-b85c-e141f2595f4e" containerName="route-controller-manager" Dec 04 22:01:21.276554 master-0 kubenswrapper[8606]: I1204 22:01:21.275748 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81bd90d-2cb1-4f15-b85c-e141f2595f4e" containerName="route-controller-manager" Dec 04 22:01:21.276554 master-0 kubenswrapper[8606]: I1204 22:01:21.275761 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="92572d6d-d7fc-4986-8b8e-df534cfa6ebb" containerName="controller-manager" Dec 04 22:01:21.276554 master-0 kubenswrapper[8606]: I1204 22:01:21.276228 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.278352 master-0 kubenswrapper[8606]: I1204 22:01:21.278271 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-kube-api-access-x7xcp" (OuterVolumeSpecName: "kube-api-access-x7xcp") pod "92572d6d-d7fc-4986-8b8e-df534cfa6ebb" (UID: "92572d6d-d7fc-4986-8b8e-df534cfa6ebb"). InnerVolumeSpecName "kube-api-access-x7xcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:21.279983 master-0 kubenswrapper[8606]: I1204 22:01:21.279914 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "92572d6d-d7fc-4986-8b8e-df534cfa6ebb" (UID: "92572d6d-d7fc-4986-8b8e-df534cfa6ebb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:01:21.284619 master-0 kubenswrapper[8606]: I1204 22:01:21.284565 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 04 22:01:21.292146 master-0 kubenswrapper[8606]: I1204 22:01:21.292086 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86785576d9-t7jrz"] Dec 04 22:01:21.372143 master-0 kubenswrapper[8606]: I1204 22:01:21.371851 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmt9d\" (UniqueName: \"kubernetes.io/projected/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-kube-api-access-qmt9d\") pod \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " Dec 04 22:01:21.372143 master-0 kubenswrapper[8606]: I1204 22:01:21.372034 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-client-ca\") pod \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " Dec 04 22:01:21.372143 master-0 kubenswrapper[8606]: I1204 22:01:21.372115 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-serving-cert\") pod \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " Dec 04 22:01:21.372143 master-0 kubenswrapper[8606]: I1204 22:01:21.372140 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-config\") pod \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\" (UID: \"e81bd90d-2cb1-4f15-b85c-e141f2595f4e\") " Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372361 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372399 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc4z5\" (UniqueName: \"kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372543 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372570 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config\") pod \"controller-manager-86785576d9-t7jrz\" 
(UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372633 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372726 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7xcp\" (UniqueName: \"kubernetes.io/projected/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-kube-api-access-x7xcp\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372759 8606 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372773 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372784 8606 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.372817 master-0 kubenswrapper[8606]: I1204 22:01:21.372794 8606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/92572d6d-d7fc-4986-8b8e-df534cfa6ebb-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.373209 master-0 kubenswrapper[8606]: I1204 22:01:21.372912 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-client-ca" (OuterVolumeSpecName: "client-ca") pod "e81bd90d-2cb1-4f15-b85c-e141f2595f4e" (UID: "e81bd90d-2cb1-4f15-b85c-e141f2595f4e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:21.373209 master-0 kubenswrapper[8606]: I1204 22:01:21.373012 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-config" (OuterVolumeSpecName: "config") pod "e81bd90d-2cb1-4f15-b85c-e141f2595f4e" (UID: "e81bd90d-2cb1-4f15-b85c-e141f2595f4e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:01:21.376196 master-0 kubenswrapper[8606]: I1204 22:01:21.376154 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-kube-api-access-qmt9d" (OuterVolumeSpecName: "kube-api-access-qmt9d") pod "e81bd90d-2cb1-4f15-b85c-e141f2595f4e" (UID: "e81bd90d-2cb1-4f15-b85c-e141f2595f4e"). InnerVolumeSpecName "kube-api-access-qmt9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:21.376196 master-0 kubenswrapper[8606]: I1204 22:01:21.376159 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e81bd90d-2cb1-4f15-b85c-e141f2595f4e" (UID: "e81bd90d-2cb1-4f15-b85c-e141f2595f4e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:01:21.473558 master-0 kubenswrapper[8606]: I1204 22:01:21.473457 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:21.475362 master-0 kubenswrapper[8606]: I1204 22:01:21.475219 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.475482 master-0 kubenswrapper[8606]: I1204 22:01:21.475457 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4z5\" (UniqueName: \"kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.475570 master-0 kubenswrapper[8606]: I1204 22:01:21.475549 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.475619 master-0 kubenswrapper[8606]: I1204 22:01:21.475573 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.475673 master-0 kubenswrapper[8606]: I1204 22:01:21.475630 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.475725 master-0 kubenswrapper[8606]: I1204 22:01:21.475701 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmt9d\" (UniqueName: \"kubernetes.io/projected/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-kube-api-access-qmt9d\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.475725 master-0 kubenswrapper[8606]: I1204 22:01:21.475718 8606 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.475966 master-0 kubenswrapper[8606]: I1204 22:01:21.475727 8606 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.475966 master-0 kubenswrapper[8606]: I1204 22:01:21.475737 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81bd90d-2cb1-4f15-b85c-e141f2595f4e-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:21.476487 master-0 kubenswrapper[8606]: I1204 22:01:21.476406 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.477412 master-0 kubenswrapper[8606]: I1204 22:01:21.477341 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.479138 master-0 kubenswrapper[8606]: I1204 22:01:21.479097 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.481437 master-0 kubenswrapper[8606]: I1204 22:01:21.481386 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.501605 master-0 kubenswrapper[8606]: I1204 22:01:21.501515 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc4z5\" (UniqueName: \"kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.539690 master-0 kubenswrapper[8606]: I1204 22:01:21.538510 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"1ef2ace4-469d-437b-97c7-d31bf075a107","Type":"ContainerStarted","Data":"d81d8ae011528a6b5174aee453c874d8dd6fd43f7ab2bf232177f38cc74381cc"} Dec 04 22:01:21.548154 master-0 kubenswrapper[8606]: I1204 22:01:21.547978 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" event={"ID":"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141","Type":"ContainerStarted","Data":"420541efa65ce9474ef160ccf8e59df368e797415eddfcf4b7828985afa52ca7"} Dec 04 22:01:21.556739 master-0 kubenswrapper[8606]: I1204 22:01:21.556467 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" Dec 04 22:01:21.558057 master-0 kubenswrapper[8606]: I1204 22:01:21.556490 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw" event={"ID":"92572d6d-d7fc-4986-8b8e-df534cfa6ebb","Type":"ContainerDied","Data":"482cd59136431bdb5a93c083c18b3f1c52f50b62465cb16d1a188f38f7a152a1"} Dec 04 22:01:21.558057 master-0 kubenswrapper[8606]: I1204 22:01:21.557639 8606 scope.go:117] "RemoveContainer" containerID="c70e8f850d7b8f9319325ca8a026a557fc6f36d8d9fa77ddab42d7f2a5c30f93" Dec 04 22:01:21.563405 master-0 kubenswrapper[8606]: I1204 22:01:21.563358 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" Dec 04 22:01:21.564177 master-0 kubenswrapper[8606]: I1204 22:01:21.564101 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw" event={"ID":"e81bd90d-2cb1-4f15-b85c-e141f2595f4e","Type":"ContainerDied","Data":"1e2196dfe7da86556386896e68c289757b5874df82ea7b1910a060ddbec879ba"} Dec 04 22:01:21.571782 master-0 kubenswrapper[8606]: I1204 22:01:21.571645 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerStarted","Data":"f11072c38e40de60dafeffc2c5ef9e1780820ca0ce672700aba155fa414fe72c"} Dec 04 22:01:21.572145 master-0 kubenswrapper[8606]: I1204 22:01:21.571869 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:01:21.593772 master-0 kubenswrapper[8606]: I1204 22:01:21.592715 8606 scope.go:117] "RemoveContainer" containerID="13b020beb274eb5d9a2c536487e04c6e58c1dec75ff79002129442fa1045ca3e" Dec 04 22:01:21.609149 master-0 kubenswrapper[8606]: I1204 22:01:21.608555 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" podStartSLOduration=22.595746114 podStartE2EDuration="28.608526804s" podCreationTimestamp="2025-12-04 22:00:53 +0000 UTC" firstStartedPulling="2025-12-04 22:01:08.630652613 +0000 UTC m=+33.440954828" lastFinishedPulling="2025-12-04 22:01:14.643433303 +0000 UTC m=+39.453735518" observedRunningTime="2025-12-04 22:01:21.579021331 +0000 UTC m=+46.389323576" watchObservedRunningTime="2025-12-04 22:01:21.608526804 +0000 UTC m=+46.418829019" Dec 04 22:01:21.610147 master-0 kubenswrapper[8606]: I1204 22:01:21.610079 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:21.644680 master-0 kubenswrapper[8606]: I1204 22:01:21.644615 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw"] Dec 04 22:01:21.653823 master-0 kubenswrapper[8606]: I1204 22:01:21.653753 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5fcd8fbcb8-dhxmw"] Dec 04 22:01:21.676153 master-0 kubenswrapper[8606]: I1204 22:01:21.676078 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw"] Dec 04 22:01:21.678441 master-0 kubenswrapper[8606]: I1204 22:01:21.678363 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85f9d6bb6-vswnw"] Dec 04 22:01:21.696106 master-0 kubenswrapper[8606]: I1204 22:01:21.696045 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:01:21.696106 master-0 kubenswrapper[8606]: I1204 22:01:21.696108 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:01:21.714640 master-0 kubenswrapper[8606]: I1204 22:01:21.713810 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:01:21.894883 master-0 kubenswrapper[8606]: I1204 22:01:21.894812 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 04 22:01:21.901678 master-0 kubenswrapper[8606]: W1204 22:01:21.901641 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9fca9b57_0b34_46d4_9f3a_dbd4acd630f6.slice/crio-32c3997205f44d7d981e61cc8c878310736400e3977f5531db32f6189ad28d9e WatchSource:0}: Error finding container 32c3997205f44d7d981e61cc8c878310736400e3977f5531db32f6189ad28d9e: Status 404 returned error can't find the container with id 32c3997205f44d7d981e61cc8c878310736400e3977f5531db32f6189ad28d9e Dec 04 22:01:22.038864 master-0 kubenswrapper[8606]: I1204 22:01:22.038779 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86785576d9-t7jrz"] Dec 04 22:01:22.580293 master-0 kubenswrapper[8606]: I1204 22:01:22.580216 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6","Type":"ContainerStarted","Data":"63eed725435ffc8fd80ef462cf5e0dcb22612c336b3cabe2d142c84feedc099e"} Dec 04 22:01:22.580293 master-0 kubenswrapper[8606]: I1204 22:01:22.580292 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6","Type":"ContainerStarted","Data":"32c3997205f44d7d981e61cc8c878310736400e3977f5531db32f6189ad28d9e"} Dec 04 22:01:22.583345 master-0 kubenswrapper[8606]: I1204 22:01:22.583296 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"1ef2ace4-469d-437b-97c7-d31bf075a107","Type":"ContainerStarted","Data":"38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01"} Dec 04 22:01:22.594402 master-0 kubenswrapper[8606]: I1204 22:01:22.594329 8606 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerStarted","Data":"ab6c10b9a3e0637d5c7a14c6df7c632b34ad06eac467a51eec2ac60a0a5a71c4"} Dec 04 22:01:22.594632 master-0 kubenswrapper[8606]: I1204 22:01:22.594410 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerStarted","Data":"56273487d972eb4bce7bb2a2d532d0ecc4790cf320f72d236deb00d6ee12d734"} Dec 04 22:01:22.595451 master-0 kubenswrapper[8606]: I1204 22:01:22.595179 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:22.601557 master-0 kubenswrapper[8606]: I1204 22:01:22.600065 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=2.600039522 podStartE2EDuration="2.600039522s" podCreationTimestamp="2025-12-04 22:01:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:22.599650061 +0000 UTC m=+47.409952276" watchObservedRunningTime="2025-12-04 22:01:22.600039522 +0000 UTC m=+47.410341747" Dec 04 22:01:22.609528 master-0 kubenswrapper[8606]: I1204 22:01:22.606948 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:01:22.609528 master-0 kubenswrapper[8606]: I1204 22:01:22.608295 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:01:22.627125 master-0 kubenswrapper[8606]: I1204 22:01:22.627032 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=4.627002214 podStartE2EDuration="4.627002214s" podCreationTimestamp="2025-12-04 22:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:22.625407759 +0000 UTC m=+47.435709984" watchObservedRunningTime="2025-12-04 22:01:22.627002214 +0000 UTC m=+47.437304429" Dec 04 22:01:22.677282 master-0 kubenswrapper[8606]: I1204 22:01:22.677154 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podStartSLOduration=4.67710481 podStartE2EDuration="4.67710481s" podCreationTimestamp="2025-12-04 22:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:22.649116109 +0000 UTC m=+47.459418334" watchObservedRunningTime="2025-12-04 22:01:22.67710481 +0000 UTC m=+47.487407045" Dec 04 22:01:22.993344 master-0 kubenswrapper[8606]: I1204 22:01:22.992797 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn"] Dec 04 22:01:22.993344 master-0 kubenswrapper[8606]: I1204 22:01:22.993347 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:01:22.999875 master-0 kubenswrapper[8606]: I1204 22:01:22.999833 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 22:01:23.003387 master-0 kubenswrapper[8606]: I1204 22:01:23.003352 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 22:01:23.003831 master-0 kubenswrapper[8606]: I1204 22:01:23.003409 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 22:01:23.016288 master-0 kubenswrapper[8606]: I1204 22:01:23.016239 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn"] Dec 04 22:01:23.107676 master-0 kubenswrapper[8606]: I1204 22:01:23.107587 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:01:23.107676 master-0 kubenswrapper[8606]: I1204 22:01:23.107661 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d7pj\" (UniqueName: \"kubernetes.io/projected/f1534e25-7add-46a1-8f4e-0065c232aa4e-kube-api-access-4d7pj\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:01:23.209528 master-0 kubenswrapper[8606]: I1204 22:01:23.209420 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:01:23.209812 master-0 kubenswrapper[8606]: I1204 22:01:23.209547 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7pj\" (UniqueName: \"kubernetes.io/projected/f1534e25-7add-46a1-8f4e-0065c232aa4e-kube-api-access-4d7pj\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:01:23.218427 master-0 kubenswrapper[8606]: I1204 22:01:23.217321 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:01:23.232578 master-0 kubenswrapper[8606]: I1204 22:01:23.230900 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4d7pj\" (UniqueName: \"kubernetes.io/projected/f1534e25-7add-46a1-8f4e-0065c232aa4e-kube-api-access-4d7pj\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:01:23.325778 master-0 kubenswrapper[8606]: I1204 22:01:23.325694 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:01:23.346326 master-0 kubenswrapper[8606]: I1204 22:01:23.345953 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg"] Dec 04 22:01:23.346711 master-0 kubenswrapper[8606]: I1204 22:01:23.346676 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.350491 master-0 kubenswrapper[8606]: I1204 22:01:23.350451 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 22:01:23.352099 master-0 kubenswrapper[8606]: I1204 22:01:23.351313 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 22:01:23.352099 master-0 kubenswrapper[8606]: I1204 22:01:23.351485 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 22:01:23.352434 master-0 kubenswrapper[8606]: I1204 22:01:23.352360 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 22:01:23.352857 master-0 kubenswrapper[8606]: I1204 22:01:23.352798 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 22:01:23.364763 master-0 kubenswrapper[8606]: I1204 22:01:23.364700 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg"] Dec 04 22:01:23.413854 master-0 kubenswrapper[8606]: I1204 22:01:23.413340 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92572d6d-d7fc-4986-8b8e-df534cfa6ebb" path="/var/lib/kubelet/pods/92572d6d-d7fc-4986-8b8e-df534cfa6ebb/volumes" Dec 04 22:01:23.414132 master-0 kubenswrapper[8606]: I1204 22:01:23.413894 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81bd90d-2cb1-4f15-b85c-e141f2595f4e" path="/var/lib/kubelet/pods/e81bd90d-2cb1-4f15-b85c-e141f2595f4e/volumes" Dec 04 22:01:23.415671 master-0 kubenswrapper[8606]: I1204 22:01:23.415197 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vpxd\" (UniqueName: \"kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.415671 master-0 kubenswrapper[8606]: I1204 22:01:23.415288 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: 
\"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.415671 master-0 kubenswrapper[8606]: I1204 22:01:23.415328 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.415671 master-0 kubenswrapper[8606]: I1204 22:01:23.415354 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.551581 master-0 kubenswrapper[8606]: I1204 22:01:23.550441 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vpxd\" (UniqueName: \"kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.551581 master-0 kubenswrapper[8606]: I1204 22:01:23.550550 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.551581 master-0 kubenswrapper[8606]: I1204 22:01:23.550577 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.551947 master-0 kubenswrapper[8606]: I1204 22:01:23.551679 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.551947 master-0 kubenswrapper[8606]: I1204 22:01:23.551732 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.552828 master-0 kubenswrapper[8606]: I1204 22:01:23.552287 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: 
\"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.569797 master-0 kubenswrapper[8606]: I1204 22:01:23.569733 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.590324 master-0 kubenswrapper[8606]: I1204 22:01:23.590222 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vpxd\" (UniqueName: \"kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.704129 master-0 kubenswrapper[8606]: I1204 22:01:23.702844 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:23.916626 master-0 kubenswrapper[8606]: I1204 22:01:23.915552 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn"] Dec 04 22:01:24.124366 master-0 kubenswrapper[8606]: I1204 22:01:24.124317 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg"] Dec 04 22:01:24.615587 master-0 kubenswrapper[8606]: I1204 22:01:24.615536 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerStarted","Data":"577801a549fb8e8b80c730f6cb1e1c0076264ab71677f9e7afd6abe1e4f77036"} Dec 04 22:01:24.615587 master-0 kubenswrapper[8606]: I1204 22:01:24.615588 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerStarted","Data":"2d1e4c21a00903c83707b4497502b9b73283be3d30ce9a5aaab7e54bf8f72dba"} Dec 04 22:01:24.616231 master-0 kubenswrapper[8606]: I1204 22:01:24.616162 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:24.620340 master-0 kubenswrapper[8606]: I1204 22:01:24.619774 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerStarted","Data":"264531cb97973b0deb400a67899ce39a8e7e6bd105e2fd0acd10b7958dc4add3"} Dec 04 22:01:24.640594 master-0 kubenswrapper[8606]: I1204 22:01:24.640524 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" podStartSLOduration=6.640480491 podStartE2EDuration="6.640480491s" podCreationTimestamp="2025-12-04 22:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:24.640122251 +0000 UTC m=+49.450424496" 
watchObservedRunningTime="2025-12-04 22:01:24.640480491 +0000 UTC m=+49.450782716" Dec 04 22:01:24.653817 master-0 kubenswrapper[8606]: I1204 22:01:24.653757 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:01:25.423624 master-0 kubenswrapper[8606]: I1204 22:01:25.423570 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vvs9c" Dec 04 22:01:26.037280 master-0 kubenswrapper[8606]: I1204 22:01:26.037204 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd"] Dec 04 22:01:26.038196 master-0 kubenswrapper[8606]: I1204 22:01:26.037958 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.041559 master-0 kubenswrapper[8606]: I1204 22:01:26.041516 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 22:01:26.041716 master-0 kubenswrapper[8606]: I1204 22:01:26.041684 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 22:01:26.041771 master-0 kubenswrapper[8606]: I1204 22:01:26.041743 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 22:01:26.042041 master-0 kubenswrapper[8606]: I1204 22:01:26.042000 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 22:01:26.042041 master-0 kubenswrapper[8606]: I1204 22:01:26.042021 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 22:01:26.044139 master-0 kubenswrapper[8606]: I1204 22:01:26.044107 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-shf98" Dec 04 22:01:26.104383 master-0 kubenswrapper[8606]: I1204 22:01:26.104302 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c52111ac-30f6-47b7-a8ca-13659fbd71b4-machine-approver-tls\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.104383 master-0 kubenswrapper[8606]: I1204 22:01:26.104387 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-config\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.104726 master-0 kubenswrapper[8606]: I1204 22:01:26.104419 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-auth-proxy-config\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.104846 master-0 
kubenswrapper[8606]: I1204 22:01:26.104736 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdst\" (UniqueName: \"kubernetes.io/projected/c52111ac-30f6-47b7-a8ca-13659fbd71b4-kube-api-access-rcdst\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.205527 master-0 kubenswrapper[8606]: I1204 22:01:26.205447 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcdst\" (UniqueName: \"kubernetes.io/projected/c52111ac-30f6-47b7-a8ca-13659fbd71b4-kube-api-access-rcdst\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.206057 master-0 kubenswrapper[8606]: I1204 22:01:26.206021 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c52111ac-30f6-47b7-a8ca-13659fbd71b4-machine-approver-tls\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.206114 master-0 kubenswrapper[8606]: I1204 22:01:26.206069 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-config\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.206114 master-0 kubenswrapper[8606]: I1204 22:01:26.206106 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-auth-proxy-config\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.207486 master-0 kubenswrapper[8606]: I1204 22:01:26.207450 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-auth-proxy-config\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.208731 master-0 kubenswrapper[8606]: I1204 22:01:26.207849 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-config\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.212856 master-0 kubenswrapper[8606]: I1204 22:01:26.212800 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c52111ac-30f6-47b7-a8ca-13659fbd71b4-machine-approver-tls\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.227106 master-0 kubenswrapper[8606]: I1204 22:01:26.226618 8606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcdst\" (UniqueName: \"kubernetes.io/projected/c52111ac-30f6-47b7-a8ca-13659fbd71b4-kube-api-access-rcdst\") pod \"machine-approver-f797d8546-4g7dd\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:26.366383 master-0 kubenswrapper[8606]: I1204 22:01:26.366135 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:01:28.307121 master-0 kubenswrapper[8606]: I1204 22:01:28.307048 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn"] Dec 04 22:01:28.308038 master-0 kubenswrapper[8606]: I1204 22:01:28.308012 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.311316 master-0 kubenswrapper[8606]: I1204 22:01:28.311281 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 04 22:01:28.311439 master-0 kubenswrapper[8606]: I1204 22:01:28.311411 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 04 22:01:28.311513 master-0 kubenswrapper[8606]: I1204 22:01:28.311479 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 04 22:01:28.311683 master-0 kubenswrapper[8606]: I1204 22:01:28.311639 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-fvwtt" Dec 04 22:01:28.319487 master-0 kubenswrapper[8606]: I1204 22:01:28.319436 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 04 22:01:28.331159 master-0 kubenswrapper[8606]: I1204 22:01:28.331098 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn"] Dec 04 22:01:28.442484 master-0 kubenswrapper[8606]: I1204 22:01:28.442404 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngkqz\" (UniqueName: \"kubernetes.io/projected/800f436c-145d-4281-8d4d-644ba2cb0ebb-kube-api-access-ngkqz\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.442484 master-0 kubenswrapper[8606]: I1204 22:01:28.442487 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.442847 master-0 kubenswrapper[8606]: I1204 22:01:28.442560 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.544152 master-0 kubenswrapper[8606]: I1204 22:01:28.544078 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.544152 master-0 kubenswrapper[8606]: I1204 22:01:28.544162 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngkqz\" (UniqueName: \"kubernetes.io/projected/800f436c-145d-4281-8d4d-644ba2cb0ebb-kube-api-access-ngkqz\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.544594 master-0 kubenswrapper[8606]: I1204 22:01:28.544203 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.545654 master-0 kubenswrapper[8606]: I1204 22:01:28.545603 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.550656 master-0 kubenswrapper[8606]: I1204 22:01:28.550599 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.574936 master-0 kubenswrapper[8606]: I1204 22:01:28.574864 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngkqz\" (UniqueName: \"kubernetes.io/projected/800f436c-145d-4281-8d4d-644ba2cb0ebb-kube-api-access-ngkqz\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.634282 master-0 kubenswrapper[8606]: I1204 22:01:28.634212 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:01:28.654675 master-0 kubenswrapper[8606]: I1204 22:01:28.653778 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" event={"ID":"c52111ac-30f6-47b7-a8ca-13659fbd71b4","Type":"ContainerStarted","Data":"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663"} Dec 04 22:01:28.654675 master-0 kubenswrapper[8606]: I1204 22:01:28.653869 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" event={"ID":"c52111ac-30f6-47b7-a8ca-13659fbd71b4","Type":"ContainerStarted","Data":"6a357f5e42bd47d188514cbe5323e44289a00a8afb8894fb4fbbebb634c903b1"} Dec 04 22:01:28.657072 master-0 kubenswrapper[8606]: I1204 22:01:28.656995 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerStarted","Data":"efec9b80d16091e3ba4473728d27aba3a23ca799a67ec448c19c49d6e7be1b22"} Dec 04 22:01:28.691058 master-0 kubenswrapper[8606]: I1204 22:01:28.690902 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" podStartSLOduration=2.627784223 podStartE2EDuration="6.690862247s" podCreationTimestamp="2025-12-04 22:01:22 +0000 UTC" firstStartedPulling="2025-12-04 22:01:23.941934058 +0000 UTC m=+48.752236273" lastFinishedPulling="2025-12-04 22:01:28.005012082 +0000 UTC m=+52.815314297" observedRunningTime="2025-12-04 22:01:28.689076377 +0000 UTC m=+53.499378622" watchObservedRunningTime="2025-12-04 22:01:28.690862247 +0000 UTC m=+53.501164512" Dec 04 22:01:28.793435 master-0 kubenswrapper[8606]: I1204 22:01:28.793354 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 04 22:01:28.793756 master-0 kubenswrapper[8606]: I1204 22:01:28.793713 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="1ef2ace4-469d-437b-97c7-d31bf075a107" containerName="installer" containerID="cri-o://38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01" gracePeriod=30 Dec 04 22:01:29.158991 master-0 kubenswrapper[8606]: I1204 22:01:29.157671 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn"] Dec 04 22:01:29.167722 master-0 kubenswrapper[8606]: W1204 22:01:29.167602 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800f436c_145d_4281_8d4d_644ba2cb0ebb.slice/crio-1f07b0c4938e582ee1cf91b0ddf3d74df5a116aa0098efb24e115a5e8116176b WatchSource:0}: Error finding container 1f07b0c4938e582ee1cf91b0ddf3d74df5a116aa0098efb24e115a5e8116176b: Status 404 returned error can't find the container with id 1f07b0c4938e582ee1cf91b0ddf3d74df5a116aa0098efb24e115a5e8116176b Dec 04 22:01:29.310985 master-0 kubenswrapper[8606]: I1204 22:01:29.310935 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_1ef2ace4-469d-437b-97c7-d31bf075a107/installer/0.log" Dec 04 22:01:29.311374 master-0 kubenswrapper[8606]: I1204 22:01:29.311023 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:29.365937 master-0 kubenswrapper[8606]: I1204 22:01:29.365815 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-kubelet-dir\") pod \"1ef2ace4-469d-437b-97c7-d31bf075a107\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " Dec 04 22:01:29.367013 master-0 kubenswrapper[8606]: I1204 22:01:29.366090 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ef2ace4-469d-437b-97c7-d31bf075a107-kube-api-access\") pod \"1ef2ace4-469d-437b-97c7-d31bf075a107\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " Dec 04 22:01:29.367013 master-0 kubenswrapper[8606]: I1204 22:01:29.366169 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-var-lock\") pod \"1ef2ace4-469d-437b-97c7-d31bf075a107\" (UID: \"1ef2ace4-469d-437b-97c7-d31bf075a107\") " Dec 04 22:01:29.367013 master-0 kubenswrapper[8606]: I1204 22:01:29.365762 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1ef2ace4-469d-437b-97c7-d31bf075a107" (UID: "1ef2ace4-469d-437b-97c7-d31bf075a107"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:29.367013 master-0 kubenswrapper[8606]: I1204 22:01:29.366792 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-var-lock" (OuterVolumeSpecName: "var-lock") pod "1ef2ace4-469d-437b-97c7-d31bf075a107" (UID: "1ef2ace4-469d-437b-97c7-d31bf075a107"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:29.370786 master-0 kubenswrapper[8606]: I1204 22:01:29.370730 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef2ace4-469d-437b-97c7-d31bf075a107-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1ef2ace4-469d-437b-97c7-d31bf075a107" (UID: "1ef2ace4-469d-437b-97c7-d31bf075a107"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:29.444569 master-0 kubenswrapper[8606]: I1204 22:01:29.443851 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d"] Dec 04 22:01:29.444569 master-0 kubenswrapper[8606]: E1204 22:01:29.444127 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef2ace4-469d-437b-97c7-d31bf075a107" containerName="installer" Dec 04 22:01:29.444569 master-0 kubenswrapper[8606]: I1204 22:01:29.444144 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef2ace4-469d-437b-97c7-d31bf075a107" containerName="installer" Dec 04 22:01:29.444569 master-0 kubenswrapper[8606]: I1204 22:01:29.444268 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef2ace4-469d-437b-97c7-d31bf075a107" containerName="installer" Dec 04 22:01:29.453171 master-0 kubenswrapper[8606]: I1204 22:01:29.445085 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:01:29.453171 master-0 kubenswrapper[8606]: I1204 22:01:29.449187 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 22:01:29.453171 master-0 kubenswrapper[8606]: I1204 22:01:29.449334 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 22:01:29.453171 master-0 kubenswrapper[8606]: I1204 22:01:29.449538 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 22:01:29.453171 master-0 kubenswrapper[8606]: I1204 22:01:29.449691 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-nvnbb" Dec 04 22:01:29.470518 master-0 kubenswrapper[8606]: I1204 22:01:29.470243 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2ghk\" (UniqueName: \"kubernetes.io/projected/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-kube-api-access-g2ghk\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:01:29.470518 master-0 kubenswrapper[8606]: I1204 22:01:29.470367 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:01:29.470518 master-0 kubenswrapper[8606]: I1204 22:01:29.470435 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ef2ace4-469d-437b-97c7-d31bf075a107-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:29.470518 master-0 kubenswrapper[8606]: I1204 22:01:29.470451 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:29.470518 master-0 kubenswrapper[8606]: I1204 22:01:29.470463 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ef2ace4-469d-437b-97c7-d31bf075a107-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:29.475565 master-0 kubenswrapper[8606]: I1204 22:01:29.475376 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d"] Dec 04 22:01:29.572841 master-0 kubenswrapper[8606]: I1204 22:01:29.572002 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:01:29.572841 master-0 kubenswrapper[8606]: I1204 22:01:29.572241 8606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g2ghk\" (UniqueName: \"kubernetes.io/projected/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-kube-api-access-g2ghk\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:01:29.666760 master-0 kubenswrapper[8606]: I1204 22:01:29.664926 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_1ef2ace4-469d-437b-97c7-d31bf075a107/installer/0.log" Dec 04 22:01:29.666760 master-0 kubenswrapper[8606]: I1204 22:01:29.665013 8606 generic.go:334] "Generic (PLEG): container finished" podID="1ef2ace4-469d-437b-97c7-d31bf075a107" containerID="38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01" exitCode=1 Dec 04 22:01:29.666760 master-0 kubenswrapper[8606]: I1204 22:01:29.665136 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 04 22:01:29.666760 master-0 kubenswrapper[8606]: I1204 22:01:29.665151 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"1ef2ace4-469d-437b-97c7-d31bf075a107","Type":"ContainerDied","Data":"38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01"} Dec 04 22:01:29.666760 master-0 kubenswrapper[8606]: I1204 22:01:29.665213 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"1ef2ace4-469d-437b-97c7-d31bf075a107","Type":"ContainerDied","Data":"d81d8ae011528a6b5174aee453c874d8dd6fd43f7ab2bf232177f38cc74381cc"} Dec 04 22:01:29.666760 master-0 kubenswrapper[8606]: I1204 22:01:29.665244 8606 scope.go:117] "RemoveContainer" containerID="38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01" Dec 04 22:01:29.673705 master-0 kubenswrapper[8606]: I1204 22:01:29.672881 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerStarted","Data":"b55d508726ae8df0222e55687319ab2bf975a4e6e2983f8b547ae30ae19307c0"} Dec 04 22:01:29.673705 master-0 kubenswrapper[8606]: I1204 22:01:29.672975 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerStarted","Data":"1f07b0c4938e582ee1cf91b0ddf3d74df5a116aa0098efb24e115a5e8116176b"} Dec 04 22:01:29.686336 master-0 kubenswrapper[8606]: I1204 22:01:29.686262 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:01:29.688053 master-0 kubenswrapper[8606]: I1204 22:01:29.687993 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2ghk\" (UniqueName: \"kubernetes.io/projected/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-kube-api-access-g2ghk\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:01:29.694875 master-0 kubenswrapper[8606]: I1204 22:01:29.694817 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 04 22:01:29.696581 master-0 kubenswrapper[8606]: I1204 22:01:29.696523 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 04 22:01:29.767268 master-0 kubenswrapper[8606]: I1204 22:01:29.765888 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:01:30.314080 master-0 kubenswrapper[8606]: I1204 22:01:30.313664 8606 scope.go:117] "RemoveContainer" containerID="38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01" Dec 04 22:01:30.314841 master-0 kubenswrapper[8606]: E1204 22:01:30.314560 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01\": container with ID starting with 38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01 not found: ID does not exist" containerID="38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01" Dec 04 22:01:30.314841 master-0 kubenswrapper[8606]: I1204 22:01:30.314631 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01"} err="failed to get container status \"38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01\": rpc error: code = NotFound desc = could not find container \"38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01\": container with ID starting with 38ab0c1ac010cadb0ea51f0fe28ad0f14047bc569779ddb6e5ea5dc37dce8f01 not found: ID does not exist" Dec 04 22:01:30.745054 master-0 kubenswrapper[8606]: I1204 22:01:30.745000 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d"] Dec 04 22:01:30.981375 master-0 kubenswrapper[8606]: I1204 22:01:30.981318 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 04 22:01:30.982032 master-0 kubenswrapper[8606]: I1204 22:01:30.982010 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:30.984338 master-0 kubenswrapper[8606]: I1204 22:01:30.984287 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-t57bp" Dec 04 22:01:30.989379 master-0 kubenswrapper[8606]: I1204 22:01:30.989347 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:30.989451 master-0 kubenswrapper[8606]: I1204 22:01:30.989399 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-var-lock\") pod \"installer-4-master-0\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:30.989451 master-0 kubenswrapper[8606]: I1204 22:01:30.989438 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kube-api-access\") pod \"installer-4-master-0\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:31.003332 master-0 kubenswrapper[8606]: I1204 22:01:31.003259 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 04 22:01:31.092851 master-0 kubenswrapper[8606]: I1204 22:01:31.092773 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-var-lock\") pod \"installer-4-master-0\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:31.093084 master-0 kubenswrapper[8606]: I1204 22:01:31.092881 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kube-api-access\") pod \"installer-4-master-0\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:31.093084 master-0 kubenswrapper[8606]: I1204 22:01:31.092889 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-var-lock\") pod \"installer-4-master-0\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:31.093084 master-0 kubenswrapper[8606]: I1204 22:01:31.092945 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:31.093084 master-0 kubenswrapper[8606]: I1204 22:01:31.093066 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kubelet-dir\") pod \"installer-4-master-0\" (UID: 
\"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:31.114524 master-0 kubenswrapper[8606]: I1204 22:01:31.114378 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9"] Dec 04 22:01:31.116009 master-0 kubenswrapper[8606]: I1204 22:01:31.115803 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.118365 master-0 kubenswrapper[8606]: I1204 22:01:31.118311 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 04 22:01:31.118664 master-0 kubenswrapper[8606]: I1204 22:01:31.118638 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 04 22:01:31.119262 master-0 kubenswrapper[8606]: I1204 22:01:31.119219 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-n25ns" Dec 04 22:01:31.129952 master-0 kubenswrapper[8606]: I1204 22:01:31.129852 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9"] Dec 04 22:01:31.190648 master-0 kubenswrapper[8606]: I1204 22:01:31.183089 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kube-api-access\") pod \"installer-4-master-0\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:31.194085 master-0 kubenswrapper[8606]: I1204 22:01:31.194045 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.194222 master-0 kubenswrapper[8606]: I1204 22:01:31.194092 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2lwr\" (UniqueName: \"kubernetes.io/projected/5598683a-cd32-486d-8839-205829d55cc2-kube-api-access-s2lwr\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.194222 master-0 kubenswrapper[8606]: I1204 22:01:31.194129 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.295637 master-0 kubenswrapper[8606]: I1204 22:01:31.295582 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " 
pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.295950 master-0 kubenswrapper[8606]: I1204 22:01:31.295929 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2lwr\" (UniqueName: \"kubernetes.io/projected/5598683a-cd32-486d-8839-205829d55cc2-kube-api-access-s2lwr\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.296090 master-0 kubenswrapper[8606]: I1204 22:01:31.296073 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.296624 master-0 kubenswrapper[8606]: I1204 22:01:31.296584 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.299727 master-0 kubenswrapper[8606]: I1204 22:01:31.299671 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.312198 master-0 kubenswrapper[8606]: I1204 22:01:31.312161 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2lwr\" (UniqueName: \"kubernetes.io/projected/5598683a-cd32-486d-8839-205829d55cc2-kube-api-access-s2lwr\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.370113 master-0 kubenswrapper[8606]: I1204 22:01:31.369976 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:01:31.409217 master-0 kubenswrapper[8606]: I1204 22:01:31.409171 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef2ace4-469d-437b-97c7-d31bf075a107" path="/var/lib/kubelet/pods/1ef2ace4-469d-437b-97c7-d31bf075a107/volumes" Dec 04 22:01:31.437583 master-0 kubenswrapper[8606]: I1204 22:01:31.437514 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:01:31.697144 master-0 kubenswrapper[8606]: I1204 22:01:31.697051 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" event={"ID":"c52111ac-30f6-47b7-a8ca-13659fbd71b4","Type":"ContainerStarted","Data":"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6"} Dec 04 22:01:31.698589 master-0 kubenswrapper[8606]: I1204 22:01:31.698550 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" event={"ID":"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b","Type":"ContainerStarted","Data":"7d740441407a329d552a22d88957f884c55899842a2703505cfb149663a1e6ff"} Dec 04 22:01:31.935440 master-0 kubenswrapper[8606]: I1204 22:01:31.933344 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj"] Dec 04 22:01:31.935440 master-0 kubenswrapper[8606]: I1204 22:01:31.934128 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:31.945544 master-0 kubenswrapper[8606]: I1204 22:01:31.943457 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zpcfd" Dec 04 22:01:31.945544 master-0 kubenswrapper[8606]: I1204 22:01:31.943570 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 04 22:01:31.945544 master-0 kubenswrapper[8606]: I1204 22:01:31.943603 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 04 22:01:31.945544 master-0 kubenswrapper[8606]: I1204 22:01:31.943952 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 04 22:01:31.945544 master-0 kubenswrapper[8606]: I1204 22:01:31.943943 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 04 22:01:31.959315 master-0 kubenswrapper[8606]: I1204 22:01:31.957578 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 04 22:01:31.961804 master-0 kubenswrapper[8606]: I1204 22:01:31.961763 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj"] Dec 04 22:01:31.969382 master-0 kubenswrapper[8606]: I1204 22:01:31.969259 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" podStartSLOduration=3.86948446 podStartE2EDuration="5.969227634s" podCreationTimestamp="2025-12-04 22:01:26 +0000 UTC" firstStartedPulling="2025-12-04 22:01:28.257955908 +0000 UTC m=+53.068258133" lastFinishedPulling="2025-12-04 22:01:30.357699092 +0000 UTC m=+55.168001307" observedRunningTime="2025-12-04 22:01:31.957822622 +0000 UTC m=+56.768124837" watchObservedRunningTime="2025-12-04 22:01:31.969227634 +0000 UTC m=+56.779529859" Dec 04 22:01:32.009271 master-0 kubenswrapper[8606]: I1204 22:01:32.001376 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9"] Dec 04 22:01:32.009271 master-0 
kubenswrapper[8606]: I1204 22:01:32.008330 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.009271 master-0 kubenswrapper[8606]: I1204 22:01:32.008393 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.009271 master-0 kubenswrapper[8606]: I1204 22:01:32.008432 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.009271 master-0 kubenswrapper[8606]: I1204 22:01:32.008656 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.009271 master-0 kubenswrapper[8606]: I1204 22:01:32.008731 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2jq\" (UniqueName: \"kubernetes.io/projected/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-kube-api-access-pt2jq\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.021025 master-0 kubenswrapper[8606]: I1204 22:01:32.015370 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8"] Dec 04 22:01:32.021025 master-0 kubenswrapper[8606]: I1204 22:01:32.016581 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.021025 master-0 kubenswrapper[8606]: I1204 22:01:32.018690 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-fj6qn" Dec 04 22:01:32.021025 master-0 kubenswrapper[8606]: I1204 22:01:32.019718 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 22:01:32.021025 master-0 kubenswrapper[8606]: I1204 22:01:32.019735 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 22:01:32.029659 master-0 kubenswrapper[8606]: I1204 22:01:32.025288 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8"] Dec 04 22:01:32.112585 master-0 kubenswrapper[8606]: I1204 22:01:32.112204 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.112585 master-0 kubenswrapper[8606]: I1204 22:01:32.112254 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsk29\" (UniqueName: \"kubernetes.io/projected/967bf4ac-f025-4296-8ed9-183a345f6b7c-kube-api-access-hsk29\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.113942 master-0 kubenswrapper[8606]: I1204 22:01:32.112833 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.113942 master-0 kubenswrapper[8606]: I1204 22:01:32.112986 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.113942 master-0 kubenswrapper[8606]: I1204 22:01:32.113028 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.113942 master-0 kubenswrapper[8606]: I1204 22:01:32.113062 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " 
pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.113942 master-0 kubenswrapper[8606]: I1204 22:01:32.113065 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2jq\" (UniqueName: \"kubernetes.io/projected/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-kube-api-access-pt2jq\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.113942 master-0 kubenswrapper[8606]: I1204 22:01:32.113131 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.113942 master-0 kubenswrapper[8606]: I1204 22:01:32.113176 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.113942 master-0 kubenswrapper[8606]: I1204 22:01:32.113885 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.122647 master-0 kubenswrapper[8606]: I1204 22:01:32.121678 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.122647 master-0 kubenswrapper[8606]: I1204 22:01:32.121819 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.139302 master-0 kubenswrapper[8606]: I1204 22:01:32.139250 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2jq\" (UniqueName: \"kubernetes.io/projected/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-kube-api-access-pt2jq\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.141823 master-0 kubenswrapper[8606]: I1204 22:01:32.141778 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx"] Dec 04 22:01:32.142640 master-0 kubenswrapper[8606]: I1204 22:01:32.142585 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.145557 master-0 kubenswrapper[8606]: I1204 22:01:32.145519 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-x7b78" Dec 04 22:01:32.148529 master-0 kubenswrapper[8606]: I1204 22:01:32.145753 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 22:01:32.148529 master-0 kubenswrapper[8606]: I1204 22:01:32.145869 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 22:01:32.148529 master-0 kubenswrapper[8606]: I1204 22:01:32.147347 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 22:01:32.194463 master-0 kubenswrapper[8606]: I1204 22:01:32.193701 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx"] Dec 04 22:01:32.219058 master-0 kubenswrapper[8606]: I1204 22:01:32.213943 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nrj9\" (UniqueName: \"kubernetes.io/projected/810c363b-a4c7-428d-a2fb-285adc29f477-kube-api-access-2nrj9\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.219058 master-0 kubenswrapper[8606]: I1204 22:01:32.214025 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.219058 master-0 kubenswrapper[8606]: I1204 22:01:32.214079 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.219058 master-0 kubenswrapper[8606]: I1204 22:01:32.214105 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/810c363b-a4c7-428d-a2fb-285adc29f477-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.219058 master-0 kubenswrapper[8606]: I1204 22:01:32.214161 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.219058 master-0 kubenswrapper[8606]: I1204 22:01:32.214189 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hsk29\" (UniqueName: \"kubernetes.io/projected/967bf4ac-f025-4296-8ed9-183a345f6b7c-kube-api-access-hsk29\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.233106 master-0 kubenswrapper[8606]: I1204 22:01:32.233053 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.235213 master-0 kubenswrapper[8606]: I1204 22:01:32.235175 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsk29\" (UniqueName: \"kubernetes.io/projected/967bf4ac-f025-4296-8ed9-183a345f6b7c-kube-api-access-hsk29\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.238166 master-0 kubenswrapper[8606]: I1204 22:01:32.238012 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.250153 master-0 kubenswrapper[8606]: I1204 22:01:32.249486 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt"] Dec 04 22:01:32.250760 master-0 kubenswrapper[8606]: I1204 22:01:32.250741 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.254396 master-0 kubenswrapper[8606]: I1204 22:01:32.254338 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 22:01:32.274827 master-0 kubenswrapper[8606]: I1204 22:01:32.274756 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt"] Dec 04 22:01:32.285073 master-0 kubenswrapper[8606]: I1204 22:01:32.285013 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:01:32.308902 master-0 kubenswrapper[8606]: I1204 22:01:32.308852 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-55965856b6-7vlpp"] Dec 04 22:01:32.309575 master-0 kubenswrapper[8606]: I1204 22:01:32.309546 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.311771 master-0 kubenswrapper[8606]: I1204 22:01:32.311719 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 04 22:01:32.313314 master-0 kubenswrapper[8606]: I1204 22:01:32.313283 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-n8qln" Dec 04 22:01:32.313453 master-0 kubenswrapper[8606]: I1204 22:01:32.313430 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 04 22:01:32.313581 master-0 kubenswrapper[8606]: I1204 22:01:32.313556 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314614 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314645 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nrj9\" (UniqueName: \"kubernetes.io/projected/810c363b-a4c7-428d-a2fb-285adc29f477-kube-api-access-2nrj9\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314691 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7d9j\" (UniqueName: \"kubernetes.io/projected/e7a7f632-2442-4837-b068-c22b03c71fb0-kube-api-access-c7d9j\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314714 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/810c363b-a4c7-428d-a2fb-285adc29f477-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314730 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314767 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314795 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2d142201-6e77-4828-b86b-05d4144a2f08-snapshots\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314813 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314832 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cblk\" (UniqueName: \"kubernetes.io/projected/2d142201-6e77-4828-b86b-05d4144a2f08-kube-api-access-6cblk\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314854 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.314944 master-0 kubenswrapper[8606]: I1204 22:01:32.314876 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.315622 master-0 kubenswrapper[8606]: I1204 22:01:32.315567 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 04 22:01:32.316375 master-0 kubenswrapper[8606]: I1204 22:01:32.316332 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/810c363b-a4c7-428d-a2fb-285adc29f477-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.321998 master-0 kubenswrapper[8606]: I1204 22:01:32.321948 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 04 22:01:32.327488 master-0 kubenswrapper[8606]: I1204 22:01:32.327432 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 
04 22:01:32.331005 master-0 kubenswrapper[8606]: I1204 22:01:32.330964 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-55965856b6-7vlpp"] Dec 04 22:01:32.333231 master-0 kubenswrapper[8606]: I1204 22:01:32.333159 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nrj9\" (UniqueName: \"kubernetes.io/projected/810c363b-a4c7-428d-a2fb-285adc29f477-kube-api-access-2nrj9\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.349958 master-0 kubenswrapper[8606]: I1204 22:01:32.349884 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:32.418467 master-0 kubenswrapper[8606]: I1204 22:01:32.418394 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7d9j\" (UniqueName: \"kubernetes.io/projected/e7a7f632-2442-4837-b068-c22b03c71fb0-kube-api-access-c7d9j\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.419183 master-0 kubenswrapper[8606]: I1204 22:01:32.418478 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.419183 master-0 kubenswrapper[8606]: I1204 22:01:32.418565 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.419183 master-0 kubenswrapper[8606]: I1204 22:01:32.418611 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2d142201-6e77-4828-b86b-05d4144a2f08-snapshots\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.419183 master-0 kubenswrapper[8606]: I1204 22:01:32.418632 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.419183 master-0 kubenswrapper[8606]: I1204 22:01:32.418656 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cblk\" (UniqueName: \"kubernetes.io/projected/2d142201-6e77-4828-b86b-05d4144a2f08-kube-api-access-6cblk\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.419183 master-0 kubenswrapper[8606]: I1204 22:01:32.418687 8606 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.419183 master-0 kubenswrapper[8606]: I1204 22:01:32.418717 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.424734 master-0 kubenswrapper[8606]: I1204 22:01:32.424688 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.425302 master-0 kubenswrapper[8606]: I1204 22:01:32.425239 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2d142201-6e77-4828-b86b-05d4144a2f08-snapshots\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.425368 master-0 kubenswrapper[8606]: I1204 22:01:32.425314 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.427294 master-0 kubenswrapper[8606]: I1204 22:01:32.427269 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.427387 master-0 kubenswrapper[8606]: I1204 22:01:32.427295 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.428013 master-0 kubenswrapper[8606]: I1204 22:01:32.427959 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.442797 master-0 kubenswrapper[8606]: I1204 22:01:32.442739 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7d9j\" (UniqueName: \"kubernetes.io/projected/e7a7f632-2442-4837-b068-c22b03c71fb0-kube-api-access-c7d9j\") pod 
\"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.442998 master-0 kubenswrapper[8606]: I1204 22:01:32.442922 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cblk\" (UniqueName: \"kubernetes.io/projected/2d142201-6e77-4828-b86b-05d4144a2f08-kube-api-access-6cblk\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.526539 master-0 kubenswrapper[8606]: I1204 22:01:32.526235 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw"] Dec 04 22:01:32.527045 master-0 kubenswrapper[8606]: I1204 22:01:32.526973 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:01:32.531896 master-0 kubenswrapper[8606]: I1204 22:01:32.531830 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 04 22:01:32.532445 master-0 kubenswrapper[8606]: I1204 22:01:32.532417 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-69625" Dec 04 22:01:32.542022 master-0 kubenswrapper[8606]: I1204 22:01:32.541941 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw"] Dec 04 22:01:32.594765 master-0 kubenswrapper[8606]: I1204 22:01:32.594639 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:32.628426 master-0 kubenswrapper[8606]: I1204 22:01:32.628368 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:01:32.628426 master-0 kubenswrapper[8606]: I1204 22:01:32.628441 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jlvp\" (UniqueName: \"kubernetes.io/projected/a043ea49-97f9-4ae6-83b9-733f12754d94-kube-api-access-6jlvp\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:01:32.662967 master-0 kubenswrapper[8606]: I1204 22:01:32.662909 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:32.683432 master-0 kubenswrapper[8606]: I1204 22:01:32.682817 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:01:32.706300 master-0 kubenswrapper[8606]: I1204 22:01:32.706202 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba","Type":"ContainerStarted","Data":"7f95f72da52c53d3c8d88cdae7b632b1e707bccffe42c9e45b84331a1108d0c6"} Dec 04 22:01:32.706575 master-0 kubenswrapper[8606]: I1204 22:01:32.706346 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba","Type":"ContainerStarted","Data":"ec44f98a134fed3f7d27e7c218ca88ef4cd2ac21b667420e0029267e424b27bd"} Dec 04 22:01:32.708653 master-0 kubenswrapper[8606]: I1204 22:01:32.708607 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerStarted","Data":"9cd533ead3cae5dfe671d9ed582ce0c4fe3846e3703c3c3ebeec9d68db23459a"} Dec 04 22:01:32.708731 master-0 kubenswrapper[8606]: I1204 22:01:32.708656 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerStarted","Data":"f19df2e06dca5a2f80ab8037e49629477ed1cac1328bfc7445b4bdab076568fc"} Dec 04 22:01:32.747388 master-0 kubenswrapper[8606]: I1204 22:01:32.742961 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj"] Dec 04 22:01:32.747388 master-0 kubenswrapper[8606]: I1204 22:01:32.743174 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=2.7431469870000003 podStartE2EDuration="2.743146987s" podCreationTimestamp="2025-12-04 22:01:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:32.739313998 +0000 UTC m=+57.549616213" watchObservedRunningTime="2025-12-04 22:01:32.743146987 +0000 UTC m=+57.553449202" Dec 04 22:01:32.747388 master-0 kubenswrapper[8606]: I1204 22:01:32.744901 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jlvp\" (UniqueName: \"kubernetes.io/projected/a043ea49-97f9-4ae6-83b9-733f12754d94-kube-api-access-6jlvp\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:01:32.747388 master-0 kubenswrapper[8606]: I1204 22:01:32.745266 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:01:32.754595 master-0 kubenswrapper[8606]: I1204 22:01:32.749236 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert\") pod 
\"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:01:32.761740 master-0 kubenswrapper[8606]: I1204 22:01:32.761676 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jlvp\" (UniqueName: \"kubernetes.io/projected/a043ea49-97f9-4ae6-83b9-733f12754d94-kube-api-access-6jlvp\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:01:32.819361 master-0 kubenswrapper[8606]: I1204 22:01:32.819299 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8"] Dec 04 22:01:32.870189 master-0 kubenswrapper[8606]: I1204 22:01:32.870138 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:01:33.565959 master-0 kubenswrapper[8606]: I1204 22:01:33.562225 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx"] Dec 04 22:01:33.591540 master-0 kubenswrapper[8606]: W1204 22:01:33.589818 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3899a38_39b8_4b48_81e5_4d8854ecc8ab.slice/crio-331b7dcdf8e2d87b8a376fb7581bf5553e5209af627a95ddb5944fe5be12cb6d WatchSource:0}: Error finding container 331b7dcdf8e2d87b8a376fb7581bf5553e5209af627a95ddb5944fe5be12cb6d: Status 404 returned error can't find the container with id 331b7dcdf8e2d87b8a376fb7581bf5553e5209af627a95ddb5944fe5be12cb6d Dec 04 22:01:33.625546 master-0 kubenswrapper[8606]: I1204 22:01:33.622359 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx"] Dec 04 22:01:33.625546 master-0 kubenswrapper[8606]: I1204 22:01:33.623311 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.633593 master-0 kubenswrapper[8606]: I1204 22:01:33.630821 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx"] Dec 04 22:01:33.633593 master-0 kubenswrapper[8606]: I1204 22:01:33.631410 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-dtp6d" Dec 04 22:01:33.633593 master-0 kubenswrapper[8606]: I1204 22:01:33.631625 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 22:01:33.633593 master-0 kubenswrapper[8606]: I1204 22:01:33.631842 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 22:01:33.633593 master-0 kubenswrapper[8606]: I1204 22:01:33.632006 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 22:01:33.633593 master-0 kubenswrapper[8606]: I1204 22:01:33.633246 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 22:01:33.639533 master-0 kubenswrapper[8606]: I1204 22:01:33.637733 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 22:01:33.667783 master-0 kubenswrapper[8606]: I1204 22:01:33.663322 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8g99\" (UniqueName: \"kubernetes.io/projected/55c4f1e1-1b78-45ec-915d-8055ab3e2786-kube-api-access-b8g99\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.667783 master-0 kubenswrapper[8606]: I1204 22:01:33.663447 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.667783 master-0 kubenswrapper[8606]: I1204 22:01:33.663482 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.667783 master-0 kubenswrapper[8606]: I1204 22:01:33.663562 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.715178 master-0 kubenswrapper[8606]: I1204 22:01:33.715126 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" event={"ID":"967bf4ac-f025-4296-8ed9-183a345f6b7c","Type":"ContainerStarted","Data":"aab8d5d7d7caaf80016fb84803d68f187962ac87f50be8e340ea0edecd46547b"} Dec 04 22:01:33.715926 master-0 kubenswrapper[8606]: I1204 22:01:33.715898 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"0a8ac4004225e98679de5f00828ef4b72b059bfd913e3c0b107c1aef5ccb1667"} Dec 04 22:01:33.718313 master-0 kubenswrapper[8606]: I1204 22:01:33.718269 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"331b7dcdf8e2d87b8a376fb7581bf5553e5209af627a95ddb5944fe5be12cb6d"} Dec 04 22:01:33.766575 master-0 kubenswrapper[8606]: I1204 22:01:33.764087 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8g99\" (UniqueName: \"kubernetes.io/projected/55c4f1e1-1b78-45ec-915d-8055ab3e2786-kube-api-access-b8g99\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.766575 master-0 kubenswrapper[8606]: I1204 22:01:33.764155 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.766575 master-0 kubenswrapper[8606]: I1204 22:01:33.764400 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.766575 master-0 kubenswrapper[8606]: I1204 22:01:33.764644 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.766575 master-0 kubenswrapper[8606]: I1204 22:01:33.765330 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.767025 master-0 kubenswrapper[8606]: I1204 22:01:33.766765 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " 
pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.768690 master-0 kubenswrapper[8606]: I1204 22:01:33.768462 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.783324 master-0 kubenswrapper[8606]: I1204 22:01:33.783253 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8g99\" (UniqueName: \"kubernetes.io/projected/55c4f1e1-1b78-45ec-915d-8055ab3e2786-kube-api-access-b8g99\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.814109 master-0 kubenswrapper[8606]: I1204 22:01:33.813778 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p"] Dec 04 22:01:33.815089 master-0 kubenswrapper[8606]: I1204 22:01:33.815050 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.818918 master-0 kubenswrapper[8606]: I1204 22:01:33.818014 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 04 22:01:33.818918 master-0 kubenswrapper[8606]: I1204 22:01:33.818247 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 04 22:01:33.819590 master-0 kubenswrapper[8606]: I1204 22:01:33.819363 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sf4xn" Dec 04 22:01:33.819800 master-0 kubenswrapper[8606]: I1204 22:01:33.819752 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 04 22:01:33.820538 master-0 kubenswrapper[8606]: I1204 22:01:33.820491 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:01:33.821028 master-0 kubenswrapper[8606]: I1204 22:01:33.820942 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:01:33.866079 master-0 kubenswrapper[8606]: I1204 22:01:33.866020 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d178411-75a4-4ff8-9764-6f3e3944eca4-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.866079 master-0 kubenswrapper[8606]: I1204 22:01:33.866073 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwfwd\" 
(UniqueName: \"kubernetes.io/projected/3d178411-75a4-4ff8-9764-6f3e3944eca4-kube-api-access-jwfwd\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.866247 master-0 kubenswrapper[8606]: I1204 22:01:33.866109 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.866247 master-0 kubenswrapper[8606]: I1204 22:01:33.866147 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-images\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.866247 master-0 kubenswrapper[8606]: I1204 22:01:33.866171 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3d178411-75a4-4ff8-9764-6f3e3944eca4-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.970581 master-0 kubenswrapper[8606]: I1204 22:01:33.969104 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-images\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.970581 master-0 kubenswrapper[8606]: I1204 22:01:33.969151 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3d178411-75a4-4ff8-9764-6f3e3944eca4-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.970581 master-0 kubenswrapper[8606]: I1204 22:01:33.969210 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d178411-75a4-4ff8-9764-6f3e3944eca4-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.970581 master-0 kubenswrapper[8606]: I1204 22:01:33.969243 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwfwd\" 
(UniqueName: \"kubernetes.io/projected/3d178411-75a4-4ff8-9764-6f3e3944eca4-kube-api-access-jwfwd\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.970581 master-0 kubenswrapper[8606]: I1204 22:01:33.969268 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.970581 master-0 kubenswrapper[8606]: I1204 22:01:33.970034 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-images\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.970581 master-0 kubenswrapper[8606]: I1204 22:01:33.970097 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.971277 master-0 kubenswrapper[8606]: I1204 22:01:33.970687 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3d178411-75a4-4ff8-9764-6f3e3944eca4-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.973859 master-0 kubenswrapper[8606]: I1204 22:01:33.973821 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d178411-75a4-4ff8-9764-6f3e3944eca4-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:33.992102 master-0 kubenswrapper[8606]: I1204 22:01:33.992062 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:01:33.993366 master-0 kubenswrapper[8606]: I1204 22:01:33.993303 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwfwd\" (UniqueName: \"kubernetes.io/projected/3d178411-75a4-4ff8-9764-6f3e3944eca4-kube-api-access-jwfwd\") pod \"cluster-cloud-controller-manager-operator-74f484689c-nr72p\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:34.169403 master-0 kubenswrapper[8606]: I1204 22:01:34.169353 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:01:34.197313 master-0 kubenswrapper[8606]: I1204 22:01:34.184863 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt"] Dec 04 22:01:34.228949 master-0 kubenswrapper[8606]: W1204 22:01:34.228891 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7a7f632_2442_4837_b068_c22b03c71fb0.slice/crio-4d378b74b84c73d0247bb3e1b1ed1084c6d9b778481316b2ed8d047f73fdee53 WatchSource:0}: Error finding container 4d378b74b84c73d0247bb3e1b1ed1084c6d9b778481316b2ed8d047f73fdee53: Status 404 returned error can't find the container with id 4d378b74b84c73d0247bb3e1b1ed1084c6d9b778481316b2ed8d047f73fdee53 Dec 04 22:01:34.249205 master-0 kubenswrapper[8606]: I1204 22:01:34.249175 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw"] Dec 04 22:01:34.368664 master-0 kubenswrapper[8606]: I1204 22:01:34.367258 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-55965856b6-7vlpp"] Dec 04 22:01:34.474719 master-0 kubenswrapper[8606]: I1204 22:01:34.474678 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-88d48b57d-pp4fd"] Dec 04 22:01:34.475877 master-0 kubenswrapper[8606]: I1204 22:01:34.475861 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.493981 master-0 kubenswrapper[8606]: I1204 22:01:34.493940 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mw665" Dec 04 22:01:34.494261 master-0 kubenswrapper[8606]: I1204 22:01:34.494230 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 22:01:34.494662 master-0 kubenswrapper[8606]: I1204 22:01:34.494626 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 22:01:34.494908 master-0 kubenswrapper[8606]: I1204 22:01:34.494883 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 22:01:34.497608 master-0 kubenswrapper[8606]: I1204 22:01:34.497495 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-88d48b57d-pp4fd"] Dec 04 22:01:34.505890 master-0 kubenswrapper[8606]: I1204 22:01:34.502377 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.505890 master-0 kubenswrapper[8606]: I1204 22:01:34.502455 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.505890 master-0 kubenswrapper[8606]: I1204 22:01:34.502481 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.505890 master-0 kubenswrapper[8606]: I1204 22:01:34.502551 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-kube-api-access-4g7n9\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.604689 master-0 kubenswrapper[8606]: I1204 22:01:34.604637 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.604689 master-0 kubenswrapper[8606]: I1204 22:01:34.604695 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7n9\" (UniqueName: 
\"kubernetes.io/projected/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-kube-api-access-4g7n9\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.605247 master-0 kubenswrapper[8606]: I1204 22:01:34.604804 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.606026 master-0 kubenswrapper[8606]: I1204 22:01:34.605974 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.608078 master-0 kubenswrapper[8606]: I1204 22:01:34.608052 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.609748 master-0 kubenswrapper[8606]: I1204 22:01:34.608854 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.615698 master-0 kubenswrapper[8606]: I1204 22:01:34.614928 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.634680 master-0 kubenswrapper[8606]: I1204 22:01:34.630545 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-kube-api-access-4g7n9\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.652305 master-0 kubenswrapper[8606]: I1204 22:01:34.651921 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx"] Dec 04 22:01:34.681522 master-0 kubenswrapper[8606]: W1204 22:01:34.679646 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c4f1e1_1b78_45ec_915d_8055ab3e2786.slice/crio-7e789ca56ff169f6e94ee218684222ae21d64421f7ece2b51fa33799d9ab1ccc WatchSource:0}: Error finding container 7e789ca56ff169f6e94ee218684222ae21d64421f7ece2b51fa33799d9ab1ccc: Status 404 returned error can't find the container with id 7e789ca56ff169f6e94ee218684222ae21d64421f7ece2b51fa33799d9ab1ccc 
Dec 04 22:01:34.740322 master-0 kubenswrapper[8606]: I1204 22:01:34.738193 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-7vlpp" event={"ID":"2d142201-6e77-4828-b86b-05d4144a2f08","Type":"ContainerStarted","Data":"1ed0b431491d7769d0a806c2775a07d37b29bbfb434d8d9d3536f46e64b03c26"} Dec 04 22:01:34.740660 master-0 kubenswrapper[8606]: I1204 22:01:34.740619 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerStarted","Data":"db48b3f4e1a29b857cceceb534352169660eed12f652161e8b97983c91525c06"} Dec 04 22:01:34.750537 master-0 kubenswrapper[8606]: I1204 22:01:34.749671 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" event={"ID":"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b","Type":"ContainerStarted","Data":"6528b925137b080482c3192c669cf9e80d961a303d43da94e9d385cdc0ad11de"} Dec 04 22:01:34.750537 master-0 kubenswrapper[8606]: I1204 22:01:34.749738 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" event={"ID":"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b","Type":"ContainerStarted","Data":"58d0dda8eccec66389fe599f3b47f626740432dcc607d4fd26c725e332dfe13e"} Dec 04 22:01:34.753560 master-0 kubenswrapper[8606]: I1204 22:01:34.753479 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" event={"ID":"3d178411-75a4-4ff8-9764-6f3e3944eca4","Type":"ContainerStarted","Data":"118140813e360bb571639fba61c3203c7502dddd911daa90a8dc26b3b0128dfb"} Dec 04 22:01:34.757658 master-0 kubenswrapper[8606]: I1204 22:01:34.757541 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" event={"ID":"e7a7f632-2442-4837-b068-c22b03c71fb0","Type":"ContainerStarted","Data":"a27f538b778d0ea81aed785fdee98c12d30d1580e7e14efd70e9fce3d624a217"} Dec 04 22:01:34.757658 master-0 kubenswrapper[8606]: I1204 22:01:34.757605 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" event={"ID":"e7a7f632-2442-4837-b068-c22b03c71fb0","Type":"ContainerStarted","Data":"4d378b74b84c73d0247bb3e1b1ed1084c6d9b778481316b2ed8d047f73fdee53"} Dec 04 22:01:34.758684 master-0 kubenswrapper[8606]: I1204 22:01:34.758577 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:34.761122 master-0 kubenswrapper[8606]: I1204 22:01:34.761097 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerStarted","Data":"7e789ca56ff169f6e94ee218684222ae21d64421f7ece2b51fa33799d9ab1ccc"} Dec 04 22:01:34.763300 master-0 kubenswrapper[8606]: I1204 22:01:34.763277 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" event={"ID":"967bf4ac-f025-4296-8ed9-183a345f6b7c","Type":"ContainerStarted","Data":"83b6f31f110e8f986286c8858fee161cd2cfbc203132898174a8f254b84462a0"} Dec 04 22:01:34.763951 master-0 kubenswrapper[8606]: I1204 22:01:34.763928 
8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:34.773881 master-0 kubenswrapper[8606]: I1204 22:01:34.773717 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" podStartSLOduration=2.93222081 podStartE2EDuration="5.773699316s" podCreationTimestamp="2025-12-04 22:01:29 +0000 UTC" firstStartedPulling="2025-12-04 22:01:30.857658645 +0000 UTC m=+55.667960870" lastFinishedPulling="2025-12-04 22:01:33.699137171 +0000 UTC m=+58.509439376" observedRunningTime="2025-12-04 22:01:34.770862656 +0000 UTC m=+59.581164881" watchObservedRunningTime="2025-12-04 22:01:34.773699316 +0000 UTC m=+59.584001531" Dec 04 22:01:34.774718 master-0 kubenswrapper[8606]: I1204 22:01:34.774571 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:01:34.780595 master-0 kubenswrapper[8606]: I1204 22:01:34.777120 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:01:34.792647 master-0 kubenswrapper[8606]: I1204 22:01:34.792481 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" podStartSLOduration=3.792455865 podStartE2EDuration="3.792455865s" podCreationTimestamp="2025-12-04 22:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:34.790233242 +0000 UTC m=+59.600535457" watchObservedRunningTime="2025-12-04 22:01:34.792455865 +0000 UTC m=+59.602758080" Dec 04 22:01:34.808801 master-0 kubenswrapper[8606]: I1204 22:01:34.808773 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:01:34.824844 master-0 kubenswrapper[8606]: I1204 22:01:34.824757 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" podStartSLOduration=2.824734238 podStartE2EDuration="2.824734238s" podCreationTimestamp="2025-12-04 22:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:34.819844369 +0000 UTC m=+59.630146604" watchObservedRunningTime="2025-12-04 22:01:34.824734238 +0000 UTC m=+59.635036453" Dec 04 22:01:35.288688 master-0 kubenswrapper[8606]: I1204 22:01:35.288626 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-88d48b57d-pp4fd"] Dec 04 22:01:35.671555 master-0 kubenswrapper[8606]: I1204 22:01:35.671250 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vvkjf"] Dec 04 22:01:35.672540 master-0 kubenswrapper[8606]: I1204 22:01:35.672295 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.675576 master-0 kubenswrapper[8606]: I1204 22:01:35.674895 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-4nxv4" Dec 04 22:01:35.701035 master-0 kubenswrapper[8606]: I1204 22:01:35.700996 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvkjf"] Dec 04 22:01:35.722672 master-0 kubenswrapper[8606]: I1204 22:01:35.722612 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn"] Dec 04 22:01:35.724065 master-0 kubenswrapper[8606]: I1204 22:01:35.724038 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.730912 master-0 kubenswrapper[8606]: I1204 22:01:35.730750 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn"] Dec 04 22:01:35.731196 master-0 kubenswrapper[8606]: I1204 22:01:35.731182 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 22:01:35.752979 master-0 kubenswrapper[8606]: I1204 22:01:35.752919 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-catalog-content\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.753543 master-0 kubenswrapper[8606]: I1204 22:01:35.753479 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-utilities\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.753673 master-0 kubenswrapper[8606]: I1204 22:01:35.753649 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwx5k\" (UniqueName: \"kubernetes.io/projected/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-kube-api-access-mwx5k\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.776983 master-0 kubenswrapper[8606]: I1204 22:01:35.776827 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerStarted","Data":"546a138e440107c12bdd5a4d067e8d3169d68e83def0740c6c5856f29b09acfd"} Dec 04 22:01:35.776983 master-0 kubenswrapper[8606]: I1204 22:01:35.776906 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerStarted","Data":"1326348e3e2d8bc1da0a2a933c011acb7d00a92e48a2890745437b6f15960271"} Dec 04 22:01:35.801897 master-0 kubenswrapper[8606]: I1204 22:01:35.800914 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" 
podStartSLOduration=2.8008729409999997 podStartE2EDuration="2.800872941s" podCreationTimestamp="2025-12-04 22:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:01:35.799585365 +0000 UTC m=+60.609887580" watchObservedRunningTime="2025-12-04 22:01:35.800872941 +0000 UTC m=+60.611175156" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.854777 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-utilities\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.854868 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.854914 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-tmpfs\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.854957 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpksd\" (UniqueName: \"kubernetes.io/projected/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-kube-api-access-gpksd\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.854984 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwx5k\" (UniqueName: \"kubernetes.io/projected/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-kube-api-access-mwx5k\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.855014 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-catalog-content\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.855032 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.855536 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-utilities\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.856227 master-0 kubenswrapper[8606]: I1204 22:01:35.856184 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-catalog-content\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.865752 master-0 kubenswrapper[8606]: I1204 22:01:35.865675 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sw6sx"] Dec 04 22:01:35.869136 master-0 kubenswrapper[8606]: I1204 22:01:35.869094 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:35.876999 master-0 kubenswrapper[8606]: I1204 22:01:35.875387 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-pjht7" Dec 04 22:01:35.886691 master-0 kubenswrapper[8606]: I1204 22:01:35.886596 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwx5k\" (UniqueName: \"kubernetes.io/projected/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-kube-api-access-mwx5k\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:35.887626 master-0 kubenswrapper[8606]: I1204 22:01:35.887570 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sw6sx"] Dec 04 22:01:35.972953 master-0 kubenswrapper[8606]: I1204 22:01:35.972211 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.972953 master-0 kubenswrapper[8606]: I1204 22:01:35.972348 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-tmpfs\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.972953 master-0 kubenswrapper[8606]: I1204 22:01:35.972875 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpksd\" (UniqueName: \"kubernetes.io/projected/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-kube-api-access-gpksd\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.974104 master-0 kubenswrapper[8606]: I1204 22:01:35.973452 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" 
Dec 04 22:01:35.976209 master-0 kubenswrapper[8606]: I1204 22:01:35.975670 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-tmpfs\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.989246 master-0 kubenswrapper[8606]: I1204 22:01:35.989131 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:35.989709 master-0 kubenswrapper[8606]: I1204 22:01:35.989666 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:36.016630 master-0 kubenswrapper[8606]: I1204 22:01:36.000240 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpksd\" (UniqueName: \"kubernetes.io/projected/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-kube-api-access-gpksd\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:36.016630 master-0 kubenswrapper[8606]: I1204 22:01:36.011605 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:01:36.054017 master-0 kubenswrapper[8606]: I1204 22:01:36.053947 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:01:36.075036 master-0 kubenswrapper[8606]: I1204 22:01:36.074964 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9rxt\" (UniqueName: \"kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.075036 master-0 kubenswrapper[8606]: I1204 22:01:36.075020 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.075739 master-0 kubenswrapper[8606]: I1204 22:01:36.075682 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.178076 master-0 kubenswrapper[8606]: I1204 22:01:36.177822 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.178076 master-0 kubenswrapper[8606]: I1204 22:01:36.177962 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rxt\" (UniqueName: \"kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.178076 master-0 kubenswrapper[8606]: I1204 22:01:36.177997 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.178859 master-0 kubenswrapper[8606]: I1204 22:01:36.178811 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.178955 master-0 kubenswrapper[8606]: I1204 22:01:36.178928 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.197427 master-0 kubenswrapper[8606]: I1204 22:01:36.197344 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t9rxt\" (UniqueName: \"kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.205368 master-0 kubenswrapper[8606]: I1204 22:01:36.205292 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 04 22:01:36.205673 master-0 kubenswrapper[8606]: I1204 22:01:36.205552 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" containerName="installer" containerID="cri-o://63eed725435ffc8fd80ef462cf5e0dcb22612c336b3cabe2d142c84feedc099e" gracePeriod=30 Dec 04 22:01:36.218050 master-0 kubenswrapper[8606]: I1204 22:01:36.217990 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:01:36.783669 master-0 kubenswrapper[8606]: I1204 22:01:36.783445 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerStarted","Data":"c33c6e351b6426a43cd389bbd81cef5f132f38999fc440de5ea48da556537499"} Dec 04 22:01:37.265177 master-0 kubenswrapper[8606]: I1204 22:01:37.265117 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sdrkm"] Dec 04 22:01:37.266372 master-0 kubenswrapper[8606]: I1204 22:01:37.266342 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.268408 master-0 kubenswrapper[8606]: I1204 22:01:37.268365 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-psdkj" Dec 04 22:01:37.275894 master-0 kubenswrapper[8606]: I1204 22:01:37.275831 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sdrkm"] Dec 04 22:01:37.397600 master-0 kubenswrapper[8606]: I1204 22:01:37.397534 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-utilities\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.397826 master-0 kubenswrapper[8606]: I1204 22:01:37.397722 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-catalog-content\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.397826 master-0 kubenswrapper[8606]: I1204 22:01:37.397797 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gj4j\" (UniqueName: \"kubernetes.io/projected/ae107ad4-104c-4264-9844-afb3af28b19e-kube-api-access-9gj4j\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.499353 master-0 kubenswrapper[8606]: I1204 22:01:37.499264 8606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-utilities\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.499621 master-0 kubenswrapper[8606]: I1204 22:01:37.499374 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-catalog-content\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.499621 master-0 kubenswrapper[8606]: I1204 22:01:37.499419 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gj4j\" (UniqueName: \"kubernetes.io/projected/ae107ad4-104c-4264-9844-afb3af28b19e-kube-api-access-9gj4j\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.500139 master-0 kubenswrapper[8606]: I1204 22:01:37.499975 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-catalog-content\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.500424 master-0 kubenswrapper[8606]: I1204 22:01:37.500399 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-utilities\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.520526 master-0 kubenswrapper[8606]: I1204 22:01:37.520467 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gj4j\" (UniqueName: \"kubernetes.io/projected/ae107ad4-104c-4264-9844-afb3af28b19e-kube-api-access-9gj4j\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.595206 master-0 kubenswrapper[8606]: I1204 22:01:37.595010 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:01:37.760082 master-0 kubenswrapper[8606]: I1204 22:01:37.759975 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ppnv8"] Dec 04 22:01:37.761028 master-0 kubenswrapper[8606]: I1204 22:01:37.760955 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:37.765043 master-0 kubenswrapper[8606]: I1204 22:01:37.764999 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-sdd6h" Dec 04 22:01:37.766474 master-0 kubenswrapper[8606]: I1204 22:01:37.766451 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 22:01:37.904006 master-0 kubenswrapper[8606]: I1204 22:01:37.903908 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:37.904918 master-0 kubenswrapper[8606]: I1204 22:01:37.904061 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:37.904918 master-0 kubenswrapper[8606]: I1204 22:01:37.904088 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc47q\" (UniqueName: \"kubernetes.io/projected/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-kube-api-access-jc47q\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:37.904918 master-0 kubenswrapper[8606]: I1204 22:01:37.904114 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-rootfs\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.005707 master-0 kubenswrapper[8606]: I1204 22:01:38.005617 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.005707 master-0 kubenswrapper[8606]: I1204 22:01:38.005668 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc47q\" (UniqueName: \"kubernetes.io/projected/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-kube-api-access-jc47q\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.005707 master-0 kubenswrapper[8606]: I1204 22:01:38.005703 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-rootfs\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.006301 master-0 
kubenswrapper[8606]: I1204 22:01:38.005759 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.006301 master-0 kubenswrapper[8606]: I1204 22:01:38.006013 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-rootfs\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.009549 master-0 kubenswrapper[8606]: I1204 22:01:38.009474 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.011099 master-0 kubenswrapper[8606]: I1204 22:01:38.011058 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.023017 master-0 kubenswrapper[8606]: I1204 22:01:38.022960 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc47q\" (UniqueName: \"kubernetes.io/projected/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-kube-api-access-jc47q\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.092365 master-0 kubenswrapper[8606]: I1204 22:01:38.092178 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:01:38.459046 master-0 kubenswrapper[8606]: I1204 22:01:38.458830 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zt44t"] Dec 04 22:01:38.469271 master-0 kubenswrapper[8606]: I1204 22:01:38.461708 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:38.471069 master-0 kubenswrapper[8606]: I1204 22:01:38.470823 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-h7rbd" Dec 04 22:01:38.484478 master-0 kubenswrapper[8606]: I1204 22:01:38.483196 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zt44t"] Dec 04 22:01:38.613845 master-0 kubenswrapper[8606]: I1204 22:01:38.613767 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-utilities\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:38.613845 master-0 kubenswrapper[8606]: I1204 22:01:38.613839 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-catalog-content\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:38.614536 master-0 kubenswrapper[8606]: I1204 22:01:38.614449 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/ce6002bb-4948-45ab-bb1d-ed65e86b6466-kube-api-access-87gv4\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:38.716388 master-0 kubenswrapper[8606]: I1204 22:01:38.716194 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/ce6002bb-4948-45ab-bb1d-ed65e86b6466-kube-api-access-87gv4\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:38.716388 master-0 kubenswrapper[8606]: I1204 22:01:38.716312 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-utilities\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:38.716827 master-0 kubenswrapper[8606]: I1204 22:01:38.716749 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-catalog-content\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:38.717625 master-0 kubenswrapper[8606]: I1204 22:01:38.717538 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-catalog-content\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:38.717625 master-0 kubenswrapper[8606]: I1204 22:01:38.717570 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-utilities\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:39.682930 master-0 kubenswrapper[8606]: I1204 22:01:39.682835 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 04 22:01:39.690614 master-0 kubenswrapper[8606]: I1204 22:01:39.690538 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 04 22:01:39.690858 master-0 kubenswrapper[8606]: I1204 22:01:39.690718 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.696409 master-0 kubenswrapper[8606]: I1204 22:01:39.696147 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wkcjf" Dec 04 22:01:39.789669 master-0 kubenswrapper[8606]: I1204 22:01:39.789615 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/ce6002bb-4948-45ab-bb1d-ed65e86b6466-kube-api-access-87gv4\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:39.838353 master-0 kubenswrapper[8606]: I1204 22:01:39.838292 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.838676 master-0 kubenswrapper[8606]: I1204 22:01:39.838657 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-var-lock\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.838893 master-0 kubenswrapper[8606]: I1204 22:01:39.838875 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.942992 master-0 kubenswrapper[8606]: I1204 22:01:39.942865 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.943424 master-0 kubenswrapper[8606]: I1204 22:01:39.943402 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.943838 master-0 kubenswrapper[8606]: 
I1204 22:01:39.943021 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.943921 master-0 kubenswrapper[8606]: I1204 22:01:39.943795 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-var-lock\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.944490 master-0 kubenswrapper[8606]: I1204 22:01:39.944474 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-var-lock\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:39.963328 master-0 kubenswrapper[8606]: I1204 22:01:39.963273 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:40.003678 master-0 kubenswrapper[8606]: I1204 22:01:40.002473 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:01:40.028004 master-0 kubenswrapper[8606]: I1204 22:01:40.027949 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:01:40.387399 master-0 kubenswrapper[8606]: I1204 22:01:40.387314 8606 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 04 22:01:40.387675 master-0 kubenswrapper[8606]: I1204 22:01:40.387636 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcdctl" containerID="cri-o://75e4e520a75639c893eb6ea15b07a3187aaf4dfc898564bd4832b04c7d30a431" gracePeriod=30 Dec 04 22:01:40.387837 master-0 kubenswrapper[8606]: I1204 22:01:40.387797 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcd" containerID="cri-o://b4b557e71fac173d7ebddbf04536e46989f934644030fceea9234231919b8e8f" gracePeriod=30 Dec 04 22:01:40.392800 master-0 kubenswrapper[8606]: I1204 22:01:40.392772 8606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Dec 04 22:01:40.393152 master-0 kubenswrapper[8606]: E1204 22:01:40.393127 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcd" Dec 04 22:01:40.393152 master-0 kubenswrapper[8606]: I1204 22:01:40.393149 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcd" Dec 04 22:01:40.393309 master-0 kubenswrapper[8606]: E1204 22:01:40.393163 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcdctl" Dec 04 22:01:40.393309 master-0 kubenswrapper[8606]: I1204 22:01:40.393171 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcdctl" Dec 04 22:01:40.393309 master-0 kubenswrapper[8606]: I1204 22:01:40.393304 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcd" Dec 04 22:01:40.393428 master-0 kubenswrapper[8606]: I1204 22:01:40.393324 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcdctl" Dec 04 22:01:40.412966 master-0 kubenswrapper[8606]: I1204 22:01:40.412901 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.451870 master-0 kubenswrapper[8606]: I1204 22:01:40.451811 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.452022 master-0 kubenswrapper[8606]: I1204 22:01:40.451889 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.452022 master-0 kubenswrapper[8606]: I1204 22:01:40.451933 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.452184 master-0 kubenswrapper[8606]: I1204 22:01:40.452087 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.452524 master-0 kubenswrapper[8606]: I1204 22:01:40.452363 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.452524 master-0 kubenswrapper[8606]: I1204 22:01:40.452416 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.553997 master-0 kubenswrapper[8606]: I1204 22:01:40.553855 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554217 master-0 kubenswrapper[8606]: I1204 22:01:40.554101 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554217 master-0 kubenswrapper[8606]: I1204 22:01:40.554155 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554317 master-0 kubenswrapper[8606]: I1204 22:01:40.554284 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" 
(UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554370 master-0 kubenswrapper[8606]: I1204 22:01:40.554310 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554416 master-0 kubenswrapper[8606]: I1204 22:01:40.554383 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554416 master-0 kubenswrapper[8606]: I1204 22:01:40.554391 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554492 master-0 kubenswrapper[8606]: I1204 22:01:40.554441 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554599 master-0 kubenswrapper[8606]: I1204 22:01:40.554559 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554673 master-0 kubenswrapper[8606]: I1204 22:01:40.554643 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554762 master-0 kubenswrapper[8606]: I1204 22:01:40.554686 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:40.554762 master-0 kubenswrapper[8606]: I1204 22:01:40.554707 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:01:41.836406 master-0 kubenswrapper[8606]: I1204 22:01:41.836340 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_b977a2cf-4e95-4456-957d-b1ba05c0d1ff/installer/0.log" Dec 04 22:01:41.836977 master-0 kubenswrapper[8606]: I1204 22:01:41.836448 8606 generic.go:334] "Generic (PLEG): container finished" podID="b977a2cf-4e95-4456-957d-b1ba05c0d1ff" 
containerID="8772a454066677ab94d735965a9fe39b0c5a1577a466faea547f6edb43aab31c" exitCode=1 Dec 04 22:01:41.836977 master-0 kubenswrapper[8606]: I1204 22:01:41.836533 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"b977a2cf-4e95-4456-957d-b1ba05c0d1ff","Type":"ContainerDied","Data":"8772a454066677ab94d735965a9fe39b0c5a1577a466faea547f6edb43aab31c"} Dec 04 22:01:43.024130 master-0 kubenswrapper[8606]: W1204 22:01:43.024070 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7eb3f9_ce05_4128_9a1e_dc1c42ded4eb.slice/crio-8ccadfcf02bee77f4b3f98d491b1ee8f4b7c03cb21fbe9104543e0f3a4c0e10a WatchSource:0}: Error finding container 8ccadfcf02bee77f4b3f98d491b1ee8f4b7c03cb21fbe9104543e0f3a4c0e10a: Status 404 returned error can't find the container with id 8ccadfcf02bee77f4b3f98d491b1ee8f4b7c03cb21fbe9104543e0f3a4c0e10a Dec 04 22:01:43.351707 master-0 kubenswrapper[8606]: I1204 22:01:43.351041 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_b977a2cf-4e95-4456-957d-b1ba05c0d1ff/installer/0.log" Dec 04 22:01:43.351707 master-0 kubenswrapper[8606]: I1204 22:01:43.351148 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:43.399555 master-0 kubenswrapper[8606]: I1204 22:01:43.398834 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-var-lock\") pod \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " Dec 04 22:01:43.399555 master-0 kubenswrapper[8606]: I1204 22:01:43.398901 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kubelet-dir\") pod \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " Dec 04 22:01:43.399555 master-0 kubenswrapper[8606]: I1204 22:01:43.398935 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-var-lock" (OuterVolumeSpecName: "var-lock") pod "b977a2cf-4e95-4456-957d-b1ba05c0d1ff" (UID: "b977a2cf-4e95-4456-957d-b1ba05c0d1ff"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:43.399555 master-0 kubenswrapper[8606]: I1204 22:01:43.398962 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kube-api-access\") pod \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\" (UID: \"b977a2cf-4e95-4456-957d-b1ba05c0d1ff\") " Dec 04 22:01:43.399555 master-0 kubenswrapper[8606]: I1204 22:01:43.398977 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b977a2cf-4e95-4456-957d-b1ba05c0d1ff" (UID: "b977a2cf-4e95-4456-957d-b1ba05c0d1ff"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:43.399555 master-0 kubenswrapper[8606]: I1204 22:01:43.399261 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:43.399555 master-0 kubenswrapper[8606]: I1204 22:01:43.399274 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:43.412157 master-0 kubenswrapper[8606]: I1204 22:01:43.412098 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b977a2cf-4e95-4456-957d-b1ba05c0d1ff" (UID: "b977a2cf-4e95-4456-957d-b1ba05c0d1ff"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:43.507976 master-0 kubenswrapper[8606]: I1204 22:01:43.504550 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b977a2cf-4e95-4456-957d-b1ba05c0d1ff-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:43.853062 master-0 kubenswrapper[8606]: I1204 22:01:43.852994 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" event={"ID":"3d178411-75a4-4ff8-9764-6f3e3944eca4","Type":"ContainerStarted","Data":"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228"} Dec 04 22:01:43.855566 master-0 kubenswrapper[8606]: I1204 22:01:43.855519 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-jb4xf_7f091088-2166-4026-9fa6-62bd83407edb/openshift-controller-manager-operator/0.log" Dec 04 22:01:43.855639 master-0 kubenswrapper[8606]: I1204 22:01:43.855566 8606 generic.go:334] "Generic (PLEG): container finished" podID="7f091088-2166-4026-9fa6-62bd83407edb" containerID="2da555718ea10aaf4197144683ccb4702237b92306aae894f469e5c551742616" exitCode=1 Dec 04 22:01:43.855639 master-0 kubenswrapper[8606]: I1204 22:01:43.855620 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerDied","Data":"2da555718ea10aaf4197144683ccb4702237b92306aae894f469e5c551742616"} Dec 04 22:01:43.856123 master-0 kubenswrapper[8606]: I1204 22:01:43.856102 8606 scope.go:117] "RemoveContainer" containerID="2da555718ea10aaf4197144683ccb4702237b92306aae894f469e5c551742616" Dec 04 22:01:43.858338 master-0 kubenswrapper[8606]: I1204 22:01:43.858317 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_b977a2cf-4e95-4456-957d-b1ba05c0d1ff/installer/0.log" Dec 04 22:01:43.858384 master-0 kubenswrapper[8606]: I1204 22:01:43.858370 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"b977a2cf-4e95-4456-957d-b1ba05c0d1ff","Type":"ContainerDied","Data":"6285458f93d5be2507cf69523363385cf66539c4aef333534749d36e6229ace7"} Dec 04 22:01:43.858422 master-0 kubenswrapper[8606]: I1204 22:01:43.858395 8606 
scope.go:117] "RemoveContainer" containerID="8772a454066677ab94d735965a9fe39b0c5a1577a466faea547f6edb43aab31c" Dec 04 22:01:43.858526 master-0 kubenswrapper[8606]: I1204 22:01:43.858484 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 04 22:01:43.874821 master-0 kubenswrapper[8606]: I1204 22:01:43.869447 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerStarted","Data":"b24a52101599e57bc25b6c160a06c23124bc447eb919bdd2267f0b91d0f6aaee"} Dec 04 22:01:43.884358 master-0 kubenswrapper[8606]: I1204 22:01:43.882963 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"93d74a7e351d1bb38ca66b99396fddaa338eac5fd2201ea238d97a8b16a1e1a0"} Dec 04 22:01:43.890001 master-0 kubenswrapper[8606]: I1204 22:01:43.889952 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-7vlpp" event={"ID":"2d142201-6e77-4828-b86b-05d4144a2f08","Type":"ContainerStarted","Data":"1129d1c5176ef3c828bda41dc553996cb75881e0c9229783b32fa908eaa25ec0"} Dec 04 22:01:43.894437 master-0 kubenswrapper[8606]: I1204 22:01:43.894272 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerStarted","Data":"922ec5ad22c0f758ca0c6af6881b85724b616bc9bf1514cfd7b12d47fc0ff553"} Dec 04 22:01:43.901970 master-0 kubenswrapper[8606]: I1204 22:01:43.901632 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerStarted","Data":"446458854f272c65918d3eef29e63c52aea4a45ba36f434208f291e2e3410da7"} Dec 04 22:01:43.901970 master-0 kubenswrapper[8606]: I1204 22:01:43.901883 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerStarted","Data":"8ccadfcf02bee77f4b3f98d491b1ee8f4b7c03cb21fbe9104543e0f3a4c0e10a"} Dec 04 22:01:43.907489 master-0 kubenswrapper[8606]: I1204 22:01:43.907402 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerStarted","Data":"66fa513342b7f47d4f807e0f29b5398451337b70d0aea1cac07ea70f754f3d14"} Dec 04 22:01:43.921837 master-0 kubenswrapper[8606]: I1204 22:01:43.921457 8606 generic.go:334] "Generic (PLEG): container finished" podID="810c363b-a4c7-428d-a2fb-285adc29f477" containerID="b8a9c51a67f38c6ea4afbc1a4b2e8c17d0b815c4b55531281069c19c0fd8cfa9" exitCode=0 Dec 04 22:01:43.922126 master-0 kubenswrapper[8606]: I1204 22:01:43.921928 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerDied","Data":"b8a9c51a67f38c6ea4afbc1a4b2e8c17d0b815c4b55531281069c19c0fd8cfa9"} Dec 04 22:01:43.925061 master-0 kubenswrapper[8606]: I1204 22:01:43.925025 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerStarted","Data":"7031d386f42300ef917c16f433aec3d9b72a6769b546f2943379602f68aa4683"} Dec 04 22:01:44.933135 master-0 kubenswrapper[8606]: I1204 22:01:44.933068 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" event={"ID":"3d178411-75a4-4ff8-9764-6f3e3944eca4","Type":"ContainerStarted","Data":"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62"} Dec 04 22:01:44.944009 master-0 kubenswrapper[8606]: I1204 22:01:44.943943 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-jb4xf_7f091088-2166-4026-9fa6-62bd83407edb/openshift-controller-manager-operator/0.log" Dec 04 22:01:44.944271 master-0 kubenswrapper[8606]: I1204 22:01:44.944158 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerStarted","Data":"3c8faa0cec9898a47039ead85f90eab240ebf83ecd040f53acd3c80c7bec151c"} Dec 04 22:01:44.952618 master-0 kubenswrapper[8606]: I1204 22:01:44.947438 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerStarted","Data":"ba166c0c83b63968d9c53772f494598b095ed5d017e4a288c9e60bcf13979dcd"} Dec 04 22:01:44.952618 master-0 kubenswrapper[8606]: I1204 22:01:44.949229 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"4187b1d7d08b53bf4814dc4dfbd0a6ff2e8881049bae7dd2ea8c02223e861224"} Dec 04 22:01:46.968110 master-0 kubenswrapper[8606]: I1204 22:01:46.968025 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" event={"ID":"3d178411-75a4-4ff8-9764-6f3e3944eca4","Type":"ContainerStarted","Data":"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82"} Dec 04 22:01:50.997550 master-0 kubenswrapper[8606]: I1204 22:01:50.997416 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"5cfb1b530b3c0f2a217c50d608731d8e0deab29bd288261a2ecaa64bebc6a3b1"} Dec 04 22:01:50.998281 master-0 kubenswrapper[8606]: I1204 22:01:50.997810 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:53.437362 master-0 kubenswrapper[8606]: E1204 22:01:53.437222 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 04 22:01:53.438268 master-0 kubenswrapper[8606]: I1204 22:01:53.438219 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 04 22:01:53.597956 master-0 kubenswrapper[8606]: I1204 22:01:53.597857 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:01:53.597956 master-0 kubenswrapper[8606]: I1204 22:01:53.597942 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:01:53.597956 master-0 kubenswrapper[8606]: I1204 22:01:53.597958 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:01:53.598419 master-0 kubenswrapper[8606]: I1204 22:01:53.598107 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:01:54.023585 master-0 kubenswrapper[8606]: I1204 22:01:54.023420 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_9fca9b57-0b34-46d4-9f3a-dbd4acd630f6/installer/0.log" Dec 04 22:01:54.023585 master-0 kubenswrapper[8606]: I1204 22:01:54.023584 8606 generic.go:334] "Generic (PLEG): container finished" podID="9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" containerID="63eed725435ffc8fd80ef462cf5e0dcb22612c336b3cabe2d142c84feedc099e" exitCode=1 Dec 04 22:01:54.023986 master-0 kubenswrapper[8606]: I1204 22:01:54.023637 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6","Type":"ContainerDied","Data":"63eed725435ffc8fd80ef462cf5e0dcb22612c336b3cabe2d142c84feedc099e"} Dec 04 22:01:55.034461 master-0 kubenswrapper[8606]: I1204 22:01:55.034285 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 04 22:01:55.354407 master-0 kubenswrapper[8606]: I1204 22:01:55.354370 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_9fca9b57-0b34-46d4-9f3a-dbd4acd630f6/installer/0.log" Dec 04 22:01:55.354527 master-0 kubenswrapper[8606]: I1204 22:01:55.354456 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:55.543659 master-0 kubenswrapper[8606]: I1204 22:01:55.543491 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kube-api-access\") pod \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " Dec 04 22:01:55.543851 master-0 kubenswrapper[8606]: I1204 22:01:55.543667 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kubelet-dir\") pod \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " Dec 04 22:01:55.543851 master-0 kubenswrapper[8606]: I1204 22:01:55.543718 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-var-lock\") pod \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\" (UID: \"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6\") " Dec 04 22:01:55.543992 master-0 kubenswrapper[8606]: I1204 22:01:55.543856 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" (UID: "9fca9b57-0b34-46d4-9f3a-dbd4acd630f6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:55.544196 master-0 kubenswrapper[8606]: I1204 22:01:55.544091 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-var-lock" (OuterVolumeSpecName: "var-lock") pod "9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" (UID: "9fca9b57-0b34-46d4-9f3a-dbd4acd630f6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:55.544385 master-0 kubenswrapper[8606]: I1204 22:01:55.544342 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:55.548068 master-0 kubenswrapper[8606]: I1204 22:01:55.548001 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" (UID: "9fca9b57-0b34-46d4-9f3a-dbd4acd630f6"). InnerVolumeSpecName "kube-api-access". 
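The entries above trace the kubelet's volume teardown for the finished installer-1-master-0 pod: operationExecutor.UnmountVolume started, then UnmountVolume.TearDown succeeded, then Volume detached, once for each of kube-api-access, kubelet-dir and var-lock. The same three-step pattern repeats later for the other installer pods and for the etcd static pod. As a reading aid only, here is a minimal Go sketch (the program name and the idea of feeding saved journal text on stdin are assumptions made here, not anything shipped with the cluster) that pulls those three phases out of a journal capture and tags them with the pod UID; the regular expressions match only the message shapes visible in this log.

// volumeteardown.go: illustrative sketch, not cluster tooling. Reads kubelet
// journal text on stdin and prints each volume teardown step it recognizes.
package main

import (
    "bufio"
    "fmt"
    "os"
    "regexp"
)

func main() {
    // The three teardown phases seen above; quotes inside the messages may or
    // may not be backslash-escaped depending on how the journal was captured.
    step := regexp.MustCompile(`(operationExecutor\.UnmountVolume started|UnmountVolume\.TearDown succeeded|Volume detached) for volume \\?"([^"\\]+)\\?"`)
    uid := regexp.MustCompile(`\(UID: \\?"([0-9a-f-]+)\\?"\)`)
    sc := bufio.NewScanner(os.Stdin)
    sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // some records are very long
    for sc.Scan() {
        line := sc.Text()
        for _, m := range step.FindAllStringSubmatch(line, -1) {
            podUID := "-" // e.g. "Volume detached" records carry no (UID: ...) field
            if u := uid.FindStringSubmatch(line); u != nil {
                podUID = u[1]
            }
            fmt.Printf("%-36s  %-50s  %s\n", podUID, m[2], m[1])
        }
    }
}

Usage would be something like go run volumeteardown.go < kubelet-journal.txt (the capture file name is hypothetical); the per-line UID lookup assumes one journal record per line, which is how journald normally emits them.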
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:55.645902 master-0 kubenswrapper[8606]: I1204 22:01:55.645817 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:55.645902 master-0 kubenswrapper[8606]: I1204 22:01:55.645887 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:56.041250 master-0 kubenswrapper[8606]: I1204 22:01:56.041152 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerStarted","Data":"6d31ad2a1f5237b4355ed2f39e4e13656076c4a85f80a08d5d712a1a6ab75238"} Dec 04 22:01:56.043602 master-0 kubenswrapper[8606]: I1204 22:01:56.043476 8606 generic.go:334] "Generic (PLEG): container finished" podID="0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" containerID="4c4fa6995a939a53e102917b86fbd0f10791e85887df9e375f44a27329f6b171" exitCode=0 Dec 04 22:01:56.043735 master-0 kubenswrapper[8606]: I1204 22:01:56.043637 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab","Type":"ContainerDied","Data":"4c4fa6995a939a53e102917b86fbd0f10791e85887df9e375f44a27329f6b171"} Dec 04 22:01:56.045807 master-0 kubenswrapper[8606]: I1204 22:01:56.045754 8606 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c" exitCode=0 Dec 04 22:01:56.045975 master-0 kubenswrapper[8606]: I1204 22:01:56.045809 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerDied","Data":"e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c"} Dec 04 22:01:56.045975 master-0 kubenswrapper[8606]: I1204 22:01:56.045853 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"654092bb21c6a2cc28da4ebfdbe186ba874eca77a06a9c3e6b7af5ba0743168d"} Dec 04 22:01:56.048602 master-0 kubenswrapper[8606]: I1204 22:01:56.048488 8606 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b" exitCode=1 Dec 04 22:01:56.048602 master-0 kubenswrapper[8606]: I1204 22:01:56.048551 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b"} Dec 04 22:01:56.048838 master-0 kubenswrapper[8606]: I1204 22:01:56.048608 8606 scope.go:117] "RemoveContainer" containerID="183ef4cdec698bdfc64b154e7dcc6b79ea7ba6ed603e295f5859a3a5c552d57d" Dec 04 22:01:56.049694 master-0 kubenswrapper[8606]: I1204 22:01:56.049640 8606 scope.go:117] "RemoveContainer" containerID="467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b" Dec 04 22:01:56.050930 master-0 kubenswrapper[8606]: I1204 22:01:56.050864 8606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_9fca9b57-0b34-46d4-9f3a-dbd4acd630f6/installer/0.log" Dec 04 22:01:56.051079 master-0 kubenswrapper[8606]: I1204 22:01:56.050982 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"9fca9b57-0b34-46d4-9f3a-dbd4acd630f6","Type":"ContainerDied","Data":"32c3997205f44d7d981e61cc8c878310736400e3977f5531db32f6189ad28d9e"} Dec 04 22:01:56.051079 master-0 kubenswrapper[8606]: I1204 22:01:56.051010 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 04 22:01:56.143265 master-0 kubenswrapper[8606]: I1204 22:01:56.143203 8606 scope.go:117] "RemoveContainer" containerID="63eed725435ffc8fd80ef462cf5e0dcb22612c336b3cabe2d142c84feedc099e" Dec 04 22:01:56.596544 master-0 kubenswrapper[8606]: I1204 22:01:56.596322 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:01:56.596544 master-0 kubenswrapper[8606]: I1204 22:01:56.596431 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:01:56.596901 master-0 kubenswrapper[8606]: I1204 22:01:56.596649 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:01:56.596901 master-0 kubenswrapper[8606]: I1204 22:01:56.596786 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:01:56.793953 master-0 kubenswrapper[8606]: E1204 22:01:56.793658 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:01:46Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:01:46Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:01:46Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:01:46Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3e65409fc2b27ad0aaeb500a39e264663d2980821f099b830b551785ce4ce8b\\\"],\\\"sizeBytes\\\":1631758507},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9014f384de5f9a0b7418d5869ad349abb9588d16bd09ed650a163c045315dbff\\\"],\\\"sizeBytes\\\":1232140918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b472823604757237c2d16bd6f6221f4cf562aa3b05942c7f602e1e8b2e55a7c6\\\"],\\\"sizeBytes\\\":983705650},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\"],\\\"sizeBytes\\\":938303566},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:631a3798b749fecc041a99929eb946618df723e15055e805ff752a1a1273481c\\\"],\\\"sizeBytes\\\":870567329},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f1ca78c423f43f89a0411e40393642f64e4f8df9e5f61c25e31047c4cce170f9\\\"],\\\"sizeBytes\\\":857069957},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b12f830c3316aa4dc061c2d00c74126282b3e2bcccc301eab00d57fff3c4c7c\\\"],\\\"sizeBytes\\\":767284906},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb3ec61f9a932a9ad13bdeb44bcf9477a8d5f728151d7f19ed3ef7d4b02b3a82\\\"],\\\"sizeBytes\\\":682371258},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:916566bb9d0143352324233d460ad94697719c11c8c9158e3aea8f475941751f\\\"],\\\"sizeBytes\\\":677523572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5451aa441e5b8d8689c032405d410c8049a849ef2edf77e5b6a5ce2838c6569b\\\"],\\\"sizeBytes\\\":672407260},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9724d2036305cbd729e1f484c5bad89971de977fff8a6723fef1873858dd1123\\\"],\\\"sizeBytes\\\":616108962},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:df606f3b71d4376d1a2108c09f0d3dab455fc30bcb67c60e91590c105e9025bf\\\"],\\\"sizeBytes\\\":583836304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:79f99fd6cce984287932edf0d009660bb488d663081f3d62ec3b23bc8bfbf6c2\\\"],\\\"sizeBytes\\\":576619763},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eddedae7578d79b5a3f748000ae5c00b9f14a04710f9f9ec7b52fc569be5dfb8\\\"],\\\"sizeBytes\\\":552673986},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa24edce3d740f84c40018e94cdbf2bc7375268d13d57c2d664e43a46ccea3fc\\\"],\\\"sizeBytes\\\":543227406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\\\"],\\\"sizeBytes\\\":532719167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cfde59e48cd5dee3721f34d249cb119cc3259fd857965d34f9c7ed83b0c363a1\\\"],\\\"sizeBytes\\\":532402162},{\\\"names\\\":[\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f4d4282cb53325e737ad68abbfcb70687ae04fb50353f4f0ba0ba5703b15009a\\\"],\\\"sizeBytes\\\":512838054},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:8c885ea0b3c5124989f0a9b93eba98eb9fca6bbd0262772d85d90bf713a4d572\\\"],\\\"sizeBytes\\\":512452153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\\\"],\\\"sizeBytes\\\":509437356},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df\\\"],\\\"sizeBytes\\\":507687221},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8375671da86aa527ee7e291d86971b0baa823ffc7663b5a983084456e76c0f59\\\"],\\\"sizeBytes\\\":506741476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:831f30660844091d6154e2674d3a9da6f34271bf8a2c40b56f7416066318742b\\\"],\\\"sizeBytes\\\":505649178},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86af77350cfe6fd69280157e4162aa0147873d9431c641ae4ad3e881ff768a73\\\"],\\\"sizeBytes\\\":505628211},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9\\\"],\\\"sizeBytes\\\":503340749},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da\\\"],\\\"sizeBytes\\\":503011144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8eabac819f289e29d75c7ab172d8124554849a47f0b00770928c3eb19a5a31c4\\\"],\\\"sizeBytes\\\":502436444},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10e57ca7611f79710f05777dc6a8f31c7e04eb09da4d8d793a5acfbf0e4692d7\\\"],\\\"sizeBytes\\\":500943492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\"],\\\"sizeBytes\\\":500848684},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d850bf8d593d3a0c\\\"],\\\"sizeBytes\\\":499798563},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\\\"],\\\"sizeBytes\\\":499705918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:75d996f6147edb88c09fd1a052099de66638590d7d03a735006244bc9e19f898\\\"],\\\"sizeBytes\\\":499082775},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f952cec1e5332b84bdffa249cd426f39087058d6544ddcec650a414c15a9b68\\\"],\\\"sizeBytes\\\":489528665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c416b201d480bddb5a4960ec42f4740761a1335001cf84ba5ae19ad6857771b1\\\"],\\\"sizeBytes\\\":481559117},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3a77aa4d03b89ea284e3467a268e5989a77a2ef63e685eb1d5c5ea5b3922b7a\\\"],\\\"sizeBytes\\\":478917802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb928c13a46d3fb45f4a881892d023a92d610a5430be0ffd916aaf8da8e7d297\\\"],\\\"sizeBytes\\\":478642572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fd3e9f8f00a59bda7483ec7dc8a0ed602f9ca30e3d72b22072dbdf2819da3f61\\\"],\\\"sizeBytes\\\":465144618},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c1edf52f70bf9b1d1457e0c4111bc79cdaa1edd659ddbdb
9d8176eff8b46956\\\"],\\\"sizeBytes\\\":462727837},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8cc27777e72233024fe84ee1faa168aec715a0b24912a3ce70715ddccba328df\\\"],\\\"sizeBytes\\\":461702648},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\\\"],\\\"sizeBytes\\\":459552216},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3ce2cbf1032ad0f24f204db73687002fcf302e86ebde3945801c74351b64576\\\"],\\\"sizeBytes\\\":458169255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7664a2d4cb10e82ed32abbf95799f43fc3d10135d7dd94799730de504a89680a\\\"],\\\"sizeBytes\\\":452589750},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4ecc5bac651ff1942865baee5159582e9602c89b47eeab18400a32abcba8f690\\\"],\\\"sizeBytes\\\":451039520},{\\\"names\\\":[],\\\"sizeBytes\\\":450841337},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1386b0fcb731d843f15fb64532f8b676c927821d69dd3d4503c973c3e2a04216\\\"],\\\"sizeBytes\\\":449978499},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2632d7f05d5a992e91038ded81c715898f3fe803420a9b67a0201e9fd8075213\\\"],\\\"sizeBytes\\\":443291941},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f0aa9cd04713acc5c6fea721bd849e1500da8ae945e0b32000887f34d786e0b\\\"],\\\"sizeBytes\\\":442509555},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e438b814f8e16f00b3fc4b69991af80eee79ae111d2a707f34aa64b2ccbb6eb\\\"],\\\"sizeBytes\\\":437737925},{\\\"names\\\":[],\\\"sizeBytes\\\":433122306},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a3d37aa7a22c68afa963ecfb4b43c52cccf152580cd66e4d5382fb69e4037cc\\\"],\\\"sizeBytes\\\":406053031}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:01:57.059110 master-0 kubenswrapper[8606]: I1204 22:01:57.059029 8606 generic.go:334] "Generic (PLEG): container finished" podID="5e09e2af7200e6f9be469dbfd9bb1127" containerID="d00575d56d81b17e8c0212c4cad634fe2f3afd13660a8de6afbe8a4381dd50d7" exitCode=1 Dec 04 22:01:57.059834 master-0 kubenswrapper[8606]: I1204 22:01:57.059143 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerDied","Data":"d00575d56d81b17e8c0212c4cad634fe2f3afd13660a8de6afbe8a4381dd50d7"} Dec 04 22:01:57.059903 master-0 kubenswrapper[8606]: I1204 22:01:57.059834 8606 scope.go:117] "RemoveContainer" containerID="d00575d56d81b17e8c0212c4cad634fe2f3afd13660a8de6afbe8a4381dd50d7" Dec 04 22:01:57.062221 master-0 kubenswrapper[8606]: I1204 22:01:57.062165 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9"} Dec 04 22:01:57.306772 master-0 kubenswrapper[8606]: I1204 22:01:57.306717 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:01:57.452886 master-0 kubenswrapper[8606]: I1204 22:01:57.452805 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:57.577611 master-0 kubenswrapper[8606]: I1204 22:01:57.577464 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kube-api-access\") pod \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " Dec 04 22:01:57.577611 master-0 kubenswrapper[8606]: I1204 22:01:57.577602 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kubelet-dir\") pod \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " Dec 04 22:01:57.578221 master-0 kubenswrapper[8606]: I1204 22:01:57.577803 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-var-lock\") pod \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\" (UID: \"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab\") " Dec 04 22:01:57.578221 master-0 kubenswrapper[8606]: I1204 22:01:57.578027 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" (UID: "0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:57.578221 master-0 kubenswrapper[8606]: I1204 22:01:57.578150 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-var-lock" (OuterVolumeSpecName: "var-lock") pod "0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" (UID: "0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:01:57.578480 master-0 kubenswrapper[8606]: I1204 22:01:57.578442 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:57.578631 master-0 kubenswrapper[8606]: I1204 22:01:57.578482 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:57.583208 master-0 kubenswrapper[8606]: I1204 22:01:57.583141 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" (UID: "0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:01:57.681232 master-0 kubenswrapper[8606]: I1204 22:01:57.681137 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:01:58.076273 master-0 kubenswrapper[8606]: I1204 22:01:58.076189 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 04 22:01:58.077228 master-0 kubenswrapper[8606]: I1204 22:01:58.076189 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab","Type":"ContainerDied","Data":"93283789117b0576a3fa7e5b0b96c59c1e9ecb75d010ca6d14ffe858c88069c2"} Dec 04 22:01:58.077228 master-0 kubenswrapper[8606]: I1204 22:01:58.076355 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93283789117b0576a3fa7e5b0b96c59c1e9ecb75d010ca6d14ffe858c88069c2" Dec 04 22:01:58.080791 master-0 kubenswrapper[8606]: I1204 22:01:58.079709 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"229f04f01e031c096a8d66a3a3b9f5322d73a495869829416a90812a311e2aee"} Dec 04 22:01:58.468117 master-0 kubenswrapper[8606]: E1204 22:01:58.467915 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Dec 04 22:01:58.482091 master-0 kubenswrapper[8606]: I1204 22:01:58.482035 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:01:59.596525 master-0 kubenswrapper[8606]: I1204 22:01:59.596444 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:01:59.597177 master-0 kubenswrapper[8606]: I1204 22:01:59.596493 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:01:59.597177 master-0 kubenswrapper[8606]: I1204 22:01:59.596603 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:01:59.597177 master-0 kubenswrapper[8606]: I1204 22:01:59.596646 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:01:59.597177 master-0 kubenswrapper[8606]: I1204 22:01:59.596756 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:01:59.597973 master-0 kubenswrapper[8606]: I1204 22:01:59.597912 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" 
containerStatusID={"Type":"cri-o","ID":"5cfb1b530b3c0f2a217c50d608731d8e0deab29bd288261a2ecaa64bebc6a3b1"} pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Dec 04 22:01:59.598064 master-0 kubenswrapper[8606]: I1204 22:01:59.597988 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" containerID="cri-o://5cfb1b530b3c0f2a217c50d608731d8e0deab29bd288261a2ecaa64bebc6a3b1" gracePeriod=30 Dec 04 22:01:59.598064 master-0 kubenswrapper[8606]: I1204 22:01:59.598010 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:01:59.598149 master-0 kubenswrapper[8606]: I1204 22:01:59.598107 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:00.307225 master-0 kubenswrapper[8606]: I1204 22:02:00.307058 8606 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:00.331392 master-0 kubenswrapper[8606]: I1204 22:02:00.331225 8606 patch_prober.go:28] interesting pod/etcd-operator-5bf4d88c6f-flrrb container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" start-of-body= Dec 04 22:02:00.331392 master-0 kubenswrapper[8606]: I1204 22:02:00.331346 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" podUID="ceb419e4-d804-4111-b8d8-8436cc2ee617" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" Dec 04 22:02:01.103886 master-0 kubenswrapper[8606]: I1204 22:02:01.103696 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/0.log" Dec 04 22:02:01.104775 master-0 kubenswrapper[8606]: I1204 22:02:01.104722 8606 generic.go:334] "Generic (PLEG): container finished" podID="810c363b-a4c7-428d-a2fb-285adc29f477" containerID="5cfb1b530b3c0f2a217c50d608731d8e0deab29bd288261a2ecaa64bebc6a3b1" exitCode=255 Dec 04 22:02:01.104863 master-0 kubenswrapper[8606]: I1204 22:02:01.104796 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" 
event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerDied","Data":"5cfb1b530b3c0f2a217c50d608731d8e0deab29bd288261a2ecaa64bebc6a3b1"} Dec 04 22:02:01.503693 master-0 kubenswrapper[8606]: I1204 22:02:01.503584 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:02:02.116408 master-0 kubenswrapper[8606]: I1204 22:02:02.116311 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4b9fbd90-66d5-4637-9821-22242aa6f6d7/installer/0.log" Dec 04 22:02:02.116408 master-0 kubenswrapper[8606]: I1204 22:02:02.116405 8606 generic.go:334] "Generic (PLEG): container finished" podID="4b9fbd90-66d5-4637-9821-22242aa6f6d7" containerID="05ebda65d53028c7345257866dac633a27c8894eb475430d761e1c0a053ea020" exitCode=1 Dec 04 22:02:02.117404 master-0 kubenswrapper[8606]: I1204 22:02:02.116539 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4b9fbd90-66d5-4637-9821-22242aa6f6d7","Type":"ContainerDied","Data":"05ebda65d53028c7345257866dac633a27c8894eb475430d761e1c0a053ea020"} Dec 04 22:02:02.119802 master-0 kubenswrapper[8606]: I1204 22:02:02.119739 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/0.log" Dec 04 22:02:02.120379 master-0 kubenswrapper[8606]: I1204 22:02:02.120317 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e"} Dec 04 22:02:02.121622 master-0 kubenswrapper[8606]: I1204 22:02:02.121545 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:02:03.513235 master-0 kubenswrapper[8606]: I1204 22:02:03.513160 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4b9fbd90-66d5-4637-9821-22242aa6f6d7/installer/0.log" Dec 04 22:02:03.514035 master-0 kubenswrapper[8606]: I1204 22:02:03.513297 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:02:03.690776 master-0 kubenswrapper[8606]: I1204 22:02:03.690633 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kube-api-access\") pod \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " Dec 04 22:02:03.690776 master-0 kubenswrapper[8606]: I1204 22:02:03.690799 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kubelet-dir\") pod \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " Dec 04 22:02:03.691219 master-0 kubenswrapper[8606]: I1204 22:02:03.690935 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4b9fbd90-66d5-4637-9821-22242aa6f6d7" (UID: "4b9fbd90-66d5-4637-9821-22242aa6f6d7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:02:03.691219 master-0 kubenswrapper[8606]: I1204 22:02:03.691075 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-var-lock\") pod \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\" (UID: \"4b9fbd90-66d5-4637-9821-22242aa6f6d7\") " Dec 04 22:02:03.691219 master-0 kubenswrapper[8606]: I1204 22:02:03.691120 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-var-lock" (OuterVolumeSpecName: "var-lock") pod "4b9fbd90-66d5-4637-9821-22242aa6f6d7" (UID: "4b9fbd90-66d5-4637-9821-22242aa6f6d7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:02:03.692091 master-0 kubenswrapper[8606]: I1204 22:02:03.692025 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:02:03.692091 master-0 kubenswrapper[8606]: I1204 22:02:03.692070 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:02:03.697146 master-0 kubenswrapper[8606]: I1204 22:02:03.697039 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4b9fbd90-66d5-4637-9821-22242aa6f6d7" (UID: "4b9fbd90-66d5-4637-9821-22242aa6f6d7"). InnerVolumeSpecName "kube-api-access". 
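Above, the openshift-config-operator container fails its Liveness and Readiness probes repeatedly (connection refused on 10.128.0.53:8443), the kubelet records "Container openshift-config-operator failed liveness probe, will be restarted", kills it with a 30 second grace period, and the replacement container promptly fails the same probes again. A quick way to spot pods stuck in such a loop is to count "Probe failed" records per pod and probe type. The following is an illustrative sketch only (the program name is invented, and it assumes journal text on stdin), not an official tool:

// probefailures.go: illustrative sketch that tallies kubelet "Probe failed"
// records per probe type and pod from journal text on stdin.
package main

import (
    "bufio"
    "fmt"
    "os"
    "regexp"
    "sort"
)

func main() {
    // Matches the prober records seen above, e.g.
    //   "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" ...
    re := regexp.MustCompile(`"Probe failed" probeType="(\w+)" pod="([^"]+)"`)
    counts := map[string]int{}
    sc := bufio.NewScanner(os.Stdin)
    sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal records can be long
    for sc.Scan() {
        for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
            counts[m[1]+" "+m[2]]++ // key: "<probeType> <namespace/pod>"
        }
    }
    keys := make([]string, 0, len(counts))
    for k := range counts {
        keys = append(keys, k)
    }
    sort.Strings(keys)
    for _, k := range keys {
        fmt.Printf("%6d  %s\n", counts[k], k)
    }
}

A high count for a single pod, as openshift-config-operator-68758cbcdb-fg6vx would show here, is the signature of a liveness-driven restart loop rather than a one-off probe blip.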
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:02:03.794496 master-0 kubenswrapper[8606]: I1204 22:02:03.794383 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b9fbd90-66d5-4637-9821-22242aa6f6d7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:02:04.141931 master-0 kubenswrapper[8606]: I1204 22:02:04.141670 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4b9fbd90-66d5-4637-9821-22242aa6f6d7/installer/0.log" Dec 04 22:02:04.141931 master-0 kubenswrapper[8606]: I1204 22:02:04.141846 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4b9fbd90-66d5-4637-9821-22242aa6f6d7","Type":"ContainerDied","Data":"5aea2f5066e056e4a369b8871e8461c5b3fa8918d7cce8402f38ffd4c90c32d6"} Dec 04 22:02:04.141931 master-0 kubenswrapper[8606]: I1204 22:02:04.141927 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aea2f5066e056e4a369b8871e8461c5b3fa8918d7cce8402f38ffd4c90c32d6" Dec 04 22:02:04.141931 master-0 kubenswrapper[8606]: I1204 22:02:04.141933 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:02:05.597750 master-0 kubenswrapper[8606]: I1204 22:02:05.597115 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:05.598674 master-0 kubenswrapper[8606]: I1204 22:02:05.597773 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:05.598674 master-0 kubenswrapper[8606]: I1204 22:02:05.597174 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:05.598674 master-0 kubenswrapper[8606]: I1204 22:02:05.597939 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:06.795250 master-0 kubenswrapper[8606]: E1204 22:02:06.795153 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:08.470867 master-0 kubenswrapper[8606]: E1204 22:02:08.469189 8606 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:08.596814 master-0 kubenswrapper[8606]: I1204 22:02:08.596723 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:08.597159 master-0 kubenswrapper[8606]: I1204 22:02:08.596830 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:08.597159 master-0 kubenswrapper[8606]: I1204 22:02:08.596720 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:08.597159 master-0 kubenswrapper[8606]: I1204 22:02:08.596937 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:09.055656 master-0 kubenswrapper[8606]: E1204 22:02:09.055549 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 04 22:02:09.185569 master-0 kubenswrapper[8606]: I1204 22:02:09.185440 8606 generic.go:334] "Generic (PLEG): container finished" podID="cc0396a9a2689b3e8c132c12640cbe83" containerID="b4b557e71fac173d7ebddbf04536e46989f934644030fceea9234231919b8e8f" exitCode=0 Dec 04 22:02:09.700011 master-0 kubenswrapper[8606]: I1204 22:02:09.699852 8606 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-bm2pk container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Dec 04 22:02:09.700011 master-0 kubenswrapper[8606]: I1204 22:02:09.699955 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" podUID="f893663c-7c1e-4eda-9839-99c1c0440304" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Dec 04 22:02:10.197482 master-0 kubenswrapper[8606]: I1204 22:02:10.197403 8606 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d" exitCode=0 Dec 04 22:02:10.197831 master-0 kubenswrapper[8606]: I1204 22:02:10.197537 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerDied","Data":"38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d"} Dec 04 22:02:10.307597 master-0 kubenswrapper[8606]: I1204 22:02:10.307484 8606 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:11.207289 master-0 kubenswrapper[8606]: I1204 22:02:11.207220 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_cc0396a9a2689b3e8c132c12640cbe83/etcdctl/0.log" Dec 04 22:02:11.208000 master-0 kubenswrapper[8606]: I1204 22:02:11.207304 8606 generic.go:334] "Generic (PLEG): container finished" podID="cc0396a9a2689b3e8c132c12640cbe83" containerID="75e4e520a75639c893eb6ea15b07a3187aaf4dfc898564bd4832b04c7d30a431" exitCode=137 Dec 04 22:02:11.596587 master-0 kubenswrapper[8606]: I1204 22:02:11.596455 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:11.596587 master-0 kubenswrapper[8606]: I1204 22:02:11.596483 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:11.596587 master-0 kubenswrapper[8606]: I1204 22:02:11.596575 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:11.597109 master-0 kubenswrapper[8606]: I1204 22:02:11.596616 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:11.597109 master-0 kubenswrapper[8606]: I1204 22:02:11.596702 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:02:11.597632 master-0 kubenswrapper[8606]: I1204 22:02:11.597568 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:11.597632 master-0 kubenswrapper[8606]: I1204 22:02:11.597610 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" 
podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:11.597923 master-0 kubenswrapper[8606]: I1204 22:02:11.597854 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e"} pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Dec 04 22:02:11.598018 master-0 kubenswrapper[8606]: I1204 22:02:11.597941 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" containerID="cri-o://7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e" gracePeriod=30 Dec 04 22:02:12.825329 master-0 kubenswrapper[8606]: I1204 22:02:12.825254 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_cc0396a9a2689b3e8c132c12640cbe83/etcdctl/0.log" Dec 04 22:02:12.825855 master-0 kubenswrapper[8606]: I1204 22:02:12.825417 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:02:12.941298 master-0 kubenswrapper[8606]: I1204 22:02:12.941173 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"cc0396a9a2689b3e8c132c12640cbe83\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " Dec 04 22:02:12.941715 master-0 kubenswrapper[8606]: I1204 22:02:12.941358 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"cc0396a9a2689b3e8c132c12640cbe83\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " Dec 04 22:02:12.941715 master-0 kubenswrapper[8606]: I1204 22:02:12.941462 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir" (OuterVolumeSpecName: "data-dir") pod "cc0396a9a2689b3e8c132c12640cbe83" (UID: "cc0396a9a2689b3e8c132c12640cbe83"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:02:12.941715 master-0 kubenswrapper[8606]: I1204 22:02:12.941555 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs" (OuterVolumeSpecName: "certs") pod "cc0396a9a2689b3e8c132c12640cbe83" (UID: "cc0396a9a2689b3e8c132c12640cbe83"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:02:12.942386 master-0 kubenswrapper[8606]: I1204 22:02:12.942331 8606 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:02:12.942458 master-0 kubenswrapper[8606]: I1204 22:02:12.942383 8606 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:02:13.222138 master-0 kubenswrapper[8606]: I1204 22:02:13.222034 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/1.log" Dec 04 22:02:13.222937 master-0 kubenswrapper[8606]: I1204 22:02:13.222876 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/0.log" Dec 04 22:02:13.223477 master-0 kubenswrapper[8606]: I1204 22:02:13.223413 8606 generic.go:334] "Generic (PLEG): container finished" podID="810c363b-a4c7-428d-a2fb-285adc29f477" containerID="7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e" exitCode=255 Dec 04 22:02:13.223625 master-0 kubenswrapper[8606]: I1204 22:02:13.223519 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerDied","Data":"7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e"} Dec 04 22:02:13.223625 master-0 kubenswrapper[8606]: I1204 22:02:13.223597 8606 scope.go:117] "RemoveContainer" containerID="5cfb1b530b3c0f2a217c50d608731d8e0deab29bd288261a2ecaa64bebc6a3b1" Dec 04 22:02:13.226950 master-0 kubenswrapper[8606]: I1204 22:02:13.226910 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_cc0396a9a2689b3e8c132c12640cbe83/etcdctl/0.log" Dec 04 22:02:13.227127 master-0 kubenswrapper[8606]: I1204 22:02:13.227066 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:02:13.403985 master-0 kubenswrapper[8606]: I1204 22:02:13.403803 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0396a9a2689b3e8c132c12640cbe83" path="/var/lib/kubelet/pods/cc0396a9a2689b3e8c132c12640cbe83/volumes" Dec 04 22:02:13.404422 master-0 kubenswrapper[8606]: I1204 22:02:13.404376 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 04 22:02:13.971939 master-0 kubenswrapper[8606]: I1204 22:02:13.971886 8606 scope.go:117] "RemoveContainer" containerID="b4b557e71fac173d7ebddbf04536e46989f934644030fceea9234231919b8e8f" Dec 04 22:02:14.049861 master-0 kubenswrapper[8606]: I1204 22:02:14.012906 8606 scope.go:117] "RemoveContainer" containerID="75e4e520a75639c893eb6ea15b07a3187aaf4dfc898564bd4832b04c7d30a431" Dec 04 22:02:14.236941 master-0 kubenswrapper[8606]: I1204 22:02:14.236741 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/1.log" Dec 04 22:02:14.397706 master-0 kubenswrapper[8606]: E1204 22:02:14.397428 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e223198c7dbc4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:01:40.387789764 +0000 UTC m=+65.198091989,LastTimestamp:2025-12-04 22:01:40.387789764 +0000 UTC m=+65.198091989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:02:14.597029 master-0 kubenswrapper[8606]: I1204 22:02:14.596894 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:14.597376 master-0 kubenswrapper[8606]: I1204 22:02:14.597033 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:16.254930 master-0 kubenswrapper[8606]: I1204 22:02:16.254844 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/1.log" Dec 04 22:02:16.795998 master-0 kubenswrapper[8606]: E1204 22:02:16.795883 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 
22:02:17.265309 master-0 kubenswrapper[8606]: I1204 22:02:17.265272 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-8lq7w_871cb002-67f4-43aa-a41d-7a5b2f340059/network-operator/0.log" Dec 04 22:02:17.266049 master-0 kubenswrapper[8606]: I1204 22:02:17.266020 8606 generic.go:334] "Generic (PLEG): container finished" podID="871cb002-67f4-43aa-a41d-7a5b2f340059" containerID="9d7fd4b64c7f9d10b43359385a6360e49aa71c5085c781ef53642cd82a85d004" exitCode=255 Dec 04 22:02:17.268324 master-0 kubenswrapper[8606]: I1204 22:02:17.268301 8606 generic.go:334] "Generic (PLEG): container finished" podID="e065179e-634a-4cbe-bb59-5b01c514e4de" containerID="aec30a53010adc6ee6176e40e860c2639cbdf974b27b2d24e1d71f75f8a5c427" exitCode=0 Dec 04 22:02:18.279843 master-0 kubenswrapper[8606]: I1204 22:02:18.279767 8606 generic.go:334] "Generic (PLEG): container finished" podID="56f25fad-089d-4df6-abb1-10d4c76750f1" containerID="d8a2de466dc95e948ba536210f040992057ba7bc222a8102fb88249ab34f040a" exitCode=0 Dec 04 22:02:18.472619 master-0 kubenswrapper[8606]: E1204 22:02:18.470083 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:19.700758 master-0 kubenswrapper[8606]: I1204 22:02:19.699505 8606 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-bm2pk container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Dec 04 22:02:19.700758 master-0 kubenswrapper[8606]: I1204 22:02:19.699690 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" podUID="f893663c-7c1e-4eda-9839-99c1c0440304" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Dec 04 22:02:20.306909 master-0 kubenswrapper[8606]: I1204 22:02:20.306701 8606 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:20.596470 master-0 kubenswrapper[8606]: I1204 22:02:20.596231 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:20.596470 master-0 kubenswrapper[8606]: I1204 22:02:20.596377 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:20.596470 master-0 kubenswrapper[8606]: I1204 22:02:20.596240 8606 
patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:20.596922 master-0 kubenswrapper[8606]: I1204 22:02:20.596564 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:23.206857 master-0 kubenswrapper[8606]: E1204 22:02:23.206740 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 04 22:02:23.596246 master-0 kubenswrapper[8606]: I1204 22:02:23.596179 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:23.596481 master-0 kubenswrapper[8606]: I1204 22:02:23.596254 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:23.596481 master-0 kubenswrapper[8606]: I1204 22:02:23.596282 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:23.596481 master-0 kubenswrapper[8606]: I1204 22:02:23.596352 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:25.424973 master-0 kubenswrapper[8606]: I1204 22:02:25.424910 8606 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd" exitCode=0 Dec 04 22:02:26.596484 master-0 kubenswrapper[8606]: I1204 22:02:26.596420 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:26.596484 master-0 kubenswrapper[8606]: I1204 22:02:26.596482 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:26.597279 master-0 kubenswrapper[8606]: I1204 22:02:26.596541 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:26.597279 master-0 kubenswrapper[8606]: I1204 22:02:26.596647 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:26.797202 master-0 kubenswrapper[8606]: E1204 22:02:26.797139 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:27.448581 master-0 kubenswrapper[8606]: I1204 22:02:27.448467 8606 generic.go:334] "Generic (PLEG): container finished" podID="ceb419e4-d804-4111-b8d8-8436cc2ee617" containerID="466a053aebc195d2f55d104f73cf9c35f09469c457c1576c051e6861f31f8a13" exitCode=0 Dec 04 22:02:27.451375 master-0 kubenswrapper[8606]: I1204 22:02:27.451327 8606 generic.go:334] "Generic (PLEG): container finished" podID="690b447a-19c0-4925-bc9d-d0c86a83a377" containerID="f701b6e27b366f9b3e2d799e563c87e892e7b625684a50d11abda6232179d479" exitCode=0 Dec 04 22:02:28.471289 master-0 kubenswrapper[8606]: E1204 22:02:28.471183 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:29.596314 master-0 kubenswrapper[8606]: I1204 22:02:29.596240 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:29.597647 master-0 kubenswrapper[8606]: I1204 22:02:29.596345 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:29.699721 master-0 kubenswrapper[8606]: I1204 22:02:29.699651 8606 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-bm2pk container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Dec 04 22:02:29.700010 master-0 kubenswrapper[8606]: I1204 22:02:29.699749 8606 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" podUID="f893663c-7c1e-4eda-9839-99c1c0440304" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Dec 04 22:02:32.597353 master-0 kubenswrapper[8606]: I1204 22:02:32.597266 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:32.598327 master-0 kubenswrapper[8606]: I1204 22:02:32.597401 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:35.596356 master-0 kubenswrapper[8606]: I1204 22:02:35.596280 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:35.597150 master-0 kubenswrapper[8606]: I1204 22:02:35.596369 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:36.798614 master-0 kubenswrapper[8606]: E1204 22:02:36.798490 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:36.798614 master-0 kubenswrapper[8606]: E1204 22:02:36.798598 8606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 22:02:38.471757 master-0 kubenswrapper[8606]: E1204 22:02:38.471596 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:38.471757 master-0 kubenswrapper[8606]: I1204 22:02:38.471734 8606 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 04 22:02:38.529674 master-0 kubenswrapper[8606]: I1204 22:02:38.529569 8606 generic.go:334] "Generic (PLEG): container finished" podID="f893663c-7c1e-4eda-9839-99c1c0440304" containerID="49e5b6467d42b24a4142a36b3091700faf9ab3af4e0dd62b2e3ca1fd3da47a30" exitCode=0 Dec 04 22:02:38.532019 master-0 kubenswrapper[8606]: I1204 22:02:38.531967 8606 generic.go:334] "Generic (PLEG): container finished" podID="24648a41-875f-4e98-8b21-3bdd38dffa32" containerID="cb9981e4dfed9821dbae6b8b7a8e8e8f099f873bacacc6149961ccf58995e524" exitCode=0 Dec 04 22:02:38.596586 master-0 
kubenswrapper[8606]: I1204 22:02:38.596442 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:38.596586 master-0 kubenswrapper[8606]: I1204 22:02:38.596538 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:41.556979 master-0 kubenswrapper[8606]: I1204 22:02:41.556860 8606 generic.go:334] "Generic (PLEG): container finished" podID="46229484-5fa1-4595-94a0-44477abae90e" containerID="c77537fc4f2900520f8e93c8fc7a9508c178081936170d16a0dcd4122f2c7777" exitCode=0 Dec 04 22:02:41.598816 master-0 kubenswrapper[8606]: I1204 22:02:41.598695 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:41.598816 master-0 kubenswrapper[8606]: I1204 22:02:41.598802 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:43.353147 master-0 kubenswrapper[8606]: I1204 22:02:43.353057 8606 status_manager.go:851] "Failed to get status for pod" podUID="b977a2cf-4e95-4456-957d-b1ba05c0d1ff" pod="openshift-kube-scheduler/installer-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-master-0)" Dec 04 22:02:43.576985 master-0 kubenswrapper[8606]: I1204 22:02:43.576895 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-nk92d_634c1df6-de4d-4e26-8c71-d39311cae0ce/approver/0.log" Dec 04 22:02:43.578205 master-0 kubenswrapper[8606]: I1204 22:02:43.578110 8606 generic.go:334] "Generic (PLEG): container finished" podID="634c1df6-de4d-4e26-8c71-d39311cae0ce" containerID="a679264390b031ae4f297359e8c908ad01e2a92651d2cb70742a5a02fd398618" exitCode=1 Dec 04 22:02:44.015966 master-0 kubenswrapper[8606]: E1204 22:02:44.015907 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:02:44.015966 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-sdrkm_openshift-marketplace_ae107ad4-104c-4264-9844-afb3af28b19e_0(8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f): error adding pod openshift-marketplace_redhat-marketplace-sdrkm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f" Netns:"/var/run/netns/904ef221-837b-4514-83a8-cf449849e163" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-sdrkm;K8S_POD_INFRA_CONTAINER_ID=8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f;K8S_POD_UID=ae107ad4-104c-4264-9844-afb3af28b19e" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-sdrkm] networking: Multus: [openshift-marketplace/redhat-marketplace-sdrkm/ae107ad4-104c-4264-9844-afb3af28b19e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-sdrkm?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.015966 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.015966 master-0 kubenswrapper[8606]: > Dec 04 22:02:44.016132 master-0 kubenswrapper[8606]: E1204 22:02:44.016000 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:02:44.016132 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-sdrkm_openshift-marketplace_ae107ad4-104c-4264-9844-afb3af28b19e_0(8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f): error adding pod openshift-marketplace_redhat-marketplace-sdrkm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f" Netns:"/var/run/netns/904ef221-837b-4514-83a8-cf449849e163" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-sdrkm;K8S_POD_INFRA_CONTAINER_ID=8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f;K8S_POD_UID=ae107ad4-104c-4264-9844-afb3af28b19e" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-sdrkm] networking: Multus: [openshift-marketplace/redhat-marketplace-sdrkm/ae107ad4-104c-4264-9844-afb3af28b19e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-sdrkm?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.016132 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 
22:02:44.016132 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:02:44.016132 master-0 kubenswrapper[8606]: E1204 22:02:44.016025 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:02:44.016132 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-sdrkm_openshift-marketplace_ae107ad4-104c-4264-9844-afb3af28b19e_0(8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f): error adding pod openshift-marketplace_redhat-marketplace-sdrkm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f" Netns:"/var/run/netns/904ef221-837b-4514-83a8-cf449849e163" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-sdrkm;K8S_POD_INFRA_CONTAINER_ID=8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f;K8S_POD_UID=ae107ad4-104c-4264-9844-afb3af28b19e" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-sdrkm] networking: Multus: [openshift-marketplace/redhat-marketplace-sdrkm/ae107ad4-104c-4264-9844-afb3af28b19e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-sdrkm?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.016132 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.016132 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:02:44.016333 master-0 kubenswrapper[8606]: E1204 22:02:44.016107 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-sdrkm_openshift-marketplace(ae107ad4-104c-4264-9844-afb3af28b19e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-sdrkm_openshift-marketplace(ae107ad4-104c-4264-9844-afb3af28b19e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-sdrkm_openshift-marketplace_ae107ad4-104c-4264-9844-afb3af28b19e_0(8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f): error adding pod openshift-marketplace_redhat-marketplace-sdrkm to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f\\\" Netns:\\\"/var/run/netns/904ef221-837b-4514-83a8-cf449849e163\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-sdrkm;K8S_POD_INFRA_CONTAINER_ID=8b0bc241f94128265b5fd87623bf65ee0569263036f3fd25c06e19eff4d3182f;K8S_POD_UID=ae107ad4-104c-4264-9844-afb3af28b19e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-sdrkm] networking: Multus: [openshift-marketplace/redhat-marketplace-sdrkm/ae107ad4-104c-4264-9844-afb3af28b19e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-sdrkm?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-marketplace-sdrkm" podUID="ae107ad4-104c-4264-9844-afb3af28b19e" Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: E1204 22:02:44.030186 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager_c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d_0(3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb): error adding pod openshift-operator-lifecycle-manager_packageserver-7b4bc6c685-l6dfn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb" Netns:"/var/run/netns/4029a272-3735-4a5f-b24e-9992dc0328c8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-7b4bc6c685-l6dfn;K8S_POD_INFRA_CONTAINER_ID=3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb;K8S_POD_UID=c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-7b4bc6c685-l6dfn?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: > Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: E1204 22:02:44.030296 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager_c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d_0(3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb): error adding pod openshift-operator-lifecycle-manager_packageserver-7b4bc6c685-l6dfn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb" Netns:"/var/run/netns/4029a272-3735-4a5f-b24e-9992dc0328c8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-7b4bc6c685-l6dfn;K8S_POD_INFRA_CONTAINER_ID=3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb;K8S_POD_UID=c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-7b4bc6c685-l6dfn?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: > pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: E1204 22:02:44.030338 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager_c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d_0(3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb): error adding pod openshift-operator-lifecycle-manager_packageserver-7b4bc6c685-l6dfn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb" Netns:"/var/run/netns/4029a272-3735-4a5f-b24e-9992dc0328c8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-7b4bc6c685-l6dfn;K8S_POD_INFRA_CONTAINER_ID=3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb;K8S_POD_UID=c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-7b4bc6c685-l6dfn?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: > pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:02:44.030922 master-0 kubenswrapper[8606]: E1204 22:02:44.030440 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager(c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager(c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager_c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d_0(3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb): error adding pod openshift-operator-lifecycle-manager_packageserver-7b4bc6c685-l6dfn to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb\\\" Netns:\\\"/var/run/netns/4029a272-3735-4a5f-b24e-9992dc0328c8\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-7b4bc6c685-l6dfn;K8S_POD_INFRA_CONTAINER_ID=3df2aa6e651e4ca514cbdb8a8d59c563db44169c4caf343b2c114a7e26c2beeb;K8S_POD_UID=c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: SetNetworkStatus: failed to update the 
pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-7b4bc6c685-l6dfn?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" Dec 04 22:02:44.190283 master-0 kubenswrapper[8606]: E1204 22:02:44.190206 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:02:44.190283 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-vvkjf_openshift-marketplace_2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae_0(c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c): error adding pod openshift-marketplace_community-operators-vvkjf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c" Netns:"/var/run/netns/47a4725b-92c9-45f4-9b2e-312853af77a3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-vvkjf;K8S_POD_INFRA_CONTAINER_ID=c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c;K8S_POD_UID=2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-vvkjf] networking: Multus: [openshift-marketplace/community-operators-vvkjf/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-vvkjf in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-vvkjf in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vvkjf?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.190283 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.190283 master-0 kubenswrapper[8606]: > Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: E1204 22:02:44.190296 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_community-operators-vvkjf_openshift-marketplace_2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae_0(c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c): error adding pod openshift-marketplace_community-operators-vvkjf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c" Netns:"/var/run/netns/47a4725b-92c9-45f4-9b2e-312853af77a3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-vvkjf;K8S_POD_INFRA_CONTAINER_ID=c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c;K8S_POD_UID=2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-vvkjf] networking: Multus: [openshift-marketplace/community-operators-vvkjf/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-vvkjf in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-vvkjf in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vvkjf?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: E1204 22:02:44.190325 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-vvkjf_openshift-marketplace_2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae_0(c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c): error adding pod openshift-marketplace_community-operators-vvkjf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c" Netns:"/var/run/netns/47a4725b-92c9-45f4-9b2e-312853af77a3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-vvkjf;K8S_POD_INFRA_CONTAINER_ID=c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c;K8S_POD_UID=2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-vvkjf] networking: Multus: [openshift-marketplace/community-operators-vvkjf/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-vvkjf in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-vvkjf in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vvkjf?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:02:44.190449 master-0 kubenswrapper[8606]: E1204 22:02:44.190391 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-vvkjf_openshift-marketplace(2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-vvkjf_openshift-marketplace(2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-vvkjf_openshift-marketplace_2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae_0(c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c): error adding pod openshift-marketplace_community-operators-vvkjf to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c\\\" Netns:\\\"/var/run/netns/47a4725b-92c9-45f4-9b2e-312853af77a3\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-vvkjf;K8S_POD_INFRA_CONTAINER_ID=c93de3c88efcc9fa164fdc0ce8d37130cbc01edcbd7381d4fb1663325518de3c;K8S_POD_UID=2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/community-operators-vvkjf] networking: Multus: [openshift-marketplace/community-operators-vvkjf/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-vvkjf in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-vvkjf in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vvkjf?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/community-operators-vvkjf" podUID="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: E1204 22:02:44.295993 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = 
failed to create pod network sandbox k8s_certified-operators-sw6sx_openshift-marketplace_29828f55-427b-4fe3-8713-03bcd6ac9dec_0(075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718): error adding pod openshift-marketplace_certified-operators-sw6sx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718" Netns:"/var/run/netns/d03284b2-0049-44ae-bf66-5d37b6183671" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-sw6sx;K8S_POD_INFRA_CONTAINER_ID=075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718;K8S_POD_UID=29828f55-427b-4fe3-8713-03bcd6ac9dec" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-sw6sx] networking: Multus: [openshift-marketplace/certified-operators-sw6sx/29828f55-427b-4fe3-8713-03bcd6ac9dec]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-sw6sx in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-sw6sx in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sw6sx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: > Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: E1204 22:02:44.296098 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-sw6sx_openshift-marketplace_29828f55-427b-4fe3-8713-03bcd6ac9dec_0(075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718): error adding pod openshift-marketplace_certified-operators-sw6sx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718" Netns:"/var/run/netns/d03284b2-0049-44ae-bf66-5d37b6183671" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-sw6sx;K8S_POD_INFRA_CONTAINER_ID=075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718;K8S_POD_UID=29828f55-427b-4fe3-8713-03bcd6ac9dec" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-sw6sx] networking: Multus: [openshift-marketplace/certified-operators-sw6sx/29828f55-427b-4fe3-8713-03bcd6ac9dec]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-sw6sx in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-sw6sx in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sw6sx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: E1204 22:02:44.296131 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-sw6sx_openshift-marketplace_29828f55-427b-4fe3-8713-03bcd6ac9dec_0(075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718): error adding pod openshift-marketplace_certified-operators-sw6sx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718" Netns:"/var/run/netns/d03284b2-0049-44ae-bf66-5d37b6183671" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-sw6sx;K8S_POD_INFRA_CONTAINER_ID=075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718;K8S_POD_UID=29828f55-427b-4fe3-8713-03bcd6ac9dec" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-sw6sx] networking: Multus: [openshift-marketplace/certified-operators-sw6sx/29828f55-427b-4fe3-8713-03bcd6ac9dec]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-sw6sx in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-sw6sx in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sw6sx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.296167 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:02:44.296771 master-0 kubenswrapper[8606]: E1204 22:02:44.296234 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-sw6sx_openshift-marketplace(29828f55-427b-4fe3-8713-03bcd6ac9dec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-sw6sx_openshift-marketplace(29828f55-427b-4fe3-8713-03bcd6ac9dec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_certified-operators-sw6sx_openshift-marketplace_29828f55-427b-4fe3-8713-03bcd6ac9dec_0(075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718): error adding pod openshift-marketplace_certified-operators-sw6sx to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718\\\" Netns:\\\"/var/run/netns/d03284b2-0049-44ae-bf66-5d37b6183671\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-sw6sx;K8S_POD_INFRA_CONTAINER_ID=075036a791406bac3bc674a2a0282e72f76fbaaecd69fa734f3f8f009d89f718;K8S_POD_UID=29828f55-427b-4fe3-8713-03bcd6ac9dec\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/certified-operators-sw6sx] networking: Multus: [openshift-marketplace/certified-operators-sw6sx/29828f55-427b-4fe3-8713-03bcd6ac9dec]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-sw6sx in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-sw6sx in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sw6sx?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/certified-operators-sw6sx" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" Dec 04 22:02:44.305229 master-0 kubenswrapper[8606]: E1204 22:02:44.305150 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:02:44.305229 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_0791dc66-67d9-42bd-b7c3-d45dc5513c3b_0(ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c" Netns:"/var/run/netns/fc09ba33-e1d1-4e34-88b1-d01a88e81342" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c;K8S_POD_UID=0791dc66-67d9-42bd-b7c3-d45dc5513c3b" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/0791dc66-67d9-42bd-b7c3-d45dc5513c3b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod 
installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.305229 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.305229 master-0 kubenswrapper[8606]: > Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: E1204 22:02:44.305260 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_0791dc66-67d9-42bd-b7c3-d45dc5513c3b_0(ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c" Netns:"/var/run/netns/fc09ba33-e1d1-4e34-88b1-d01a88e81342" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c;K8S_POD_UID=0791dc66-67d9-42bd-b7c3-d45dc5513c3b" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/0791dc66-67d9-42bd-b7c3-d45dc5513c3b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: > pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: E1204 22:02:44.305303 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_0791dc66-67d9-42bd-b7c3-d45dc5513c3b_0(ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c): error adding pod 
openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c" Netns:"/var/run/netns/fc09ba33-e1d1-4e34-88b1-d01a88e81342" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c;K8S_POD_UID=0791dc66-67d9-42bd-b7c3-d45dc5513c3b" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/0791dc66-67d9-42bd-b7c3-d45dc5513c3b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: > pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:02:44.305472 master-0 kubenswrapper[8606]: E1204 22:02:44.305416 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-2-master-0_openshift-kube-controller-manager(0791dc66-67d9-42bd-b7c3-d45dc5513c3b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-2-master-0_openshift-kube-controller-manager(0791dc66-67d9-42bd-b7c3-d45dc5513c3b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_0791dc66-67d9-42bd-b7c3-d45dc5513c3b_0(ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c\\\" Netns:\\\"/var/run/netns/fc09ba33-e1d1-4e34-88b1-d01a88e81342\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=ea0f90087ea7f5e76f21d1c3a07201e8d37dcb261ad533b5bc5e6684522f295c;K8S_POD_UID=0791dc66-67d9-42bd-b7c3-d45dc5513c3b\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/0791dc66-67d9-42bd-b7c3-d45dc5513c3b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: 
failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-controller-manager/installer-2-master-0" podUID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" Dec 04 22:02:44.324372 master-0 kubenswrapper[8606]: E1204 22:02:44.324278 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:02:44.324372 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-zt44t_openshift-marketplace_ce6002bb-4948-45ab-bb1d-ed65e86b6466_0(a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571): error adding pod openshift-marketplace_redhat-operators-zt44t to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571" Netns:"/var/run/netns/f98be176-89da-42e4-8aeb-bff9f243b4de" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-operators-zt44t;K8S_POD_INFRA_CONTAINER_ID=a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571;K8S_POD_UID=ce6002bb-4948-45ab-bb1d-ed65e86b6466" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-operators-zt44t] networking: Multus: [openshift-marketplace/redhat-operators-zt44t/ce6002bb-4948-45ab-bb1d-ed65e86b6466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-operators-zt44t in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-operators-zt44t in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zt44t?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.324372 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.324372 master-0 kubenswrapper[8606]: > Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: E1204 22:02:44.324380 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-zt44t_openshift-marketplace_ce6002bb-4948-45ab-bb1d-ed65e86b6466_0(a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571): error adding pod 
openshift-marketplace_redhat-operators-zt44t to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571" Netns:"/var/run/netns/f98be176-89da-42e4-8aeb-bff9f243b4de" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-operators-zt44t;K8S_POD_INFRA_CONTAINER_ID=a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571;K8S_POD_UID=ce6002bb-4948-45ab-bb1d-ed65e86b6466" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-operators-zt44t] networking: Multus: [openshift-marketplace/redhat-operators-zt44t/ce6002bb-4948-45ab-bb1d-ed65e86b6466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-operators-zt44t in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-operators-zt44t in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zt44t?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: E1204 22:02:44.324412 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-zt44t_openshift-marketplace_ce6002bb-4948-45ab-bb1d-ed65e86b6466_0(a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571): error adding pod openshift-marketplace_redhat-operators-zt44t to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571" Netns:"/var/run/netns/f98be176-89da-42e4-8aeb-bff9f243b4de" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-operators-zt44t;K8S_POD_INFRA_CONTAINER_ID=a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571;K8S_POD_UID=ce6002bb-4948-45ab-bb1d-ed65e86b6466" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-operators-zt44t] networking: Multus: [openshift-marketplace/redhat-operators-zt44t/ce6002bb-4948-45ab-bb1d-ed65e86b6466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-operators-zt44t in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-operators-zt44t in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zt44t?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:02:44.324622 master-0 kubenswrapper[8606]: E1204 22:02:44.324525 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-operators-zt44t_openshift-marketplace(ce6002bb-4948-45ab-bb1d-ed65e86b6466)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-operators-zt44t_openshift-marketplace(ce6002bb-4948-45ab-bb1d-ed65e86b6466)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-zt44t_openshift-marketplace_ce6002bb-4948-45ab-bb1d-ed65e86b6466_0(a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571): error adding pod openshift-marketplace_redhat-operators-zt44t to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571\\\" Netns:\\\"/var/run/netns/f98be176-89da-42e4-8aeb-bff9f243b4de\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-operators-zt44t;K8S_POD_INFRA_CONTAINER_ID=a77d7cf4bcaa8a471563ccb79e919260e11ded259b042c028e53ed988ad1f571;K8S_POD_UID=ce6002bb-4948-45ab-bb1d-ed65e86b6466\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-operators-zt44t] networking: Multus: [openshift-marketplace/redhat-operators-zt44t/ce6002bb-4948-45ab-bb1d-ed65e86b6466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-operators-zt44t in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-operators-zt44t in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zt44t?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-operators-zt44t" podUID="ce6002bb-4948-45ab-bb1d-ed65e86b6466" Dec 04 22:02:44.588434 master-0 kubenswrapper[8606]: I1204 22:02:44.588351 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/0.log" Dec 04 22:02:44.588434 master-0 kubenswrapper[8606]: I1204 22:02:44.588412 8606 generic.go:334] "Generic (PLEG): container finished" podID="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" containerID="93d74a7e351d1bb38ca66b99396fddaa338eac5fd2201ea238d97a8b16a1e1a0" 
exitCode=1 Dec 04 22:02:44.590935 master-0 kubenswrapper[8606]: I1204 22:02:44.590867 8606 generic.go:334] "Generic (PLEG): container finished" podID="2d142201-6e77-4828-b86b-05d4144a2f08" containerID="1129d1c5176ef3c828bda41dc553996cb75881e0c9229783b32fa908eaa25ec0" exitCode=0 Dec 04 22:02:44.591067 master-0 kubenswrapper[8606]: I1204 22:02:44.591015 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:02:44.591067 master-0 kubenswrapper[8606]: I1204 22:02:44.591053 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:02:44.591205 master-0 kubenswrapper[8606]: I1204 22:02:44.591071 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:02:44.591205 master-0 kubenswrapper[8606]: I1204 22:02:44.591090 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:02:44.591330 master-0 kubenswrapper[8606]: I1204 22:02:44.591262 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:02:44.591842 master-0 kubenswrapper[8606]: I1204 22:02:44.591779 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:02:44.592431 master-0 kubenswrapper[8606]: I1204 22:02:44.592340 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:02:44.592431 master-0 kubenswrapper[8606]: I1204 22:02:44.592375 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:02:44.592431 master-0 kubenswrapper[8606]: I1204 22:02:44.592425 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:02:44.592820 master-0 kubenswrapper[8606]: I1204 22:02:44.592551 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:02:44.597053 master-0 kubenswrapper[8606]: I1204 22:02:44.596989 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:44.597195 master-0 kubenswrapper[8606]: I1204 22:02:44.597062 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:46.611273 master-0 kubenswrapper[8606]: I1204 22:02:46.611156 8606 generic.go:334] "Generic (PLEG): container finished" podID="a544105a-5bec-456a-aef6-c160943c1f67" containerID="9820de7c24faf6bdc5aac51f81548f854bf3fa05b1f8fd46fe8346195ddc8ca4" exitCode=0 Dec 04 22:02:47.407354 master-0 kubenswrapper[8606]: E1204 22:02:47.407285 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:02:47.407739 master-0 kubenswrapper[8606]: E1204 22:02:47.407474 8606 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s" Dec 04 22:02:47.407739 master-0 kubenswrapper[8606]: I1204 22:02:47.407521 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:02:47.408727 master-0 kubenswrapper[8606]: I1204 22:02:47.408652 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:47.408727 master-0 kubenswrapper[8606]: I1204 22:02:47.408678 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6"} pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Dec 04 22:02:47.408826 master-0 kubenswrapper[8606]: I1204 22:02:47.408762 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:47.408826 master-0 kubenswrapper[8606]: I1204 22:02:47.408787 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" containerID="cri-o://0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6" gracePeriod=30 Dec 04 
22:02:47.415532 master-0 kubenswrapper[8606]: I1204 22:02:47.415463 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 04 22:02:47.754223 master-0 kubenswrapper[8606]: I1204 22:02:47.754037 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": read tcp 10.128.0.2:42754->10.128.0.53:8443: read: connection reset by peer" start-of-body= Dec 04 22:02:47.754223 master-0 kubenswrapper[8606]: I1204 22:02:47.754104 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": read tcp 10.128.0.2:42754->10.128.0.53:8443: read: connection reset by peer" Dec 04 22:02:48.401473 master-0 kubenswrapper[8606]: E1204 22:02:48.401196 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-autoscaler-operator-5f49d774cd-5m4l9.187e22322f2de411 openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:cluster-autoscaler-operator-5f49d774cd-5m4l9,UID:5598683a-cd32-486d-8839-205829d55cc2,APIVersion:v1,ResourceVersion:8694,FieldPath:spec.containers{cluster-autoscaler-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:72bbe2c638872937108f647950ab8ad35c0428ca8ecc6a39a8314aace7d95078\" in 10.687s (10.687s including waiting). 
Image size: 450841337 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:01:42.911058961 +0000 UTC m=+67.721361176,LastTimestamp:2025-12-04 22:01:42.911058961 +0000 UTC m=+67.721361176,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:02:48.473125 master-0 kubenswrapper[8606]: E1204 22:02:48.472990 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Dec 04 22:02:48.630160 master-0 kubenswrapper[8606]: I1204 22:02:48.630040 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/2.log" Dec 04 22:02:48.630959 master-0 kubenswrapper[8606]: I1204 22:02:48.630894 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/1.log" Dec 04 22:02:48.631727 master-0 kubenswrapper[8606]: I1204 22:02:48.631655 8606 generic.go:334] "Generic (PLEG): container finished" podID="810c363b-a4c7-428d-a2fb-285adc29f477" containerID="0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6" exitCode=255 Dec 04 22:02:50.596971 master-0 kubenswrapper[8606]: I1204 22:02:50.596829 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:50.596971 master-0 kubenswrapper[8606]: I1204 22:02:50.596946 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:51.655412 master-0 kubenswrapper[8606]: I1204 22:02:51.655289 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba/installer/0.log" Dec 04 22:02:51.655412 master-0 kubenswrapper[8606]: I1204 22:02:51.655360 8606 generic.go:334] "Generic (PLEG): container finished" podID="3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" containerID="7f95f72da52c53d3c8d88cdae7b632b1e707bccffe42c9e45b84331a1108d0c6" exitCode=1 Dec 04 22:02:53.596532 master-0 kubenswrapper[8606]: I1204 22:02:53.596432 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:53.597253 master-0 kubenswrapper[8606]: I1204 22:02:53.596564 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:56.596559 master-0 kubenswrapper[8606]: I1204 22:02:56.596434 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:56.597445 master-0 kubenswrapper[8606]: I1204 22:02:56.596613 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:02:57.189003 master-0 kubenswrapper[8606]: E1204 22:02:57.187744 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:02:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:02:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:02:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:02:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3e65409fc2b27ad0aaeb500a39e264663d2980821f099b830b551785ce4ce8b\\\"],\\\"sizeBytes\\\":1631758507},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9014f384de5f9a0b7418d5869ad349abb9588d16bd09ed650a163c045315dbff\\\"],\\\"sizeBytes\\\":1232140918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b472823604757237c2d16bd6f6221f4cf562aa3b05942c7f602e1e8b2e55a7c6\\\"],\\\"sizeBytes\\\":983705650},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\"],\\\"sizeBytes\\\":938303566},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:61664aa69b33349cc6de45e44ae6033e7f483c034ea01c0d9a8ca08a12d88e3a\\\"],\\\"sizeBytes\\\":874825223},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:631a3798b749fecc041a99929eb946618df723e15055e805ff752a1a1273481c\\\"],\\\"sizeBytes\\\":870567329},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f1ca78c423f43f89a0411e40393642f64e4f8df9e5f61c25e31047c4cce170f9\\\"],\\\"sizeBytes\\\":857069957},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c2431a990bcddde98829abda81950247021a2ebbabc964b1516ea046b5f1d4e\\\"],\\\"sizeBytes\\\":856659740},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b12f830c3316aa4dc061c2d00c74126282b3e2bcccc301eab00d57fff3c4c7c\\\"],\\\"sizeBytes\\\":767284906},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb3ec61f9a932a9ad13bdeb44bcf9477a8d5f728151d7f19ed3ef7d4b02b3a82\\\"],\\\"sizeBytes\\\":682371258},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91656
6bb9d0143352324233d460ad94697719c11c8c9158e3aea8f475941751f\\\"],\\\"sizeBytes\\\":677523572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5451aa441e5b8d8689c032405d410c8049a849ef2edf77e5b6a5ce2838c6569b\\\"],\\\"sizeBytes\\\":672407260},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9724d2036305cbd729e1f484c5bad89971de977fff8a6723fef1873858dd1123\\\"],\\\"sizeBytes\\\":616108962},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:df606f3b71d4376d1a2108c09f0d3dab455fc30bcb67c60e91590c105e9025bf\\\"],\\\"sizeBytes\\\":583836304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:79f99fd6cce984287932edf0d009660bb488d663081f3d62ec3b23bc8bfbf6c2\\\"],\\\"sizeBytes\\\":576619763},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eddedae7578d79b5a3f748000ae5c00b9f14a04710f9f9ec7b52fc569be5dfb8\\\"],\\\"sizeBytes\\\":552673986},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd38b8be3af889b0f97e2df41517c89a11260901432a9a1ee943195bb3a22737\\\"],\\\"sizeBytes\\\":551889548},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa24edce3d740f84c40018e94cdbf2bc7375268d13d57c2d664e43a46ccea3fc\\\"],\\\"sizeBytes\\\":543227406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\\\"],\\\"sizeBytes\\\":532719167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cfde59e48cd5dee3721f34d249cb119cc3259fd857965d34f9c7ed83b0c363a1\\\"],\\\"sizeBytes\\\":532402162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f4d4282cb53325e737ad68abbfcb70687ae04fb50353f4f0ba0ba5703b15009a\\\"],\\\"sizeBytes\\\":512838054},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:8c885ea0b3c5124989f0a9b93eba98eb9fca6bbd0262772d85d90bf713a4d572\\\"],\\\"sizeBytes\\\":512452153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\\\"],\\\"sizeBytes\\\":509437356},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:97d26892192b552c16527bf2771e1b86528ab581a02dd9279cdf71c194830e3e\\\"],\\\"sizeBytes\\\":508042119},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df\\\"],\\\"sizeBytes\\\":507687221},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8375671da86aa527ee7e291d86971b0baa823ffc7663b5a983084456e76c0f59\\\"],\\\"sizeBytes\\\":506741476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:831f30660844091d6154e2674d3a9da6f34271bf8a2c40b56f7416066318742b\\\"],\\\"sizeBytes\\\":505649178},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86af77350cfe6fd69280157e4162aa0147873d9431c641ae4ad3e881ff768a73\\\"],\\\"sizeBytes\\\":505628211},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9\\\"],\\\"sizeBytes\\\":503340749},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da\\\"],\\\"sizeBytes\\\":503011144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8eabac819f289e29d75c7ab172d8124554849a47f0b00770928c3eb19a5a31c4\\\"],\\\"sizeBytes\\\":502436444},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10e57ca7611f79710f05777dc6a8f31c7e04eb09da4d8d793a5acfbf0e4692d7\\\"],\\\"sizeBytes\\\":500943492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\"],\\\"sizeBytes\\\":500848684},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d850bf8d593d3a0c\\\"],\\\"sizeBytes\\\":499798563},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\\\"],\\\"sizeBytes\\\":499705918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33a20002692769235e95271ab071783c57ff50681088fa1035b86af31e73cf20\\\"],\\\"sizeBytes\\\":499125567},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:75d996f6147edb88c09fd1a052099de66638590d7d03a735006244bc9e19f898\\\"],\\\"sizeBytes\\\":499082775},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3b8d91a25eeb9f02041e947adb3487da3e7ab8449d3d2ad015827e7954df7b34\\\"],\\\"sizeBytes\\\":490455952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f952cec1e5332b84bdffa249cd426f39087058d6544ddcec650a414c15a9b68\\\"],\\\"sizeBytes\\\":489528665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c416b201d480bddb5a4960ec42f4740761a1335001cf84ba5ae19ad6857771b1\\\"],\\\"sizeBytes\\\":481559117},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3a77aa4d03b89ea284e3467a268e5989a77a2ef63e685eb1d5c5ea5b3922b7a\\\"],\\\"sizeBytes\\\":478917802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb928c13a46d3fb45f4a881892d023a92d610a5430be0ffd916aaf8da8e7d297\\\"],\\\"sizeBytes\\\":478642572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a92c310ce30dcb3de85d6aac868e0d80919670fa29ef83d55edd96b0cae35563\\\"],\\\"sizeBytes\\\":465285478},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fd3e9f8f00a59bda7483ec7dc8a0ed602f9ca30e3d72b22072dbdf2819da3f61\\\"],\\\"sizeBytes\\\":465144618},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c1edf52f70bf9b1d1457e0c4111bc79cdaa1edd659ddbdb9d8176eff8b46956\\\"],\\\"sizeBytes\\\":462727837},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8cc27777e72233024fe84ee1faa168aec715a0b24912a3ce70715ddccba328df\\\"],\\\"sizeBytes\\\":461702648},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\\\"],\\\"sizeBytes\\\":459552216},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3ce2cbf1032ad0f24f204db73687002fcf302e86ebde3945801c74351b64576\\\"],\\\"sizeBytes\\\":458169255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7664a2d4cb10e82ed32abbf95799f43fc3d10135d7dd94799730de504a89680a\\\"],\\\"sizeBytes\\\":452589750},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4ecc5bac651ff1942865baee5159582e9602c89b47eeab18400a32abcba8f690\\\"],\\\"sizeBytes\\\":451039520}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:02:58.674929 master-0 kubenswrapper[8606]: E1204 22:02:58.674797 
8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Dec 04 22:02:58.722588 master-0 kubenswrapper[8606]: I1204 22:02:58.722440 8606 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9" exitCode=1 Dec 04 22:02:59.596691 master-0 kubenswrapper[8606]: I1204 22:02:59.596561 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:02:59.597037 master-0 kubenswrapper[8606]: I1204 22:02:59.596691 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:00.332554 master-0 kubenswrapper[8606]: I1204 22:03:00.332403 8606 patch_prober.go:28] interesting pod/etcd-operator-5bf4d88c6f-flrrb container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" start-of-body= Dec 04 22:03:00.333382 master-0 kubenswrapper[8606]: I1204 22:03:00.332590 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" podUID="ceb419e4-d804-4111-b8d8-8436cc2ee617" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.12:8443/healthz\": dial tcp 10.128.0.12:8443: connect: connection refused" Dec 04 22:03:02.597080 master-0 kubenswrapper[8606]: I1204 22:03:02.596988 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:02.597886 master-0 kubenswrapper[8606]: I1204 22:03:02.597090 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:05.596769 master-0 kubenswrapper[8606]: I1204 22:03:05.596646 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:05.596769 master-0 kubenswrapper[8606]: I1204 22:03:05.596782 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" 
podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:07.188919 master-0 kubenswrapper[8606]: E1204 22:03:07.188817 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:03:08.597737 master-0 kubenswrapper[8606]: I1204 22:03:08.597611 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:08.597737 master-0 kubenswrapper[8606]: I1204 22:03:08.597711 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:09.076023 master-0 kubenswrapper[8606]: E1204 22:03:09.075875 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="800ms" Dec 04 22:03:10.801423 master-0 kubenswrapper[8606]: I1204 22:03:10.801385 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/0.log" Dec 04 22:03:10.801971 master-0 kubenswrapper[8606]: I1204 22:03:10.801713 8606 generic.go:334] "Generic (PLEG): container finished" podID="addddaac-a31a-4dbf-b78f-87225b11b463" containerID="f32a0325771ce40043e2990b6e044b2e673986f92037baf7df71e61135c7bd82" exitCode=1 Dec 04 22:03:11.596669 master-0 kubenswrapper[8606]: I1204 22:03:11.596491 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:11.596669 master-0 kubenswrapper[8606]: I1204 22:03:11.596636 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:14.596624 master-0 kubenswrapper[8606]: I1204 22:03:14.596546 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:14.597264 master-0 kubenswrapper[8606]: I1204 22:03:14.596634 8606 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:14.830456 master-0 kubenswrapper[8606]: I1204 22:03:14.830406 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-dcf7fc84b-qmhlw_a043ea49-97f9-4ae6-83b9-733f12754d94/cluster-storage-operator/0.log" Dec 04 22:03:14.830847 master-0 kubenswrapper[8606]: I1204 22:03:14.830478 8606 generic.go:334] "Generic (PLEG): container finished" podID="a043ea49-97f9-4ae6-83b9-733f12754d94" containerID="b24a52101599e57bc25b6c160a06c23124bc447eb919bdd2267f0b91d0f6aaee" exitCode=255 Dec 04 22:03:14.833091 master-0 kubenswrapper[8606]: I1204 22:03:14.833028 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-jb4xf_7f091088-2166-4026-9fa6-62bd83407edb/openshift-controller-manager-operator/1.log" Dec 04 22:03:14.834253 master-0 kubenswrapper[8606]: I1204 22:03:14.834181 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-jb4xf_7f091088-2166-4026-9fa6-62bd83407edb/openshift-controller-manager-operator/0.log" Dec 04 22:03:14.834396 master-0 kubenswrapper[8606]: I1204 22:03:14.834263 8606 generic.go:334] "Generic (PLEG): container finished" podID="7f091088-2166-4026-9fa6-62bd83407edb" containerID="3c8faa0cec9898a47039ead85f90eab240ebf83ecd040f53acd3c80c7bec151c" exitCode=255 Dec 04 22:03:17.189750 master-0 kubenswrapper[8606]: E1204 22:03:17.189681 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:03:17.597043 master-0 kubenswrapper[8606]: I1204 22:03:17.596964 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:17.597309 master-0 kubenswrapper[8606]: I1204 22:03:17.597058 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:19.877237 master-0 kubenswrapper[8606]: E1204 22:03:19.876969 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Dec 04 22:03:20.596406 master-0 kubenswrapper[8606]: I1204 22:03:20.596294 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:20.596841 master-0 kubenswrapper[8606]: I1204 22:03:20.596460 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:21.418151 master-0 kubenswrapper[8606]: E1204 22:03:21.418075 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:03:21.418724 master-0 kubenswrapper[8606]: E1204 22:03:21.418326 8606 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.011s" Dec 04 22:03:21.418724 master-0 kubenswrapper[8606]: I1204 22:03:21.418355 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:03:21.423716 master-0 kubenswrapper[8606]: I1204 22:03:21.423683 8606 scope.go:117] "RemoveContainer" containerID="56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9" Dec 04 22:03:21.427283 master-0 kubenswrapper[8606]: I1204 22:03:21.427241 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 04 22:03:22.404184 master-0 kubenswrapper[8606]: E1204 22:03:22.403941 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-baremetal-operator-78f758c7b9-44srj.187e2232307bf9a6 openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:cluster-baremetal-operator-78f758c7b9-44srj,UID:a3899a38-39b8-4b48-81e5-4d8854ecc8ab,APIVersion:v1,ResourceVersion:8722,FieldPath:spec.containers{cluster-baremetal-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a92c310ce30dcb3de85d6aac868e0d80919670fa29ef83d55edd96b0cae35563\" in 9.337s (9.337s including waiting). 
Image size: 465285478 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:01:42.93295351 +0000 UTC m=+67.743255735,LastTimestamp:2025-12-04 22:01:42.93295351 +0000 UTC m=+67.743255735,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:03:23.596394 master-0 kubenswrapper[8606]: I1204 22:03:23.596279 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:23.596394 master-0 kubenswrapper[8606]: I1204 22:03:23.596379 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:26.596984 master-0 kubenswrapper[8606]: I1204 22:03:26.596846 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:26.596984 master-0 kubenswrapper[8606]: I1204 22:03:26.596964 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:27.192167 master-0 kubenswrapper[8606]: E1204 22:03:27.192075 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:03:29.596448 master-0 kubenswrapper[8606]: I1204 22:03:29.596368 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:29.597061 master-0 kubenswrapper[8606]: I1204 22:03:29.596460 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:31.631985 master-0 kubenswrapper[8606]: E1204 22:03:31.631899 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 04 22:03:32.597050 master-0 
kubenswrapper[8606]: I1204 22:03:32.596965 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:32.597050 master-0 kubenswrapper[8606]: I1204 22:03:32.597047 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:35.596462 master-0 kubenswrapper[8606]: I1204 22:03:35.596302 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:35.596462 master-0 kubenswrapper[8606]: I1204 22:03:35.596400 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:37.193573 master-0 kubenswrapper[8606]: E1204 22:03:37.193444 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:03:37.193573 master-0 kubenswrapper[8606]: E1204 22:03:37.193538 8606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 22:03:38.599049 master-0 kubenswrapper[8606]: I1204 22:03:38.598928 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:38.600563 master-0 kubenswrapper[8606]: I1204 22:03:38.599039 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:41.596662 master-0 kubenswrapper[8606]: I1204 22:03:41.596492 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:41.597448 master-0 kubenswrapper[8606]: I1204 22:03:41.596669 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" 
podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:43.358223 master-0 kubenswrapper[8606]: I1204 22:03:43.358077 8606 status_manager.go:851] "Failed to get status for pod" podUID="a043ea49-97f9-4ae6-83b9-733f12754d94" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods cluster-storage-operator-dcf7fc84b-qmhlw)" Dec 04 22:03:44.035696 master-0 kubenswrapper[8606]: I1204 22:03:44.035597 8606 generic.go:334] "Generic (PLEG): container finished" podID="c6a5d14d-0409-4024-b0a8-200fa2594185" containerID="bad146c5ce315f7f5070081135587df7a077e864def57e2c38a773560069cf17" exitCode=0 Dec 04 22:03:44.597337 master-0 kubenswrapper[8606]: I1204 22:03:44.597231 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:44.598300 master-0 kubenswrapper[8606]: I1204 22:03:44.597339 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:44.833393 master-0 kubenswrapper[8606]: E1204 22:03:44.833264 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 04 22:03:45.474835 master-0 kubenswrapper[8606]: E1204 22:03:45.474758 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:03:45.474835 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager_c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d_0(4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558): error adding pod openshift-operator-lifecycle-manager_packageserver-7b4bc6c685-l6dfn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558" Netns:"/var/run/netns/1e2561ad-e25e-4fbd-978b-f4323b7d7b73" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-7b4bc6c685-l6dfn;K8S_POD_INFRA_CONTAINER_ID=4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558;K8S_POD_UID=c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of 
cluster comm: SetNetworkStatus: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-7b4bc6c685-l6dfn?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.474835 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.474835 master-0 kubenswrapper[8606]: > Dec 04 22:03:45.475021 master-0 kubenswrapper[8606]: E1204 22:03:45.474868 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:03:45.475021 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager_c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d_0(4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558): error adding pod openshift-operator-lifecycle-manager_packageserver-7b4bc6c685-l6dfn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558" Netns:"/var/run/netns/1e2561ad-e25e-4fbd-978b-f4323b7d7b73" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-7b4bc6c685-l6dfn;K8S_POD_INFRA_CONTAINER_ID=4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558;K8S_POD_UID=c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-7b4bc6c685-l6dfn?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.475021 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.475021 master-0 kubenswrapper[8606]: > pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:03:45.475021 master-0 kubenswrapper[8606]: E1204 22:03:45.474903 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:03:45.475021 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager_c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d_0(4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558): error adding pod openshift-operator-lifecycle-manager_packageserver-7b4bc6c685-l6dfn to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558" Netns:"/var/run/netns/1e2561ad-e25e-4fbd-978b-f4323b7d7b73" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-7b4bc6c685-l6dfn;K8S_POD_INFRA_CONTAINER_ID=4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558;K8S_POD_UID=c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-7b4bc6c685-l6dfn?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.475021 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.475021 master-0 kubenswrapper[8606]: > pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:03:45.475232 master-0 kubenswrapper[8606]: E1204 22:03:45.475006 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager(c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager(c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-7b4bc6c685-l6dfn_openshift-operator-lifecycle-manager_c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d_0(4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558): error adding pod openshift-operator-lifecycle-manager_packageserver-7b4bc6c685-l6dfn to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558\\\" Netns:\\\"/var/run/netns/1e2561ad-e25e-4fbd-978b-f4323b7d7b73\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-7b4bc6c685-l6dfn;K8S_POD_INFRA_CONTAINER_ID=4b0e7ca02be24357d604f6dce3de1bcc4f98c003c2c1344963fbd9faa28f4558;K8S_POD_UID=c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\\\" Path:\\\"\\\" 
ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-7b4bc6c685-l6dfn in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-7b4bc6c685-l6dfn?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" Dec 04 22:03:45.596924 master-0 kubenswrapper[8606]: E1204 22:03:45.596824 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:03:45.596924 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-sw6sx_openshift-marketplace_29828f55-427b-4fe3-8713-03bcd6ac9dec_0(4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0): error adding pod openshift-marketplace_certified-operators-sw6sx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0" Netns:"/var/run/netns/15b2eb90-78fa-476a-9028-f25fa4ba1943" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-sw6sx;K8S_POD_INFRA_CONTAINER_ID=4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0;K8S_POD_UID=29828f55-427b-4fe3-8713-03bcd6ac9dec" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-sw6sx] networking: Multus: [openshift-marketplace/certified-operators-sw6sx/29828f55-427b-4fe3-8713-03bcd6ac9dec]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-sw6sx in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-sw6sx in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sw6sx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.596924 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.596924 master-0 kubenswrapper[8606]: > Dec 04 
22:03:45.597145 master-0 kubenswrapper[8606]: E1204 22:03:45.596960 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:03:45.597145 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-sw6sx_openshift-marketplace_29828f55-427b-4fe3-8713-03bcd6ac9dec_0(4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0): error adding pod openshift-marketplace_certified-operators-sw6sx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0" Netns:"/var/run/netns/15b2eb90-78fa-476a-9028-f25fa4ba1943" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-sw6sx;K8S_POD_INFRA_CONTAINER_ID=4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0;K8S_POD_UID=29828f55-427b-4fe3-8713-03bcd6ac9dec" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-sw6sx] networking: Multus: [openshift-marketplace/certified-operators-sw6sx/29828f55-427b-4fe3-8713-03bcd6ac9dec]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-sw6sx in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-sw6sx in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sw6sx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.597145 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.597145 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:03:45.597145 master-0 kubenswrapper[8606]: E1204 22:03:45.597007 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:03:45.597145 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-sw6sx_openshift-marketplace_29828f55-427b-4fe3-8713-03bcd6ac9dec_0(4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0): error adding pod openshift-marketplace_certified-operators-sw6sx to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0" Netns:"/var/run/netns/15b2eb90-78fa-476a-9028-f25fa4ba1943" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-sw6sx;K8S_POD_INFRA_CONTAINER_ID=4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0;K8S_POD_UID=29828f55-427b-4fe3-8713-03bcd6ac9dec" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-sw6sx] networking: Multus: [openshift-marketplace/certified-operators-sw6sx/29828f55-427b-4fe3-8713-03bcd6ac9dec]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-sw6sx in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-sw6sx in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sw6sx?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.597145 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.597145 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:03:45.597145 master-0 kubenswrapper[8606]: E1204 22:03:45.597099 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-sw6sx_openshift-marketplace(29828f55-427b-4fe3-8713-03bcd6ac9dec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-sw6sx_openshift-marketplace(29828f55-427b-4fe3-8713-03bcd6ac9dec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-sw6sx_openshift-marketplace_29828f55-427b-4fe3-8713-03bcd6ac9dec_0(4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0): error adding pod openshift-marketplace_certified-operators-sw6sx to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0\\\" Netns:\\\"/var/run/netns/15b2eb90-78fa-476a-9028-f25fa4ba1943\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-sw6sx;K8S_POD_INFRA_CONTAINER_ID=4a26ab4d58aa6d834b51f095bade215900f3763a3d49728e4fb673e79c3c3ae0;K8S_POD_UID=29828f55-427b-4fe3-8713-03bcd6ac9dec\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/certified-operators-sw6sx] networking: Multus: [openshift-marketplace/certified-operators-sw6sx/29828f55-427b-4fe3-8713-03bcd6ac9dec]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-sw6sx in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-sw6sx in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sw6sx?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/certified-operators-sw6sx" 
podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" Dec 04 22:03:45.606708 master-0 kubenswrapper[8606]: E1204 22:03:45.606548 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:03:45.606708 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-zt44t_openshift-marketplace_ce6002bb-4948-45ab-bb1d-ed65e86b6466_0(0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3): error adding pod openshift-marketplace_redhat-operators-zt44t to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3" Netns:"/var/run/netns/f1665931-a5f6-4892-8b3a-49b6f706a055" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-operators-zt44t;K8S_POD_INFRA_CONTAINER_ID=0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3;K8S_POD_UID=ce6002bb-4948-45ab-bb1d-ed65e86b6466" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-operators-zt44t] networking: Multus: [openshift-marketplace/redhat-operators-zt44t/ce6002bb-4948-45ab-bb1d-ed65e86b6466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-operators-zt44t in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-operators-zt44t in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zt44t?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.606708 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.606708 master-0 kubenswrapper[8606]: > Dec 04 22:03:45.606708 master-0 kubenswrapper[8606]: E1204 22:03:45.606658 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:03:45.606708 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-zt44t_openshift-marketplace_ce6002bb-4948-45ab-bb1d-ed65e86b6466_0(0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3): error adding pod openshift-marketplace_redhat-operators-zt44t to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3" Netns:"/var/run/netns/f1665931-a5f6-4892-8b3a-49b6f706a055" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-operators-zt44t;K8S_POD_INFRA_CONTAINER_ID=0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3;K8S_POD_UID=ce6002bb-4948-45ab-bb1d-ed65e86b6466" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-operators-zt44t] networking: Multus: [openshift-marketplace/redhat-operators-zt44t/ce6002bb-4948-45ab-bb1d-ed65e86b6466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod 
redhat-operators-zt44t in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-operators-zt44t in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zt44t?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.606708 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.606708 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:03:45.607572 master-0 kubenswrapper[8606]: E1204 22:03:45.606728 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:03:45.607572 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-zt44t_openshift-marketplace_ce6002bb-4948-45ab-bb1d-ed65e86b6466_0(0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3): error adding pod openshift-marketplace_redhat-operators-zt44t to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3" Netns:"/var/run/netns/f1665931-a5f6-4892-8b3a-49b6f706a055" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-operators-zt44t;K8S_POD_INFRA_CONTAINER_ID=0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3;K8S_POD_UID=ce6002bb-4948-45ab-bb1d-ed65e86b6466" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-operators-zt44t] networking: Multus: [openshift-marketplace/redhat-operators-zt44t/ce6002bb-4948-45ab-bb1d-ed65e86b6466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-operators-zt44t in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-operators-zt44t in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zt44t?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.607572 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.607572 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:03:45.607572 master-0 kubenswrapper[8606]: E1204 22:03:45.606825 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-operators-zt44t_openshift-marketplace(ce6002bb-4948-45ab-bb1d-ed65e86b6466)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"redhat-operators-zt44t_openshift-marketplace(ce6002bb-4948-45ab-bb1d-ed65e86b6466)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-operators-zt44t_openshift-marketplace_ce6002bb-4948-45ab-bb1d-ed65e86b6466_0(0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3): error adding pod openshift-marketplace_redhat-operators-zt44t to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3\\\" Netns:\\\"/var/run/netns/f1665931-a5f6-4892-8b3a-49b6f706a055\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-operators-zt44t;K8S_POD_INFRA_CONTAINER_ID=0e2ee042a8526ef8a83f71dcc3aac9d31b83ead8076f0b0f9eb9f6b67b4d7aa3;K8S_POD_UID=ce6002bb-4948-45ab-bb1d-ed65e86b6466\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-operators-zt44t] networking: Multus: [openshift-marketplace/redhat-operators-zt44t/ce6002bb-4948-45ab-bb1d-ed65e86b6466]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-operators-zt44t in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-operators-zt44t in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-zt44t?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-operators-zt44t" podUID="ce6002bb-4948-45ab-bb1d-ed65e86b6466" Dec 04 22:03:45.613245 master-0 kubenswrapper[8606]: E1204 22:03:45.613176 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:03:45.613245 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-sdrkm_openshift-marketplace_ae107ad4-104c-4264-9844-afb3af28b19e_0(66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617): error adding pod openshift-marketplace_redhat-marketplace-sdrkm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617" Netns:"/var/run/netns/d03817ed-4dab-44b6-9fb3-c8203861aaf7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-sdrkm;K8S_POD_INFRA_CONTAINER_ID=66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617;K8S_POD_UID=ae107ad4-104c-4264-9844-afb3af28b19e" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-sdrkm] networking: Multus: [openshift-marketplace/redhat-marketplace-sdrkm/ae107ad4-104c-4264-9844-afb3af28b19e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod 
redhat-marketplace-sdrkm in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-sdrkm?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.613245 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.613245 master-0 kubenswrapper[8606]: > Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: E1204 22:03:45.613255 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-sdrkm_openshift-marketplace_ae107ad4-104c-4264-9844-afb3af28b19e_0(66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617): error adding pod openshift-marketplace_redhat-marketplace-sdrkm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617" Netns:"/var/run/netns/d03817ed-4dab-44b6-9fb3-c8203861aaf7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-sdrkm;K8S_POD_INFRA_CONTAINER_ID=66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617;K8S_POD_UID=ae107ad4-104c-4264-9844-afb3af28b19e" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-sdrkm] networking: Multus: [openshift-marketplace/redhat-marketplace-sdrkm/ae107ad4-104c-4264-9844-afb3af28b19e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-sdrkm?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: E1204 22:03:45.613278 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-sdrkm_openshift-marketplace_ae107ad4-104c-4264-9844-afb3af28b19e_0(66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617): error 
adding pod openshift-marketplace_redhat-marketplace-sdrkm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617" Netns:"/var/run/netns/d03817ed-4dab-44b6-9fb3-c8203861aaf7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-sdrkm;K8S_POD_INFRA_CONTAINER_ID=66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617;K8S_POD_UID=ae107ad4-104c-4264-9844-afb3af28b19e" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-sdrkm] networking: Multus: [openshift-marketplace/redhat-marketplace-sdrkm/ae107ad4-104c-4264-9844-afb3af28b19e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-sdrkm?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:03:45.613723 master-0 kubenswrapper[8606]: E1204 22:03:45.613352 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-sdrkm_openshift-marketplace(ae107ad4-104c-4264-9844-afb3af28b19e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-sdrkm_openshift-marketplace(ae107ad4-104c-4264-9844-afb3af28b19e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-sdrkm_openshift-marketplace_ae107ad4-104c-4264-9844-afb3af28b19e_0(66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617): error adding pod openshift-marketplace_redhat-marketplace-sdrkm to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617\\\" Netns:\\\"/var/run/netns/d03817ed-4dab-44b6-9fb3-c8203861aaf7\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-sdrkm;K8S_POD_INFRA_CONTAINER_ID=66105594364cd12fc17f1e7baf1f723ed02bdd2e7e37f015a7b233674846d617;K8S_POD_UID=ae107ad4-104c-4264-9844-afb3af28b19e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-sdrkm] networking: Multus: [openshift-marketplace/redhat-marketplace-sdrkm/ae107ad4-104c-4264-9844-afb3af28b19e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-sdrkm in out of cluster comm: status 
update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-sdrkm?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-marketplace-sdrkm" podUID="ae107ad4-104c-4264-9844-afb3af28b19e" Dec 04 22:03:45.618535 master-0 kubenswrapper[8606]: E1204 22:03:45.618456 8606 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 22:03:45.618535 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-vvkjf_openshift-marketplace_2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae_0(f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0): error adding pod openshift-marketplace_community-operators-vvkjf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0" Netns:"/var/run/netns/84d0a42d-2373-4d01-9413-c37691365f48" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-vvkjf;K8S_POD_INFRA_CONTAINER_ID=f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0;K8S_POD_UID=2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-vvkjf] networking: Multus: [openshift-marketplace/community-operators-vvkjf/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-vvkjf in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-vvkjf in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vvkjf?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.618535 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.618535 master-0 kubenswrapper[8606]: > Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: E1204 22:03:45.618536 8606 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-vvkjf_openshift-marketplace_2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae_0(f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0): error adding pod openshift-marketplace_community-operators-vvkjf to CNI network 
"multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0" Netns:"/var/run/netns/84d0a42d-2373-4d01-9413-c37691365f48" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-vvkjf;K8S_POD_INFRA_CONTAINER_ID=f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0;K8S_POD_UID=2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-vvkjf] networking: Multus: [openshift-marketplace/community-operators-vvkjf/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-vvkjf in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-vvkjf in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vvkjf?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: E1204 22:03:45.618562 8606 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-vvkjf_openshift-marketplace_2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae_0(f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0): error adding pod openshift-marketplace_community-operators-vvkjf to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0" Netns:"/var/run/netns/84d0a42d-2373-4d01-9413-c37691365f48" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-vvkjf;K8S_POD_INFRA_CONTAINER_ID=f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0;K8S_POD_UID=2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-vvkjf] networking: Multus: [openshift-marketplace/community-operators-vvkjf/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-vvkjf in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-vvkjf in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vvkjf?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: > pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:03:45.618805 master-0 kubenswrapper[8606]: E1204 22:03:45.618630 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-vvkjf_openshift-marketplace(2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-vvkjf_openshift-marketplace(2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-vvkjf_openshift-marketplace_2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae_0(f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0): error adding pod openshift-marketplace_community-operators-vvkjf to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0\\\" Netns:\\\"/var/run/netns/84d0a42d-2373-4d01-9413-c37691365f48\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-vvkjf;K8S_POD_INFRA_CONTAINER_ID=f78e1da303a2ea79c1eaf8433fa6761f74d2e30feb360dbde74151e4651521a0;K8S_POD_UID=2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/community-operators-vvkjf] networking: Multus: [openshift-marketplace/community-operators-vvkjf/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-vvkjf in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-vvkjf in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vvkjf?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/community-operators-vvkjf" podUID="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" Dec 04 22:03:47.596804 master-0 kubenswrapper[8606]: I1204 22:03:47.596681 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:03:47.597424 master-0 kubenswrapper[8606]: I1204 22:03:47.596815 8606 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:03:48.070645 master-0 kubenswrapper[8606]: I1204 22:03:48.070578 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/0.log" Dec 04 22:03:48.070928 master-0 kubenswrapper[8606]: I1204 22:03:48.070669 8606 generic.go:334] "Generic (PLEG): container finished" podID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" containerID="264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb" exitCode=1 Dec 04 22:03:49.083460 master-0 kubenswrapper[8606]: I1204 22:03:49.083368 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-nxbjw_ce6b5a46-172b-4575-ba22-ff3c6ea4207f/manager/0.log" Dec 04 22:03:49.083460 master-0 kubenswrapper[8606]: I1204 22:03:49.083453 8606 generic.go:334] "Generic (PLEG): container finished" podID="ce6b5a46-172b-4575-ba22-ff3c6ea4207f" containerID="037f05faa0b4635e20f5127ded6c5b63a2893aa9715387918fd80e11092dcfbb" exitCode=1 Dec 04 22:03:49.086202 master-0 kubenswrapper[8606]: I1204 22:03:49.086131 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/0.log" Dec 04 22:03:49.087326 master-0 kubenswrapper[8606]: I1204 22:03:49.087219 8606 generic.go:334] "Generic (PLEG): container finished" podID="fb0274dc-fac1-41f9-b3e5-77253d851fdf" containerID="5b7837bb8d893076191e798bbe6f7756d536495c527346610e4cc8ec29e29fe5" exitCode=1 Dec 04 22:03:51.595868 master-0 kubenswrapper[8606]: I1204 22:03:51.595725 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:03:51.596569 master-0 kubenswrapper[8606]: I1204 22:03:51.595893 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:03:52.269286 master-0 kubenswrapper[8606]: I1204 22:03:52.269171 8606 patch_prober.go:28] interesting pod/marketplace-operator-f797b99b6-m9m4h container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 04 22:03:52.269286 master-0 kubenswrapper[8606]: I1204 22:03:52.269173 8606 patch_prober.go:28] interesting pod/marketplace-operator-f797b99b6-m9m4h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" 
start-of-body= Dec 04 22:03:52.269704 master-0 kubenswrapper[8606]: I1204 22:03:52.269299 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" podUID="c6a5d14d-0409-4024-b0a8-200fa2594185" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 04 22:03:52.269704 master-0 kubenswrapper[8606]: I1204 22:03:52.269340 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" podUID="c6a5d14d-0409-4024-b0a8-200fa2594185" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 04 22:03:54.597147 master-0 kubenswrapper[8606]: I1204 22:03:54.596984 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:03:54.598462 master-0 kubenswrapper[8606]: I1204 22:03:54.597193 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:03:55.429894 master-0 kubenswrapper[8606]: E1204 22:03:55.429785 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:03:55.430245 master-0 kubenswrapper[8606]: E1204 22:03:55.430198 8606 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.012s" Dec 04 22:03:55.431307 master-0 kubenswrapper[8606]: I1204 22:03:55.431256 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:03:55.432132 master-0 kubenswrapper[8606]: I1204 22:03:55.432060 8606 scope.go:117] "RemoveContainer" containerID="f32a0325771ce40043e2990b6e044b2e673986f92037baf7df71e61135c7bd82" Dec 04 22:03:55.432997 master-0 kubenswrapper[8606]: I1204 22:03:55.432920 8606 scope.go:117] "RemoveContainer" containerID="49e5b6467d42b24a4142a36b3091700faf9ab3af4e0dd62b2e3ca1fd3da47a30" Dec 04 22:03:55.433107 master-0 kubenswrapper[8606]: I1204 22:03:55.433056 8606 scope.go:117] "RemoveContainer" containerID="cb9981e4dfed9821dbae6b8b7a8e8e8f099f873bacacc6149961ccf58995e524" Dec 04 22:03:55.434214 master-0 kubenswrapper[8606]: I1204 22:03:55.433946 8606 scope.go:117] "RemoveContainer" containerID="9d7fd4b64c7f9d10b43359385a6360e49aa71c5085c781ef53642cd82a85d004" Dec 04 22:03:55.434214 master-0 kubenswrapper[8606]: I1204 22:03:55.434017 8606 scope.go:117] "RemoveContainer" containerID="b24a52101599e57bc25b6c160a06c23124bc447eb919bdd2267f0b91d0f6aaee" Dec 04 22:03:55.434486 master-0 kubenswrapper[8606]: I1204 22:03:55.434266 8606 scope.go:117] "RemoveContainer" containerID="264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb" Dec 04 22:03:55.435640 master-0 kubenswrapper[8606]: I1204 22:03:55.435435 8606 scope.go:117] "RemoveContainer" containerID="037f05faa0b4635e20f5127ded6c5b63a2893aa9715387918fd80e11092dcfbb" Dec 04 22:03:55.435640 master-0 kubenswrapper[8606]: I1204 22:03:55.435594 8606 scope.go:117] "RemoveContainer" containerID="d8a2de466dc95e948ba536210f040992057ba7bc222a8102fb88249ab34f040a" Dec 04 22:03:55.436958 master-0 kubenswrapper[8606]: I1204 22:03:55.436802 8606 scope.go:117] "RemoveContainer" containerID="c77537fc4f2900520f8e93c8fc7a9508c178081936170d16a0dcd4122f2c7777" Dec 04 22:03:55.437167 master-0 kubenswrapper[8606]: I1204 22:03:55.437131 8606 scope.go:117] "RemoveContainer" containerID="9820de7c24faf6bdc5aac51f81548f854bf3fa05b1f8fd46fe8346195ddc8ca4" Dec 04 22:03:55.441969 master-0 kubenswrapper[8606]: I1204 22:03:55.441917 8606 scope.go:117] "RemoveContainer" containerID="a679264390b031ae4f297359e8c908ad01e2a92651d2cb70742a5a02fd398618" Dec 04 22:03:55.442271 master-0 kubenswrapper[8606]: I1204 22:03:55.442227 8606 scope.go:117] "RemoveContainer" containerID="f701b6e27b366f9b3e2d799e563c87e892e7b625684a50d11abda6232179d479" Dec 04 22:03:55.442766 master-0 kubenswrapper[8606]: I1204 22:03:55.442450 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:03:55.447711 master-0 kubenswrapper[8606]: I1204 22:03:55.447651 8606 scope.go:117] "RemoveContainer" containerID="93d74a7e351d1bb38ca66b99396fddaa338eac5fd2201ea238d97a8b16a1e1a0" Dec 04 22:03:55.453673 master-0 kubenswrapper[8606]: I1204 22:03:55.450448 8606 scope.go:117] "RemoveContainer" containerID="466a053aebc195d2f55d104f73cf9c35f09469c457c1576c051e6861f31f8a13" Dec 04 22:03:55.453673 master-0 kubenswrapper[8606]: I1204 22:03:55.451138 8606 scope.go:117] "RemoveContainer" containerID="1129d1c5176ef3c828bda41dc553996cb75881e0c9229783b32fa908eaa25ec0" Dec 04 22:03:55.453673 master-0 kubenswrapper[8606]: I1204 22:03:55.451235 8606 scope.go:117] "RemoveContainer" containerID="bad146c5ce315f7f5070081135587df7a077e864def57e2c38a773560069cf17" Dec 04 22:03:55.454042 master-0 kubenswrapper[8606]: I1204 22:03:55.453990 8606 scope.go:117] "RemoveContainer" containerID="3c8faa0cec9898a47039ead85f90eab240ebf83ecd040f53acd3c80c7bec151c" Dec 04 22:03:55.457102 master-0 kubenswrapper[8606]: I1204 22:03:55.457059 8606 scope.go:117] "RemoveContainer" containerID="aec30a53010adc6ee6176e40e860c2639cbdf974b27b2d24e1d71f75f8a5c427" Dec 04 22:03:55.459098 master-0 kubenswrapper[8606]: I1204 22:03:55.458692 8606 scope.go:117] "RemoveContainer" containerID="5b7837bb8d893076191e798bbe6f7756d536495c527346610e4cc8ec29e29fe5" Dec 04 22:03:55.467540 master-0 kubenswrapper[8606]: I1204 22:03:55.467456 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 04 22:03:56.137676 master-0 kubenswrapper[8606]: I1204 22:03:56.137613 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-dcf7fc84b-qmhlw_a043ea49-97f9-4ae6-83b9-733f12754d94/cluster-storage-operator/0.log" Dec 04 22:03:56.139567 master-0 kubenswrapper[8606]: I1204 22:03:56.139533 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/0.log" Dec 04 22:03:56.141437 master-0 kubenswrapper[8606]: I1204 22:03:56.141403 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/0.log" Dec 04 22:03:56.144290 master-0 kubenswrapper[8606]: I1204 22:03:56.144259 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-nxbjw_ce6b5a46-172b-4575-ba22-ff3c6ea4207f/manager/0.log" Dec 04 22:03:56.146186 master-0 kubenswrapper[8606]: I1204 22:03:56.146154 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-8lq7w_871cb002-67f4-43aa-a41d-7a5b2f340059/network-operator/0.log" Dec 04 22:03:56.148139 master-0 kubenswrapper[8606]: I1204 22:03:56.148103 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/0.log" Dec 04 22:03:56.157212 master-0 kubenswrapper[8606]: I1204 22:03:56.157175 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/0.log" Dec 04 22:03:56.162196 master-0 kubenswrapper[8606]: I1204 
22:03:56.161939 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-nk92d_634c1df6-de4d-4e26-8c71-d39311cae0ce/approver/0.log" Dec 04 22:03:56.167682 master-0 kubenswrapper[8606]: I1204 22:03:56.167645 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-jb4xf_7f091088-2166-4026-9fa6-62bd83407edb/openshift-controller-manager-operator/1.log" Dec 04 22:03:56.168716 master-0 kubenswrapper[8606]: I1204 22:03:56.168686 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-jb4xf_7f091088-2166-4026-9fa6-62bd83407edb/openshift-controller-manager-operator/0.log" Dec 04 22:03:56.384654 master-0 kubenswrapper[8606]: I1204 22:03:56.384578 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba/installer/0.log" Dec 04 22:03:56.384654 master-0 kubenswrapper[8606]: I1204 22:03:56.384658 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:03:56.406990 master-0 kubenswrapper[8606]: E1204 22:03:56.406844 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-config-operator-68758cbcdb-fg6vx.187e223230fd2d59 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-68758cbcdb-fg6vx,UID:810c363b-a4c7-428d-a2fb-285adc29f477,APIVersion:v1,ResourceVersion:8776,FieldPath:spec.initContainers{openshift-api},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b00c658332d6c6786bd969b26097c20a78c79c045f1692a8809234f5fb586c22\" in 9.346s (9.346s including waiting). Image size: 433122306 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:01:42.941420889 +0000 UTC m=+67.751723114,LastTimestamp:2025-12-04 22:01:42.941420889 +0000 UTC m=+67.751723114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:03:56.519253 master-0 kubenswrapper[8606]: I1204 22:03:56.519177 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-var-lock\") pod \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " Dec 04 22:03:56.519419 master-0 kubenswrapper[8606]: I1204 22:03:56.519302 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kubelet-dir\") pod \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " Dec 04 22:03:56.519419 master-0 kubenswrapper[8606]: I1204 22:03:56.519295 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-var-lock" (OuterVolumeSpecName: "var-lock") pod "3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" (UID: "3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:03:56.519419 master-0 kubenswrapper[8606]: I1204 22:03:56.519384 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kube-api-access\") pod \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\" (UID: \"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba\") " Dec 04 22:03:56.519656 master-0 kubenswrapper[8606]: I1204 22:03:56.519487 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" (UID: "3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:03:56.520105 master-0 kubenswrapper[8606]: I1204 22:03:56.520042 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:03:56.520194 master-0 kubenswrapper[8606]: I1204 22:03:56.520105 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:03:56.524051 master-0 kubenswrapper[8606]: I1204 22:03:56.523996 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" (UID: "3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:03:56.623211 master-0 kubenswrapper[8606]: I1204 22:03:56.623084 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:03:57.191744 master-0 kubenswrapper[8606]: I1204 22:03:57.191636 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba/installer/0.log" Dec 04 22:03:57.192622 master-0 kubenswrapper[8606]: I1204 22:03:57.191820 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:03:57.413694 master-0 kubenswrapper[8606]: E1204 22:03:57.413273 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:03:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:03:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:03:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:03:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3e65409fc2b27ad0aaeb500a39e264663d2980821f099b830b551785ce4ce8b\\\"],\\\"sizeBytes\\\":1631758507},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9014f384de5f9a0b7418d5869ad349abb9588d16bd09ed650a163c045315dbff\\\"],\\\"sizeBytes\\\":1232140918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b472823604757237c2d16bd6f6221f4cf562aa3b05942c7f602e1e8b2e55a7c6\\\"],\\\"sizeBytes\\\":983705650},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\"],\\\"sizeBytes\\\":938303566},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:61664aa69b33349cc6de45e44ae6033e7f483c034ea01c0d9a8ca08a12d88e3a\\\"],\\\"sizeBytes\\\":874825223},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:631a3798b749fecc041a99929eb946618df723e15055e805ff752a1a1273481c\\\"],\\\"sizeBytes\\\":870567329},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f1ca78c423f43f89a0411e40393642f64e4f8df9e5f61c25e31047c4cce170f9\\\"],\\\"sizeBytes\\\":857069957},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c2431a990bcddde98829abda81950247021a2ebbabc964b1516ea046b5f1d4e\\\"],\\\"sizeBytes\\\":856659740},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b12f830c3316aa4dc061c2d00c74126282b3e2bcccc301eab00d57fff3c4c7c\\\"],\\\"sizeBytes\\\":767284906},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb3ec61f9a932a9ad13bdeb44bcf9477a8d5f728151d7f19ed3ef7d4b02b3a82\\\"],\\\"sizeBytes\\\":682371258},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:916566bb9d0143352324233d460ad94697719c11c8c9158e3aea8f475941751f\\\"],\\\"sizeBytes\\\":677523572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5451aa441e5b8d8689c032405d410c8049a849ef2edf77e5b6a5ce2838c6569b\\\"],\\\"sizeBytes\\\":672407260},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9724d2036305cbd729e1f484c5bad89971de977fff8a6723fef1873858dd1123\\\"],\\\"sizeBytes\\\":616108962},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:df606f3b71d4376d1a2108c09f0d3dab455fc30bcb67c60e91590c105e9025bf\\\"],\\\"sizeBytes\\\":583836304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:79f99fd6cce984287932edf0d009660bb488d663081f3d62ec3b23bc8bfbf6c2\\\"],\\\"sizeBytes\\\":576619763},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eddedae7578d79b5a3f748000ae5c00b9f14a047
10f9f9ec7b52fc569be5dfb8\\\"],\\\"sizeBytes\\\":552673986},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd38b8be3af889b0f97e2df41517c89a11260901432a9a1ee943195bb3a22737\\\"],\\\"sizeBytes\\\":551889548},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa24edce3d740f84c40018e94cdbf2bc7375268d13d57c2d664e43a46ccea3fc\\\"],\\\"sizeBytes\\\":543227406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\\\"],\\\"sizeBytes\\\":532719167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cfde59e48cd5dee3721f34d249cb119cc3259fd857965d34f9c7ed83b0c363a1\\\"],\\\"sizeBytes\\\":532402162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f4d4282cb53325e737ad68abbfcb70687ae04fb50353f4f0ba0ba5703b15009a\\\"],\\\"sizeBytes\\\":512838054},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:8c885ea0b3c5124989f0a9b93eba98eb9fca6bbd0262772d85d90bf713a4d572\\\"],\\\"sizeBytes\\\":512452153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\\\"],\\\"sizeBytes\\\":509437356},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:97d26892192b552c16527bf2771e1b86528ab581a02dd9279cdf71c194830e3e\\\"],\\\"sizeBytes\\\":508042119},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df\\\"],\\\"sizeBytes\\\":507687221},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8375671da86aa527ee7e291d86971b0baa823ffc7663b5a983084456e76c0f59\\\"],\\\"sizeBytes\\\":506741476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:831f30660844091d6154e2674d3a9da6f34271bf8a2c40b56f7416066318742b\\\"],\\\"sizeBytes\\\":505649178},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86af77350cfe6fd69280157e4162aa0147873d9431c641ae4ad3e881ff768a73\\\"],\\\"sizeBytes\\\":505628211},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9\\\"],\\\"sizeBytes\\\":503340749},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da\\\"],\\\"sizeBytes\\\":503011144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8eabac819f289e29d75c7ab172d8124554849a47f0b00770928c3eb19a5a31c4\\\"],\\\"sizeBytes\\\":502436444},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10e57ca7611f79710f05777dc6a8f31c7e04eb09da4d8d793a5acfbf0e4692d7\\\"],\\\"sizeBytes\\\":500943492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\"],\\\"sizeBytes\\\":500848684},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d850bf8d593d3a0c\\\"],\\\"sizeBytes\\\":499798563},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\\\"],\\\"sizeBytes\\\":499705918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33a20002692769235e95271ab071783c57ff50681088fa1035b86af31e73cf20\\\"],\\\"sizeBytes\\\":499125567},{\\\"names\\\":[\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:75d996f6147edb88c09fd1a052099de66638590d7d03a735006244bc9e19f898\\\"],\\\"sizeBytes\\\":499082775},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3b8d91a25eeb9f02041e947adb3487da3e7ab8449d3d2ad015827e7954df7b34\\\"],\\\"sizeBytes\\\":490455952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f952cec1e5332b84bdffa249cd426f39087058d6544ddcec650a414c15a9b68\\\"],\\\"sizeBytes\\\":489528665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c416b201d480bddb5a4960ec42f4740761a1335001cf84ba5ae19ad6857771b1\\\"],\\\"sizeBytes\\\":481559117},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3a77aa4d03b89ea284e3467a268e5989a77a2ef63e685eb1d5c5ea5b3922b7a\\\"],\\\"sizeBytes\\\":478917802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb928c13a46d3fb45f4a881892d023a92d610a5430be0ffd916aaf8da8e7d297\\\"],\\\"sizeBytes\\\":478642572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a92c310ce30dcb3de85d6aac868e0d80919670fa29ef83d55edd96b0cae35563\\\"],\\\"sizeBytes\\\":465285478},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fd3e9f8f00a59bda7483ec7dc8a0ed602f9ca30e3d72b22072dbdf2819da3f61\\\"],\\\"sizeBytes\\\":465144618},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c1edf52f70bf9b1d1457e0c4111bc79cdaa1edd659ddbdb9d8176eff8b46956\\\"],\\\"sizeBytes\\\":462727837},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8cc27777e72233024fe84ee1faa168aec715a0b24912a3ce70715ddccba328df\\\"],\\\"sizeBytes\\\":461702648},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\\\"],\\\"sizeBytes\\\":459552216},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3ce2cbf1032ad0f24f204db73687002fcf302e86ebde3945801c74351b64576\\\"],\\\"sizeBytes\\\":458169255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7664a2d4cb10e82ed32abbf95799f43fc3d10135d7dd94799730de504a89680a\\\"],\\\"sizeBytes\\\":452589750},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4ecc5bac651ff1942865baee5159582e9602c89b47eeab18400a32abcba8f690\\\"],\\\"sizeBytes\\\":451039520}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:03:57.597637 master-0 kubenswrapper[8606]: I1204 22:03:57.597532 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:03:57.597637 master-0 kubenswrapper[8606]: I1204 22:03:57.597625 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:00.596403 master-0 kubenswrapper[8606]: I1204 
22:04:00.596278 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:00.597476 master-0 kubenswrapper[8606]: I1204 22:04:00.596390 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:01.235261 master-0 kubenswrapper[8606]: E1204 22:04:01.235138 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:04:03.242724 master-0 kubenswrapper[8606]: I1204 22:04:03.242586 8606 generic.go:334] "Generic (PLEG): container finished" podID="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" containerID="8235211a3898b8961786603441645f7da3fef63f8a04f95fcc274a44a7765453" exitCode=0 Dec 04 22:04:03.597213 master-0 kubenswrapper[8606]: I1204 22:04:03.597080 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:03.597213 master-0 kubenswrapper[8606]: I1204 22:04:03.597182 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:06.596561 master-0 kubenswrapper[8606]: I1204 22:04:06.596381 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:06.596561 master-0 kubenswrapper[8606]: I1204 22:04:06.596493 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:07.414835 master-0 kubenswrapper[8606]: E1204 22:04:07.414740 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:08.093802 master-0 kubenswrapper[8606]: I1204 22:04:08.093700 8606 patch_prober.go:28] interesting pod/machine-config-daemon-ppnv8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 22:04:08.093802 master-0 kubenswrapper[8606]: I1204 22:04:08.093793 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" podUID="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 22:04:08.460684 master-0 kubenswrapper[8606]: E1204 22:04:08.460493 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 04 22:04:09.595901 master-0 kubenswrapper[8606]: I1204 22:04:09.595818 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:09.596760 master-0 kubenswrapper[8606]: I1204 22:04:09.595890 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:10.309427 master-0 kubenswrapper[8606]: I1204 22:04:10.309328 8606 generic.go:334] "Generic (PLEG): container finished" podID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerID="ab6c10b9a3e0637d5c7a14c6df7c632b34ad06eac467a51eec2ac60a0a5a71c4" exitCode=0 Dec 04 22:04:11.611750 master-0 kubenswrapper[8606]: I1204 22:04:11.611613 8606 patch_prober.go:28] interesting pod/controller-manager-86785576d9-t7jrz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" start-of-body= Dec 04 22:04:11.612495 master-0 kubenswrapper[8606]: I1204 22:04:11.611667 8606 patch_prober.go:28] interesting pod/controller-manager-86785576d9-t7jrz container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" start-of-body= Dec 04 22:04:11.612495 master-0 kubenswrapper[8606]: I1204 22:04:11.611776 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" Dec 04 22:04:11.612495 
master-0 kubenswrapper[8606]: I1204 22:04:11.611837 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" Dec 04 22:04:12.597076 master-0 kubenswrapper[8606]: I1204 22:04:12.596931 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:12.597488 master-0 kubenswrapper[8606]: I1204 22:04:12.597095 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:15.358704 master-0 kubenswrapper[8606]: I1204 22:04:15.358602 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/0.log" Dec 04 22:04:15.358704 master-0 kubenswrapper[8606]: I1204 22:04:15.358689 8606 generic.go:334] "Generic (PLEG): container finished" podID="f1534e25-7add-46a1-8f4e-0065c232aa4e" containerID="efec9b80d16091e3ba4473728d27aba3a23ca799a67ec448c19c49d6e7be1b22" exitCode=1 Dec 04 22:04:15.596258 master-0 kubenswrapper[8606]: I1204 22:04:15.596144 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:15.596258 master-0 kubenswrapper[8606]: I1204 22:04:15.596239 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:17.416121 master-0 kubenswrapper[8606]: E1204 22:04:17.415976 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:18.236588 master-0 kubenswrapper[8606]: E1204 22:04:18.236307 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:04:18.383986 master-0 kubenswrapper[8606]: I1204 22:04:18.383906 
8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-f797d8546-4g7dd_c52111ac-30f6-47b7-a8ca-13659fbd71b4/machine-approver-controller/0.log" Dec 04 22:04:18.384562 master-0 kubenswrapper[8606]: I1204 22:04:18.384432 8606 generic.go:334] "Generic (PLEG): container finished" podID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerID="ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6" exitCode=255 Dec 04 22:04:18.538590 master-0 kubenswrapper[8606]: I1204 22:04:18.537896 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": read tcp 10.128.0.2:33670->10.128.0.53:8443: read: connection reset by peer" start-of-body= Dec 04 22:04:18.538590 master-0 kubenswrapper[8606]: I1204 22:04:18.537992 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": read tcp 10.128.0.2:33670->10.128.0.53:8443: read: connection reset by peer" Dec 04 22:04:19.394079 master-0 kubenswrapper[8606]: I1204 22:04:19.393967 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/3.log" Dec 04 22:04:19.394659 master-0 kubenswrapper[8606]: I1204 22:04:19.394619 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/2.log" Dec 04 22:04:19.395585 master-0 kubenswrapper[8606]: I1204 22:04:19.395477 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/1.log" Dec 04 22:04:19.396535 master-0 kubenswrapper[8606]: I1204 22:04:19.396418 8606 generic.go:334] "Generic (PLEG): container finished" podID="810c363b-a4c7-428d-a2fb-285adc29f477" containerID="89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940" exitCode=255 Dec 04 22:04:20.596913 master-0 kubenswrapper[8606]: I1204 22:04:20.596787 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:04:20.596913 master-0 kubenswrapper[8606]: I1204 22:04:20.596871 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:04:21.611101 master-0 kubenswrapper[8606]: I1204 22:04:21.610975 8606 patch_prober.go:28] interesting pod/controller-manager-86785576d9-t7jrz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 
10.128.0.44:8443: connect: connection refused" start-of-body= Dec 04 22:04:21.611101 master-0 kubenswrapper[8606]: I1204 22:04:21.611060 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" Dec 04 22:04:21.612148 master-0 kubenswrapper[8606]: I1204 22:04:21.611127 8606 patch_prober.go:28] interesting pod/controller-manager-86785576d9-t7jrz container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" start-of-body= Dec 04 22:04:21.612148 master-0 kubenswrapper[8606]: I1204 22:04:21.611259 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" Dec 04 22:04:23.431081 master-0 kubenswrapper[8606]: I1204 22:04:23.430989 8606 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" exitCode=1 Dec 04 22:04:23.597145 master-0 kubenswrapper[8606]: I1204 22:04:23.597037 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:04:23.597481 master-0 kubenswrapper[8606]: I1204 22:04:23.597163 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:04:26.457315 master-0 kubenswrapper[8606]: I1204 22:04:26.457176 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/1.log" Dec 04 22:04:26.458177 master-0 kubenswrapper[8606]: I1204 22:04:26.458018 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/0.log" Dec 04 22:04:26.458177 master-0 kubenswrapper[8606]: I1204 22:04:26.458073 8606 generic.go:334] "Generic (PLEG): container finished" podID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" containerID="87db242867379a05a8e5e70ee5c05b6acdb8e3f23756e86287e28e91fa037302" exitCode=1 Dec 04 22:04:26.596313 master-0 kubenswrapper[8606]: I1204 22:04:26.596193 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 
22:04:26.596744 master-0 kubenswrapper[8606]: I1204 22:04:26.596398 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:04:27.416903 master-0 kubenswrapper[8606]: E1204 22:04:27.416814 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:29.471345 master-0 kubenswrapper[8606]: E1204 22:04:29.471236 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 04 22:04:29.472637 master-0 kubenswrapper[8606]: E1204 22:04:29.471582 8606 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.041s" Dec 04 22:04:29.480792 master-0 kubenswrapper[8606]: I1204 22:04:29.480670 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 04 22:04:29.597222 master-0 kubenswrapper[8606]: I1204 22:04:29.597104 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:04:29.597222 master-0 kubenswrapper[8606]: I1204 22:04:29.597215 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:04:30.410866 master-0 kubenswrapper[8606]: E1204 22:04:30.410623 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-storage-operator-dcf7fc84b-qmhlw.187e2232332a09d6 openshift-cluster-storage-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-storage-operator,Name:cluster-storage-operator-dcf7fc84b-qmhlw,UID:a043ea49-97f9-4ae6-83b9-733f12754d94,APIVersion:v1,ResourceVersion:8833,FieldPath:spec.containers{cluster-storage-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:97d26892192b552c16527bf2771e1b86528ab581a02dd9279cdf71c194830e3e\" in 8.711s (8.711s including waiting). 
Image size: 508042119 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:01:42.97791535 +0000 UTC m=+67.788217585,LastTimestamp:2025-12-04 22:01:42.97791535 +0000 UTC m=+67.788217585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:04:31.612116 master-0 kubenswrapper[8606]: I1204 22:04:31.612053 8606 patch_prober.go:28] interesting pod/controller-manager-86785576d9-t7jrz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" start-of-body= Dec 04 22:04:31.613162 master-0 kubenswrapper[8606]: I1204 22:04:31.612054 8606 patch_prober.go:28] interesting pod/controller-manager-86785576d9-t7jrz container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" start-of-body= Dec 04 22:04:31.613162 master-0 kubenswrapper[8606]: I1204 22:04:31.612844 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" Dec 04 22:04:31.613162 master-0 kubenswrapper[8606]: I1204 22:04:31.612762 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.44:8443/healthz\": dial tcp 10.128.0.44:8443: connect: connection refused" Dec 04 22:04:32.596991 master-0 kubenswrapper[8606]: I1204 22:04:32.596916 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Dec 04 22:04:32.597427 master-0 kubenswrapper[8606]: I1204 22:04:32.597378 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Dec 04 22:04:34.446893 master-0 kubenswrapper[8606]: E1204 22:04:34.446816 8606 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="4.975s" Dec 04 22:04:34.455614 master-0 kubenswrapper[8606]: W1204 22:04:34.454036 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0791dc66_67d9_42bd_b7c3_d45dc5513c3b.slice/crio-87f5905070547f303913fb037b80fa8791f83c49f484c86a7ba733227eaa9eb2 WatchSource:0}: Error finding container 87f5905070547f303913fb037b80fa8791f83c49f484c86a7ba733227eaa9eb2: Status 404 returned error can't find the container with id 87f5905070547f303913fb037b80fa8791f83c49f484c86a7ba733227eaa9eb2 Dec 04 22:04:34.457796 master-0 kubenswrapper[8606]: I1204 22:04:34.457729 8606 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 04 22:04:34.461140 master-0 kubenswrapper[8606]: I1204 22:04:34.461068 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:04:34.461242 master-0 kubenswrapper[8606]: I1204 22:04:34.461152 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6"} Dec 04 22:04:34.461242 master-0 kubenswrapper[8606]: I1204 22:04:34.461193 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:04:34.461242 master-0 kubenswrapper[8606]: I1204 22:04:34.461218 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:04:34.461448 master-0 kubenswrapper[8606]: I1204 22:04:34.461249 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:04:34.461448 master-0 kubenswrapper[8606]: I1204 22:04:34.461276 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" event={"ID":"871cb002-67f4-43aa-a41d-7a5b2f340059","Type":"ContainerDied","Data":"9d7fd4b64c7f9d10b43359385a6360e49aa71c5085c781ef53642cd82a85d004"} Dec 04 22:04:34.461448 master-0 kubenswrapper[8606]: I1204 22:04:34.461373 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" event={"ID":"e065179e-634a-4cbe-bb59-5b01c514e4de","Type":"ContainerDied","Data":"aec30a53010adc6ee6176e40e860c2639cbdf974b27b2d24e1d71f75f8a5c427"} Dec 04 22:04:34.461814 master-0 kubenswrapper[8606]: I1204 22:04:34.461718 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:04:34.461989 master-0 kubenswrapper[8606]: I1204 22:04:34.461933 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:04:34.462324 master-0 kubenswrapper[8606]: I1204 22:04:34.462185 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:04:34.462535 master-0 kubenswrapper[8606]: I1204 22:04:34.462361 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:04:34.462535 master-0 kubenswrapper[8606]: I1204 22:04:34.462310 8606 scope.go:117] "RemoveContainer" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" Dec 04 22:04:34.462894 master-0 kubenswrapper[8606]: I1204 22:04:34.462857 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:04:34.462964 master-0 kubenswrapper[8606]: E1204 22:04:34.462896 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 04 22:04:34.463381 master-0 kubenswrapper[8606]: I1204 22:04:34.463337 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:04:34.463381 master-0 kubenswrapper[8606]: I1204 22:04:34.463372 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:04:34.463772 master-0 kubenswrapper[8606]: I1204 22:04:34.463407 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:04:34.463772 master-0 kubenswrapper[8606]: I1204 22:04:34.463431 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:04:34.463772 master-0 kubenswrapper[8606]: I1204 22:04:34.463451 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" event={"ID":"56f25fad-089d-4df6-abb1-10d4c76750f1","Type":"ContainerDied","Data":"d8a2de466dc95e948ba536210f040992057ba7bc222a8102fb88249ab34f040a"} Dec 04 22:04:34.463772 master-0 kubenswrapper[8606]: I1204 22:04:34.463448 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:04:34.463772 master-0 kubenswrapper[8606]: I1204 22:04:34.463759 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.463802 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.463827 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.463849 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.463874 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerDied","Data":"5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd"} Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.463915 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.463938 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerDied","Data":"466a053aebc195d2f55d104f73cf9c35f09469c457c1576c051e6861f31f8a13"} Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.463962 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerDied","Data":"f701b6e27b366f9b3e2d799e563c87e892e7b625684a50d11abda6232179d479"} Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.463993 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerDied","Data":"49e5b6467d42b24a4142a36b3091700faf9ab3af4e0dd62b2e3ca1fd3da47a30"} Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.464042 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerDied","Data":"cb9981e4dfed9821dbae6b8b7a8e8e8f099f873bacacc6149961ccf58995e524"} Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.464077 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerDied","Data":"c77537fc4f2900520f8e93c8fc7a9508c178081936170d16a0dcd4122f2c7777"} Dec 04 22:04:34.464145 master-0 kubenswrapper[8606]: I1204 22:04:34.464149 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" 
event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerDied","Data":"a679264390b031ae4f297359e8c908ad01e2a92651d2cb70742a5a02fd398618"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464201 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerDied","Data":"93d74a7e351d1bb38ca66b99396fddaa338eac5fd2201ea238d97a8b16a1e1a0"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464235 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-7vlpp" event={"ID":"2d142201-6e77-4828-b86b-05d4144a2f08","Type":"ContainerDied","Data":"1129d1c5176ef3c828bda41dc553996cb75881e0c9229783b32fa908eaa25ec0"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464267 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerDied","Data":"9820de7c24faf6bdc5aac51f81548f854bf3fa05b1f8fd46fe8346195ddc8ca4"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464298 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerDied","Data":"0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464334 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464362 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba","Type":"ContainerDied","Data":"7f95f72da52c53d3c8d88cdae7b632b1e707bccffe42c9e45b84331a1108d0c6"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464390 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464418 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerDied","Data":"f32a0325771ce40043e2990b6e044b2e673986f92037baf7df71e61135c7bd82"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464448 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerDied","Data":"b24a52101599e57bc25b6c160a06c23124bc447eb919bdd2267f0b91d0f6aaee"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464480 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" 
event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerDied","Data":"3c8faa0cec9898a47039ead85f90eab240ebf83ecd040f53acd3c80c7bec151c"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464545 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464581 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerDied","Data":"bad146c5ce315f7f5070081135587df7a077e864def57e2c38a773560069cf17"} Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.464700 8606 scope.go:117] "RemoveContainer" containerID="0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6" Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.462023 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:04:34.465320 master-0 kubenswrapper[8606]: I1204 22:04:34.465163 8606 scope.go:117] "RemoveContainer" containerID="89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940" Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.465911 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.465941 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerDied","Data":"264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.465963 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerDied","Data":"037f05faa0b4635e20f5127ded6c5b63a2893aa9715387918fd80e11092dcfbb"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.465976 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerDied","Data":"5b7837bb8d893076191e798bbe6f7756d536495c527346610e4cc8ec29e29fe5"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.465989 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerStarted","Data":"3e05aae4893276e8afe69a36c63278b1771095172b32eb2660a3d4bf0f266404"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.466000 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"669bc75f8a8081f19feaad95999690146addc9aad7304006d9c07b88b80a73be"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.466002 8606 scope.go:117] 
"RemoveContainer" containerID="ab6c10b9a3e0637d5c7a14c6df7c632b34ad06eac467a51eec2ac60a0a5a71c4" Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.466010 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"87db242867379a05a8e5e70ee5c05b6acdb8e3f23756e86287e28e91fa037302"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.466160 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerStarted","Data":"fe4d171382cafc8367d04ba38562e53607a288ace82dafd0c07ed366d2a1ef56"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.466197 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" event={"ID":"871cb002-67f4-43aa-a41d-7a5b2f340059","Type":"ContainerStarted","Data":"59c9203f641c765d2eee366e0bf083a83f8954539e6ae9b99846d431ed362e41"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.466217 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"6d94211dde773ea9f092db6ee8825549019f84f69e33c14265dafd1be2140e92"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.466236 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerStarted","Data":"980d371480e7b1b6a921dafc58f3636bd451634500e5c1c642030a39e001a8a8"} Dec 04 22:04:34.466244 master-0 kubenswrapper[8606]: I1204 22:04:34.466255 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerStarted","Data":"255af5caa519126130f0822a951a744700ae3fbbe4597a788a35633eb402cf2a"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466274 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" event={"ID":"56f25fad-089d-4df6-abb1-10d4c76750f1","Type":"ContainerStarted","Data":"c9ce59505b093a4eba51c54c1e5c9ce08ff10211501d1a1158af9490fff34501"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466294 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerStarted","Data":"818c6aaeff094c3713e384dec4d55a28c1f228a4e98ba130afd94743d45d288f"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466313 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-7vlpp" event={"ID":"2d142201-6e77-4828-b86b-05d4144a2f08","Type":"ContainerStarted","Data":"c86201b566b2834b19c08527807fc66ebfecefd94445a119b31c2c29928e06b2"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466331 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" 
event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerStarted","Data":"a1cf3561aceffd368c0ca4d9cb40d000ada9182cc1eeba5246056ae555e8f11e"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466348 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerStarted","Data":"0e28b0cb43f14ec5571ccac1eddb3ba4fa32d2d2a461323c61e5fc655cd04d29"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466357 8606 scope.go:117] "RemoveContainer" containerID="87db242867379a05a8e5e70ee5c05b6acdb8e3f23756e86287e28e91fa037302" Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466367 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerStarted","Data":"437ce0db468372672eda2ac00bd5f2a8af4827f3a8e23b48967061bc95032bfb"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466386 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerStarted","Data":"43708b3c1f0fc23d49f5b68e72b2abf6c84e81e5b8fe673a0fccaff92e14b81d"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466405 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" event={"ID":"e065179e-634a-4cbe-bb59-5b01c514e4de","Type":"ContainerStarted","Data":"2a99a3b20bc07c50baf33232b49049d3fc9873a89ffad171bcaa3c8be2482524"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466425 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerStarted","Data":"419aa165bec1fc028d5393e03a6724568d6a2d80fb3a00accfe0a6d847f186e7"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466449 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerStarted","Data":"a2b4230e9757af974dea8f15ae2efeb4c125ff55b0c9cdc6e359f1aae71c8941"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466527 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerStarted","Data":"5553ac89bf95798ab2decbb87aaa4e3e8d835fcf542a4ada2023ac05d60471d4"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466603 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba","Type":"ContainerDied","Data":"ec44f98a134fed3f7d27e7c218ca88ef4cd2ac21b667420e0029267e424b27bd"} Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 22:04:34.466628 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec44f98a134fed3f7d27e7c218ca88ef4cd2ac21b667420e0029267e424b27bd" Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: I1204 
22:04:34.466664 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: E1204 22:04:34.466603 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:04:34.467163 master-0 kubenswrapper[8606]: E1204 22:04:34.466777 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68758cbcdb-fg6vx_openshift-config-operator(810c363b-a4c7-428d-a2fb-285adc29f477)\"" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.467853 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerDied","Data":"8235211a3898b8961786603441645f7da3fef63f8a04f95fcc274a44a7765453"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.467931 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.467954 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.467967 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerDied","Data":"ab6c10b9a3e0637d5c7a14c6df7c632b34ad06eac467a51eec2ac60a0a5a71c4"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.467982 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.467994 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.468005 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.468018 8606 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerDied","Data":"efec9b80d16091e3ba4473728d27aba3a23ca799a67ec448c19c49d6e7be1b22"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.468036 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" event={"ID":"c52111ac-30f6-47b7-a8ca-13659fbd71b4","Type":"ContainerDied","Data":"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.468054 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerDied","Data":"89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.468073 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.468093 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerDied","Data":"87db242867379a05a8e5e70ee5c05b6acdb8e3f23756e86287e28e91fa037302"} Dec 04 22:04:34.468597 master-0 kubenswrapper[8606]: I1204 22:04:34.468422 8606 scope.go:117] "RemoveContainer" containerID="efec9b80d16091e3ba4473728d27aba3a23ca799a67ec448c19c49d6e7be1b22" Dec 04 22:04:34.469356 master-0 kubenswrapper[8606]: I1204 22:04:34.469177 8606 scope.go:117] "RemoveContainer" containerID="8235211a3898b8961786603441645f7da3fef63f8a04f95fcc274a44a7765453" Dec 04 22:04:34.469741 master-0 kubenswrapper[8606]: I1204 22:04:34.469600 8606 scope.go:117] "RemoveContainer" containerID="ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6" Dec 04 22:04:34.469851 master-0 kubenswrapper[8606]: I1204 22:04:34.469813 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:04:34.470631 master-0 kubenswrapper[8606]: I1204 22:04:34.470585 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:04:34.473478 master-0 kubenswrapper[8606]: I1204 22:04:34.473412 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:04:34.492768 master-0 kubenswrapper[8606]: I1204 22:04:34.490733 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 04 22:04:34.492768 master-0 kubenswrapper[8606]: I1204 22:04:34.490807 8606 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="16f32872-7565-4e37-ba47-1a9338239818" Dec 04 22:04:34.492768 master-0 kubenswrapper[8606]: I1204 22:04:34.492688 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 04 22:04:34.500629 master-0 kubenswrapper[8606]: I1204 22:04:34.499726 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 04 22:04:34.500629 master-0 kubenswrapper[8606]: I1204 22:04:34.499769 8606 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="16f32872-7565-4e37-ba47-1a9338239818" Dec 04 22:04:34.500629 master-0 kubenswrapper[8606]: I1204 22:04:34.500045 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podStartSLOduration=165.708722069 podStartE2EDuration="3m2.500025332s" podCreationTimestamp="2025-12-04 22:01:32 +0000 UTC" firstStartedPulling="2025-12-04 22:01:33.595166325 +0000 UTC m=+58.405468540" lastFinishedPulling="2025-12-04 22:01:50.386469588 +0000 UTC m=+75.196771803" observedRunningTime="2025-12-04 22:04:34.446169928 +0000 UTC m=+239.256472163" watchObservedRunningTime="2025-12-04 22:04:34.500025332 +0000 UTC m=+239.310327557" Dec 04 22:04:34.513891 master-0 kubenswrapper[8606]: I1204 22:04:34.512822 8606 scope.go:117] "RemoveContainer" containerID="87db242867379a05a8e5e70ee5c05b6acdb8e3f23756e86287e28e91fa037302" Dec 04 22:04:34.513891 master-0 kubenswrapper[8606]: E1204 22:04:34.513063 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:04:34.513891 master-0 kubenswrapper[8606]: I1204 22:04:34.513385 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0791dc66-67d9-42bd-b7c3-d45dc5513c3b","Type":"ContainerStarted","Data":"87f5905070547f303913fb037b80fa8791f83c49f484c86a7ba733227eaa9eb2"} Dec 04 22:04:34.516876 master-0 kubenswrapper[8606]: I1204 22:04:34.516856 8606 scope.go:117] "RemoveContainer" containerID="89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940" Dec 04 22:04:34.517561 master-0 kubenswrapper[8606]: E1204 
22:04:34.517530 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68758cbcdb-fg6vx_openshift-config-operator(810c363b-a4c7-428d-a2fb-285adc29f477)\"" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" Dec 04 22:04:34.519172 master-0 kubenswrapper[8606]: I1204 22:04:34.519126 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:04:34.519534 master-0 kubenswrapper[8606]: I1204 22:04:34.519486 8606 scope.go:117] "RemoveContainer" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" Dec 04 22:04:34.520128 master-0 kubenswrapper[8606]: E1204 22:04:34.520103 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 04 22:04:34.520439 master-0 kubenswrapper[8606]: I1204 22:04:34.520407 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:04:34.552025 master-0 kubenswrapper[8606]: I1204 22:04:34.551954 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 04 22:04:34.554805 master-0 kubenswrapper[8606]: I1204 22:04:34.554752 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 04 22:04:34.568179 master-0 kubenswrapper[8606]: I1204 22:04:34.568078 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" podStartSLOduration=172.941959529 podStartE2EDuration="3m6.568052622s" podCreationTimestamp="2025-12-04 22:01:28 +0000 UTC" firstStartedPulling="2025-12-04 22:01:29.352721963 +0000 UTC m=+54.163024218" lastFinishedPulling="2025-12-04 22:01:42.978815096 +0000 UTC m=+67.789117311" observedRunningTime="2025-12-04 22:04:34.566029185 +0000 UTC m=+239.376331410" watchObservedRunningTime="2025-12-04 22:04:34.568052622 +0000 UTC m=+239.378354857" Dec 04 22:04:34.582346 master-0 kubenswrapper[8606]: I1204 22:04:34.581687 8606 scope.go:117] "RemoveContainer" containerID="7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e" Dec 04 22:04:34.599650 master-0 kubenswrapper[8606]: I1204 22:04:34.599560 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" podStartSLOduration=172.91246006 podStartE2EDuration="3m3.599526041s" podCreationTimestamp="2025-12-04 22:01:31 +0000 UTC" firstStartedPulling="2025-12-04 22:01:32.22397626 +0000 UTC m=+57.034278475" lastFinishedPulling="2025-12-04 22:01:42.911042241 +0000 UTC m=+67.721344456" observedRunningTime="2025-12-04 22:04:34.596378233 +0000 UTC m=+239.406680498" watchObservedRunningTime="2025-12-04 22:04:34.599526041 +0000 UTC m=+239.409828266" Dec 04 22:04:34.618572 master-0 kubenswrapper[8606]: I1204 
22:04:34.617373 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" podStartSLOduration=172.880848121 podStartE2EDuration="3m1.61735523s" podCreationTimestamp="2025-12-04 22:01:33 +0000 UTC" firstStartedPulling="2025-12-04 22:01:34.2413894 +0000 UTC m=+59.051691615" lastFinishedPulling="2025-12-04 22:01:42.977896499 +0000 UTC m=+67.788198724" observedRunningTime="2025-12-04 22:04:34.6138016 +0000 UTC m=+239.424103825" watchObservedRunningTime="2025-12-04 22:04:34.61735523 +0000 UTC m=+239.427657445" Dec 04 22:04:34.699631 master-0 kubenswrapper[8606]: I1204 22:04:34.698314 8606 scope.go:117] "RemoveContainer" containerID="56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9" Dec 04 22:04:34.729305 master-0 kubenswrapper[8606]: I1204 22:04:34.714760 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 04 22:04:34.765282 master-0 kubenswrapper[8606]: I1204 22:04:34.756923 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 04 22:04:34.783973 master-0 kubenswrapper[8606]: I1204 22:04:34.777522 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-55965856b6-7vlpp" podStartSLOduration=174.137064287 podStartE2EDuration="3m2.777480642s" podCreationTimestamp="2025-12-04 22:01:32 +0000 UTC" firstStartedPulling="2025-12-04 22:01:34.384804731 +0000 UTC m=+59.195106946" lastFinishedPulling="2025-12-04 22:01:43.025221086 +0000 UTC m=+67.835523301" observedRunningTime="2025-12-04 22:04:34.757965247 +0000 UTC m=+239.568267482" watchObservedRunningTime="2025-12-04 22:04:34.777480642 +0000 UTC m=+239.587782857" Dec 04 22:04:34.783973 master-0 kubenswrapper[8606]: I1204 22:04:34.782396 8606 scope.go:117] "RemoveContainer" containerID="467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b" Dec 04 22:04:34.810916 master-0 kubenswrapper[8606]: I1204 22:04:34.805325 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" podStartSLOduration=169.295393366 podStartE2EDuration="3m0.805295799s" podCreationTimestamp="2025-12-04 22:01:34 +0000 UTC" firstStartedPulling="2025-12-04 22:01:43.652176827 +0000 UTC m=+68.462479072" lastFinishedPulling="2025-12-04 22:01:55.16207929 +0000 UTC m=+79.972381505" observedRunningTime="2025-12-04 22:04:34.779720044 +0000 UTC m=+239.590022279" watchObservedRunningTime="2025-12-04 22:04:34.805295799 +0000 UTC m=+239.615598014" Dec 04 22:04:34.823463 master-0 kubenswrapper[8606]: I1204 22:04:34.823020 8606 scope.go:117] "RemoveContainer" containerID="2da555718ea10aaf4197144683ccb4702237b92306aae894f469e5c551742616" Dec 04 22:04:34.832389 master-0 kubenswrapper[8606]: I1204 22:04:34.827006 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" podStartSLOduration=177.826990744 podStartE2EDuration="2m57.826990744s" podCreationTimestamp="2025-12-04 22:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:04:34.825000369 +0000 UTC m=+239.635302604" watchObservedRunningTime="2025-12-04 22:04:34.826990744 +0000 UTC m=+239.637292949" Dec 04 22:04:34.850310 master-0 kubenswrapper[8606]: I1204 
22:04:34.847029 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" podStartSLOduration=174.509953919 podStartE2EDuration="3m3.847007264s" podCreationTimestamp="2025-12-04 22:01:31 +0000 UTC" firstStartedPulling="2025-12-04 22:01:33.595879154 +0000 UTC m=+58.406181369" lastFinishedPulling="2025-12-04 22:01:42.932932489 +0000 UTC m=+67.743234714" observedRunningTime="2025-12-04 22:04:34.844750641 +0000 UTC m=+239.655052876" watchObservedRunningTime="2025-12-04 22:04:34.847007264 +0000 UTC m=+239.657309489" Dec 04 22:04:34.867157 master-0 kubenswrapper[8606]: I1204 22:04:34.856625 8606 scope.go:117] "RemoveContainer" containerID="264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb" Dec 04 22:04:34.892386 master-0 kubenswrapper[8606]: I1204 22:04:34.888754 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" podStartSLOduration=174.17738402 podStartE2EDuration="3m2.888734259s" podCreationTimestamp="2025-12-04 22:01:32 +0000 UTC" firstStartedPulling="2025-12-04 22:01:34.26654808 +0000 UTC m=+59.076850295" lastFinishedPulling="2025-12-04 22:01:42.977898309 +0000 UTC m=+67.788200534" observedRunningTime="2025-12-04 22:04:34.888118741 +0000 UTC m=+239.698420976" watchObservedRunningTime="2025-12-04 22:04:34.888734259 +0000 UTC m=+239.699036474" Dec 04 22:04:34.947334 master-0 kubenswrapper[8606]: I1204 22:04:34.947288 8606 scope.go:117] "RemoveContainer" containerID="0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6" Dec 04 22:04:34.948519 master-0 kubenswrapper[8606]: E1204 22:04:34.948258 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6\": container with ID starting with 0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6 not found: ID does not exist" containerID="0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6" Dec 04 22:04:34.948519 master-0 kubenswrapper[8606]: I1204 22:04:34.948315 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6"} err="failed to get container status \"0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6\": rpc error: code = NotFound desc = could not find container \"0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6\": container with ID starting with 0aed7b10a4b07b9167d541a8aa970394ac82fce8c56451237381c4b6b3d33ce6 not found: ID does not exist" Dec 04 22:04:34.948519 master-0 kubenswrapper[8606]: I1204 22:04:34.948352 8606 scope.go:117] "RemoveContainer" containerID="7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e" Dec 04 22:04:34.948944 master-0 kubenswrapper[8606]: E1204 22:04:34.948890 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e\": container with ID starting with 7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e not found: ID does not exist" containerID="7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e" Dec 04 22:04:34.949028 master-0 kubenswrapper[8606]: I1204 22:04:34.948937 8606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e"} err="failed to get container status \"7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e\": rpc error: code = NotFound desc = could not find container \"7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e\": container with ID starting with 7a4f83298b0ef5a8ea0350bedbc36a1b867e926489a8dc4ab370f8fb0750986e not found: ID does not exist" Dec 04 22:04:34.949028 master-0 kubenswrapper[8606]: I1204 22:04:34.948963 8606 scope.go:117] "RemoveContainer" containerID="56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9" Dec 04 22:04:34.949536 master-0 kubenswrapper[8606]: E1204 22:04:34.949317 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9\": container with ID starting with 56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9 not found: ID does not exist" containerID="56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9" Dec 04 22:04:34.949536 master-0 kubenswrapper[8606]: I1204 22:04:34.949356 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9"} err="failed to get container status \"56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9\": rpc error: code = NotFound desc = could not find container \"56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9\": container with ID starting with 56b7dbd04a42157ae789a08a2c4007f65da7974bbd4d164fadbd9b7e896260a9 not found: ID does not exist" Dec 04 22:04:34.949536 master-0 kubenswrapper[8606]: I1204 22:04:34.949377 8606 scope.go:117] "RemoveContainer" containerID="467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b" Dec 04 22:04:34.949792 master-0 kubenswrapper[8606]: E1204 22:04:34.949664 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b\": container with ID starting with 467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b not found: ID does not exist" containerID="467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b" Dec 04 22:04:34.949792 master-0 kubenswrapper[8606]: I1204 22:04:34.949684 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b"} err="failed to get container status \"467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b\": rpc error: code = NotFound desc = could not find container \"467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b\": container with ID starting with 467ff1b3f3e046cb7c66e35fcf61a920df132d82f0e8aacc701b2420e9cdfb3b not found: ID does not exist" Dec 04 22:04:34.949792 master-0 kubenswrapper[8606]: I1204 22:04:34.949698 8606 scope.go:117] "RemoveContainer" containerID="264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb" Dec 04 22:04:34.951099 master-0 kubenswrapper[8606]: E1204 22:04:34.950154 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb\": container with ID starting with 
264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb not found: ID does not exist" containerID="264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb" Dec 04 22:04:34.951099 master-0 kubenswrapper[8606]: I1204 22:04:34.950182 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb"} err="failed to get container status \"264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb\": rpc error: code = NotFound desc = could not find container \"264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb\": container with ID starting with 264a9878bb2e4aa8c8b863bad4450177d987e394dadf0c3d03081485ebd374cb not found: ID does not exist" Dec 04 22:04:35.019209 master-0 kubenswrapper[8606]: I1204 22:04:35.018968 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sw6sx"] Dec 04 22:04:35.024290 master-0 kubenswrapper[8606]: W1204 22:04:35.024239 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29828f55_427b_4fe3_8713_03bcd6ac9dec.slice/crio-ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe WatchSource:0}: Error finding container ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe: Status 404 returned error can't find the container with id ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe Dec 04 22:04:35.033110 master-0 kubenswrapper[8606]: I1204 22:04:35.033069 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:04:35.146761 master-0 kubenswrapper[8606]: I1204 22:04:35.145262 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvkjf"] Dec 04 22:04:35.153190 master-0 kubenswrapper[8606]: I1204 22:04:35.152587 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn"] Dec 04 22:04:35.155162 master-0 kubenswrapper[8606]: W1204 22:04:35.155103 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bfb50b0_920e_4f85_a1ec_7b2ceaf89dae.slice/crio-3a3d3c261f26f43a69f444c9052f030961843ae0bbf89248b5cd01b597da7064 WatchSource:0}: Error finding container 3a3d3c261f26f43a69f444c9052f030961843ae0bbf89248b5cd01b597da7064: Status 404 returned error can't find the container with id 3a3d3c261f26f43a69f444c9052f030961843ae0bbf89248b5cd01b597da7064 Dec 04 22:04:35.157055 master-0 kubenswrapper[8606]: I1204 22:04:35.157020 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sdrkm"] Dec 04 22:04:35.167799 master-0 kubenswrapper[8606]: W1204 22:04:35.167740 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae107ad4_104c_4264_9844_afb3af28b19e.slice/crio-c9c2e653f3b9114eb61aa0f9377a8f57b59a4f433c04f9168d0a7788bc429f4a WatchSource:0}: Error finding container c9c2e653f3b9114eb61aa0f9377a8f57b59a4f433c04f9168d0a7788bc429f4a: Status 404 returned error can't find the container with id c9c2e653f3b9114eb61aa0f9377a8f57b59a4f433c04f9168d0a7788bc429f4a Dec 04 22:04:35.168114 master-0 kubenswrapper[8606]: W1204 22:04:35.168081 8606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc61ef71c_ad0f_41bc_b0ae_a3ee19696f9d.slice/crio-ebcf83d7998d4cc60b59e0a4ee1b7d80a2c88668db04583317334cbd65922154 WatchSource:0}: Error finding container ebcf83d7998d4cc60b59e0a4ee1b7d80a2c88668db04583317334cbd65922154: Status 404 returned error can't find the container with id ebcf83d7998d4cc60b59e0a4ee1b7d80a2c88668db04583317334cbd65922154 Dec 04 22:04:35.242314 master-0 kubenswrapper[8606]: E1204 22:04:35.242223 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:04:35.309141 master-0 kubenswrapper[8606]: I1204 22:04:35.309079 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zt44t"] Dec 04 22:04:35.329625 master-0 kubenswrapper[8606]: W1204 22:04:35.329582 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce6002bb_4948_45ab_bb1d_ed65e86b6466.slice/crio-60aac9ad737c32ff74467da7a19ec918b06f0d3f5c0137f4d12c177366392be7 WatchSource:0}: Error finding container 60aac9ad737c32ff74467da7a19ec918b06f0d3f5c0137f4d12c177366392be7: Status 404 returned error can't find the container with id 60aac9ad737c32ff74467da7a19ec918b06f0d3f5c0137f4d12c177366392be7 Dec 04 22:04:35.403605 master-0 kubenswrapper[8606]: I1204 22:04:35.403376 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" path="/var/lib/kubelet/pods/9fca9b57-0b34-46d4-9f3a-dbd4acd630f6/volumes" Dec 04 22:04:35.404014 master-0 kubenswrapper[8606]: I1204 22:04:35.403975 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b977a2cf-4e95-4456-957d-b1ba05c0d1ff" path="/var/lib/kubelet/pods/b977a2cf-4e95-4456-957d-b1ba05c0d1ff/volumes" Dec 04 22:04:35.524427 master-0 kubenswrapper[8606]: I1204 22:04:35.523889 8606 generic.go:334] "Generic (PLEG): container finished" podID="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" containerID="95541d3029d5588838c47cb8939ee7fe2e3c3f04da641f8f8e31b33c2e5cfb73" exitCode=0 Dec 04 22:04:35.524427 master-0 kubenswrapper[8606]: I1204 22:04:35.524001 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerDied","Data":"95541d3029d5588838c47cb8939ee7fe2e3c3f04da641f8f8e31b33c2e5cfb73"} Dec 04 22:04:35.524427 master-0 kubenswrapper[8606]: I1204 22:04:35.524048 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerStarted","Data":"3a3d3c261f26f43a69f444c9052f030961843ae0bbf89248b5cd01b597da7064"} Dec 04 22:04:35.528251 master-0 kubenswrapper[8606]: I1204 22:04:35.528196 8606 scope.go:117] "RemoveContainer" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" Dec 04 22:04:35.528553 master-0 kubenswrapper[8606]: E1204 22:04:35.528474 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 04 22:04:35.529182 master-0 kubenswrapper[8606]: I1204 22:04:35.529123 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerStarted","Data":"0323783c48e18783d0f18adc0e52bb623413c80d32bdfc761472fc94945f10bc"} Dec 04 22:04:35.529226 master-0 kubenswrapper[8606]: I1204 22:04:35.529186 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerStarted","Data":"60aac9ad737c32ff74467da7a19ec918b06f0d3f5c0137f4d12c177366392be7"} Dec 04 22:04:35.533304 master-0 kubenswrapper[8606]: I1204 22:04:35.531960 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerStarted","Data":"1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8"} Dec 04 22:04:35.533304 master-0 kubenswrapper[8606]: I1204 22:04:35.532203 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:04:35.534153 master-0 kubenswrapper[8606]: I1204 22:04:35.534088 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0791dc66-67d9-42bd-b7c3-d45dc5513c3b","Type":"ContainerStarted","Data":"f70a0cabfa84fd6dac7eab4d978f050d5e781f995d7f4f93a12a51cc9706d0d9"} Dec 04 22:04:35.536812 master-0 kubenswrapper[8606]: I1204 22:04:35.536751 8606 generic.go:334] "Generic (PLEG): container finished" podID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerID="5b29db78fe5a1942ea20ecc7d711d841b8eb39751995722550ca54e6750f1a0c" exitCode=0 Dec 04 22:04:35.536890 master-0 kubenswrapper[8606]: I1204 22:04:35.536854 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerDied","Data":"5b29db78fe5a1942ea20ecc7d711d841b8eb39751995722550ca54e6750f1a0c"} Dec 04 22:04:35.536931 master-0 kubenswrapper[8606]: I1204 22:04:35.536896 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerStarted","Data":"ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe"} Dec 04 22:04:35.538180 master-0 kubenswrapper[8606]: I1204 22:04:35.538127 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:04:35.543090 master-0 kubenswrapper[8606]: I1204 22:04:35.540170 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/3.log" Dec 04 22:04:35.543090 master-0 kubenswrapper[8606]: I1204 22:04:35.542782 8606 scope.go:117] "RemoveContainer" containerID="89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940" Dec 04 22:04:35.543090 master-0 kubenswrapper[8606]: E1204 22:04:35.542999 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 20s restarting 
failed container=openshift-config-operator pod=openshift-config-operator-68758cbcdb-fg6vx_openshift-config-operator(810c363b-a4c7-428d-a2fb-285adc29f477)\"" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" Dec 04 22:04:35.553602 master-0 kubenswrapper[8606]: I1204 22:04:35.548240 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-f797d8546-4g7dd_c52111ac-30f6-47b7-a8ca-13659fbd71b4/machine-approver-controller/0.log" Dec 04 22:04:35.553602 master-0 kubenswrapper[8606]: I1204 22:04:35.549004 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" event={"ID":"c52111ac-30f6-47b7-a8ca-13659fbd71b4","Type":"ContainerStarted","Data":"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f"} Dec 04 22:04:35.553602 master-0 kubenswrapper[8606]: I1204 22:04:35.552052 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-jb4xf_7f091088-2166-4026-9fa6-62bd83407edb/openshift-controller-manager-operator/1.log" Dec 04 22:04:35.554373 master-0 kubenswrapper[8606]: I1204 22:04:35.554324 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerStarted","Data":"54a32de727a29737d3f9e1ca99dbe42daef248c481ccfc250f9a1754750f20c0"} Dec 04 22:04:35.554373 master-0 kubenswrapper[8606]: I1204 22:04:35.554361 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerStarted","Data":"c9c2e653f3b9114eb61aa0f9377a8f57b59a4f433c04f9168d0a7788bc429f4a"} Dec 04 22:04:35.557918 master-0 kubenswrapper[8606]: I1204 22:04:35.557875 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/1.log" Dec 04 22:04:35.558281 master-0 kubenswrapper[8606]: I1204 22:04:35.558243 8606 scope.go:117] "RemoveContainer" containerID="87db242867379a05a8e5e70ee5c05b6acdb8e3f23756e86287e28e91fa037302" Dec 04 22:04:35.558548 master-0 kubenswrapper[8606]: E1204 22:04:35.558493 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:04:35.561218 master-0 kubenswrapper[8606]: I1204 22:04:35.561174 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" event={"ID":"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d","Type":"ContainerStarted","Data":"50d9f03783c661accee22d1e4308b7f9da15faf71fda445f1589dfc2e32aea11"} Dec 04 22:04:35.561218 master-0 kubenswrapper[8606]: I1204 22:04:35.561209 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" 
event={"ID":"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d","Type":"ContainerStarted","Data":"ebcf83d7998d4cc60b59e0a4ee1b7d80a2c88668db04583317334cbd65922154"} Dec 04 22:04:35.562392 master-0 kubenswrapper[8606]: I1204 22:04:35.561900 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:04:35.567542 master-0 kubenswrapper[8606]: I1204 22:04:35.567450 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerStarted","Data":"7d1cd57ab4d6b616f66a493d64b829158d3913e17906249bebc8057db4a21035"} Dec 04 22:04:35.574207 master-0 kubenswrapper[8606]: I1204 22:04:35.574148 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/0.log" Dec 04 22:04:35.574961 master-0 kubenswrapper[8606]: I1204 22:04:35.574404 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerStarted","Data":"893b1713daa62af184411a3e5a2cfc2bd5735ca25c31751314839bd533678913"} Dec 04 22:04:35.596226 master-0 kubenswrapper[8606]: I1204 22:04:35.595624 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:04:35.596226 master-0 kubenswrapper[8606]: I1204 22:04:35.595693 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:04:35.754018 master-0 kubenswrapper[8606]: I1204 22:04:35.753571 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podStartSLOduration=180.737431913 podStartE2EDuration="3m0.737431913s" podCreationTimestamp="2025-12-04 22:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:04:35.737202936 +0000 UTC m=+240.547505181" watchObservedRunningTime="2025-12-04 22:04:35.737431913 +0000 UTC m=+240.547734128" Dec 04 22:04:35.797243 master-0 kubenswrapper[8606]: I1204 22:04:35.797127 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=177.79708751 podStartE2EDuration="2m57.79708751s" podCreationTimestamp="2025-12-04 22:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:04:35.793290033 +0000 UTC m=+240.603592268" watchObservedRunningTime="2025-12-04 22:04:35.79708751 +0000 UTC m=+240.607389725" Dec 04 22:04:36.562163 master-0 kubenswrapper[8606]: I1204 22:04:36.562092 8606 patch_prober.go:28] interesting pod/packageserver-7b4bc6c685-l6dfn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:36.562694 master-0 kubenswrapper[8606]: I1204 22:04:36.562190 8606 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:36.583510 master-0 kubenswrapper[8606]: I1204 22:04:36.583449 8606 generic.go:334] "Generic (PLEG): container finished" podID="ae107ad4-104c-4264-9844-afb3af28b19e" containerID="54a32de727a29737d3f9e1ca99dbe42daef248c481ccfc250f9a1754750f20c0" exitCode=0 Dec 04 22:04:36.583612 master-0 kubenswrapper[8606]: I1204 22:04:36.583548 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerDied","Data":"54a32de727a29737d3f9e1ca99dbe42daef248c481ccfc250f9a1754750f20c0"} Dec 04 22:04:36.586355 master-0 kubenswrapper[8606]: I1204 22:04:36.586321 8606 generic.go:334] "Generic (PLEG): container finished" podID="ce6002bb-4948-45ab-bb1d-ed65e86b6466" containerID="0323783c48e18783d0f18adc0e52bb623413c80d32bdfc761472fc94945f10bc" exitCode=0 Dec 04 22:04:36.586487 master-0 kubenswrapper[8606]: I1204 22:04:36.586447 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerDied","Data":"0323783c48e18783d0f18adc0e52bb623413c80d32bdfc761472fc94945f10bc"} Dec 04 22:04:36.587480 master-0 kubenswrapper[8606]: I1204 22:04:36.587449 8606 scope.go:117] "RemoveContainer" containerID="89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940" Dec 04 22:04:36.587724 master-0 kubenswrapper[8606]: E1204 22:04:36.587697 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68758cbcdb-fg6vx_openshift-config-operator(810c363b-a4c7-428d-a2fb-285adc29f477)\"" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" Dec 04 22:04:36.588051 master-0 kubenswrapper[8606]: I1204 22:04:36.588008 8606 scope.go:117] "RemoveContainer" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" Dec 04 22:04:36.588371 master-0 kubenswrapper[8606]: E1204 22:04:36.588319 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 04 22:04:37.055472 master-0 kubenswrapper[8606]: I1204 22:04:37.055413 8606 patch_prober.go:28] interesting pod/packageserver-7b4bc6c685-l6dfn container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.60:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:37.055737 master-0 kubenswrapper[8606]: I1204 22:04:37.055519 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" 
podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.60:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:37.306451 master-0 kubenswrapper[8606]: I1204 22:04:37.306315 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:04:37.306667 master-0 kubenswrapper[8606]: I1204 22:04:37.306510 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:04:37.562737 master-0 kubenswrapper[8606]: I1204 22:04:37.562535 8606 patch_prober.go:28] interesting pod/packageserver-7b4bc6c685-l6dfn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:37.562737 master-0 kubenswrapper[8606]: I1204 22:04:37.562653 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:37.593317 master-0 kubenswrapper[8606]: I1204 22:04:37.593258 8606 scope.go:117] "RemoveContainer" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" Dec 04 22:04:37.593663 master-0 kubenswrapper[8606]: E1204 22:04:37.593630 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 04 22:04:37.594027 master-0 kubenswrapper[8606]: I1204 22:04:37.593974 8606 scope.go:117] "RemoveContainer" containerID="89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940" Dec 04 22:04:37.594317 master-0 kubenswrapper[8606]: E1204 22:04:37.594277 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68758cbcdb-fg6vx_openshift-config-operator(810c363b-a4c7-428d-a2fb-285adc29f477)\"" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" Dec 04 22:04:38.093618 master-0 kubenswrapper[8606]: I1204 22:04:38.093553 8606 patch_prober.go:28] interesting pod/machine-config-daemon-ppnv8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 22:04:38.093840 master-0 kubenswrapper[8606]: I1204 22:04:38.093655 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" podUID="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 22:04:38.439130 master-0 kubenswrapper[8606]: I1204 22:04:38.438997 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 04 22:04:38.565962 master-0 kubenswrapper[8606]: I1204 22:04:38.565821 8606 patch_prober.go:28] interesting pod/packageserver-7b4bc6c685-l6dfn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:38.566562 master-0 kubenswrapper[8606]: I1204 22:04:38.566026 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:39.567337 master-0 kubenswrapper[8606]: I1204 22:04:39.567221 8606 patch_prober.go:28] interesting pod/packageserver-7b4bc6c685-l6dfn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:39.567337 master-0 kubenswrapper[8606]: I1204 22:04:39.567294 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:40.699454 master-0 kubenswrapper[8606]: I1204 22:04:40.699293 8606 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-bm2pk container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:40.699454 master-0 kubenswrapper[8606]: I1204 22:04:40.699389 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" podUID="f893663c-7c1e-4eda-9839-99c1c0440304" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:41.503358 master-0 kubenswrapper[8606]: I1204 22:04:41.503275 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:04:41.504017 master-0 kubenswrapper[8606]: I1204 22:04:41.503968 8606 scope.go:117] "RemoveContainer" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" Dec 04 22:04:41.504249 master-0 kubenswrapper[8606]: E1204 22:04:41.504206 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 04 22:04:43.442536 master-0 kubenswrapper[8606]: I1204 22:04:43.439512 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 04 22:04:43.460470 master-0 kubenswrapper[8606]: I1204 22:04:43.460415 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 04 22:04:47.055746 master-0 kubenswrapper[8606]: I1204 22:04:47.055659 8606 patch_prober.go:28] interesting pod/packageserver-7b4bc6c685-l6dfn container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.60:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:47.056984 master-0 kubenswrapper[8606]: I1204 22:04:47.055765 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.60:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:47.056984 master-0 kubenswrapper[8606]: I1204 22:04:47.055718 8606 patch_prober.go:28] interesting pod/packageserver-7b4bc6c685-l6dfn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:47.056984 master-0 kubenswrapper[8606]: I1204 22:04:47.056660 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" podUID="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.60:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:47.477475 master-0 kubenswrapper[8606]: E1204 22:04:47.477258 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 04 22:04:47.679874 master-0 kubenswrapper[8606]: I1204 22:04:47.679805 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 04 22:04:49.392235 master-0 kubenswrapper[8606]: I1204 22:04:49.391842 8606 scope.go:117] "RemoveContainer" containerID="87db242867379a05a8e5e70ee5c05b6acdb8e3f23756e86287e28e91fa037302" Dec 04 22:04:50.698305 master-0 kubenswrapper[8606]: I1204 22:04:50.698204 8606 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-bm2pk container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:04:50.699338 master-0 kubenswrapper[8606]: I1204 
22:04:50.698298 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" podUID="f893663c-7c1e-4eda-9839-99c1c0440304" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:04:52.243536 master-0 kubenswrapper[8606]: E1204 22:04:52.243396 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:04:52.392680 master-0 kubenswrapper[8606]: I1204 22:04:52.392579 8606 scope.go:117] "RemoveContainer" containerID="89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940" Dec 04 22:04:53.391109 master-0 kubenswrapper[8606]: I1204 22:04:53.391056 8606 scope.go:117] "RemoveContainer" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" Dec 04 22:04:55.460807 master-0 kubenswrapper[8606]: E1204 22:04:55.460756 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" is forbidden: the server was unable to return a response in the time allotted, but may still be processing the request (get limitranges)" pod="openshift-etcd/etcd-master-0" Dec 04 22:04:56.059012 master-0 kubenswrapper[8606]: I1204 22:04:56.058968 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:04:56.731940 master-0 kubenswrapper[8606]: I1204 22:04:56.731843 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/3.log" Dec 04 22:04:56.732911 master-0 kubenswrapper[8606]: I1204 22:04:56.732433 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"cc16e00745629c42a3771375fb21884cb52774d57fe0324c10a8680dd9a3742b"} Dec 04 22:04:56.733177 master-0 kubenswrapper[8606]: I1204 22:04:56.733106 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:04:56.734755 master-0 kubenswrapper[8606]: I1204 22:04:56.734713 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerStarted","Data":"e77c322db09ee028391834636928860ad589dd50d5763a9eb98bf7d157a2104d"} Dec 04 22:04:56.737411 master-0 kubenswrapper[8606]: I1204 22:04:56.737337 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/1.log" Dec 04 22:04:56.737605 master-0 kubenswrapper[8606]: I1204 22:04:56.737539 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" 
event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"db1b72940eea381ffedc858f1def5527cda55e491f0234248167dff13f171166"} Dec 04 22:04:56.740722 master-0 kubenswrapper[8606]: I1204 22:04:56.740635 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerStarted","Data":"ebceb6eb636a1f740136f2a1db4a9178448d55ff6db47b35ebd00354ae58e8f7"} Dec 04 22:04:56.747449 master-0 kubenswrapper[8606]: I1204 22:04:56.747400 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"0a040d82f9bfd9a8d213b7bca90e959915daaafe371835a7acd200542911284e"} Dec 04 22:04:56.749670 master-0 kubenswrapper[8606]: I1204 22:04:56.749170 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerStarted","Data":"c2ad6d2719e3800fef2a35a9686c68acbf17ddb950d85a4469689ef746cce44d"} Dec 04 22:04:56.754036 master-0 kubenswrapper[8606]: I1204 22:04:56.753522 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerStarted","Data":"5f1c65cf31bac9c169d2527d0952f6b2fb651f148801aa43c79ceb4a8adb4da6"} Dec 04 22:04:57.306873 master-0 kubenswrapper[8606]: I1204 22:04:57.306809 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:04:57.660313 master-0 kubenswrapper[8606]: I1204 22:04:57.660204 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:04:57.773995 master-0 kubenswrapper[8606]: I1204 22:04:57.773853 8606 generic.go:334] "Generic (PLEG): container finished" podID="ae107ad4-104c-4264-9844-afb3af28b19e" containerID="e77c322db09ee028391834636928860ad589dd50d5763a9eb98bf7d157a2104d" exitCode=0 Dec 04 22:04:57.774609 master-0 kubenswrapper[8606]: I1204 22:04:57.774052 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerDied","Data":"e77c322db09ee028391834636928860ad589dd50d5763a9eb98bf7d157a2104d"} Dec 04 22:04:57.776867 master-0 kubenswrapper[8606]: I1204 22:04:57.776812 8606 generic.go:334] "Generic (PLEG): container finished" podID="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" containerID="ebceb6eb636a1f740136f2a1db4a9178448d55ff6db47b35ebd00354ae58e8f7" exitCode=0 Dec 04 22:04:57.776924 master-0 kubenswrapper[8606]: I1204 22:04:57.776888 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerDied","Data":"ebceb6eb636a1f740136f2a1db4a9178448d55ff6db47b35ebd00354ae58e8f7"} Dec 04 22:04:57.783003 master-0 kubenswrapper[8606]: I1204 22:04:57.782952 8606 generic.go:334] "Generic (PLEG): container finished" podID="ce6002bb-4948-45ab-bb1d-ed65e86b6466" containerID="c2ad6d2719e3800fef2a35a9686c68acbf17ddb950d85a4469689ef746cce44d" exitCode=0 Dec 04 22:04:57.783122 master-0 kubenswrapper[8606]: I1204 22:04:57.783063 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerDied","Data":"c2ad6d2719e3800fef2a35a9686c68acbf17ddb950d85a4469689ef746cce44d"} Dec 04 22:04:57.785447 master-0 kubenswrapper[8606]: I1204 22:04:57.785404 8606 generic.go:334] "Generic (PLEG): container finished" podID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerID="5f1c65cf31bac9c169d2527d0952f6b2fb651f148801aa43c79ceb4a8adb4da6" exitCode=0 Dec 04 22:04:57.786364 master-0 kubenswrapper[8606]: I1204 22:04:57.786139 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerDied","Data":"5f1c65cf31bac9c169d2527d0952f6b2fb651f148801aa43c79ceb4a8adb4da6"} Dec 04 22:04:57.787137 master-0 kubenswrapper[8606]: I1204 22:04:57.787101 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:04:58.794389 master-0 kubenswrapper[8606]: I1204 22:04:58.794302 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerStarted","Data":"8070413a1606f0293af50b080ba5194c2bb89b5ae8414595ea0e41476a830534"} Dec 04 22:04:58.797255 master-0 kubenswrapper[8606]: I1204 22:04:58.797226 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerStarted","Data":"f4890763f3394b36e511904c8ed5db27be23eefd277f0bd8a125d2e665ac4c24"} Dec 04 22:04:58.800358 master-0 kubenswrapper[8606]: I1204 22:04:58.800313 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerStarted","Data":"6ac117d3f888173d5f0f8aae01fddab59b22a163a9082dce4aa284a60ea267f0"} Dec 04 22:04:58.804338 master-0 kubenswrapper[8606]: I1204 22:04:58.803495 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerStarted","Data":"25be24c776edf99f501f87f528c64d0bdb9dfd3345a31d68783da8815130b293"} Dec 04 22:04:58.842075 master-0 kubenswrapper[8606]: I1204 22:04:58.841988 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vvkjf" podStartSLOduration=181.099133804 podStartE2EDuration="3m23.841968846s" podCreationTimestamp="2025-12-04 22:01:35 +0000 UTC" firstStartedPulling="2025-12-04 22:04:35.526423409 +0000 UTC m=+240.336725624" lastFinishedPulling="2025-12-04 22:04:58.269258451 +0000 UTC m=+263.079560666" observedRunningTime="2025-12-04 22:04:58.816345501 +0000 UTC m=+263.626647746" watchObservedRunningTime="2025-12-04 22:04:58.841968846 +0000 UTC m=+263.652271061" Dec 04 22:04:58.844486 master-0 kubenswrapper[8606]: I1204 22:04:58.844440 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sw6sx" podStartSLOduration=181.105181124 podStartE2EDuration="3m23.844429845s" podCreationTimestamp="2025-12-04 22:01:35 +0000 UTC" firstStartedPulling="2025-12-04 22:04:35.539037212 +0000 UTC m=+240.349339427" lastFinishedPulling="2025-12-04 22:04:58.278285903 +0000 UTC m=+263.088588148" observedRunningTime="2025-12-04 22:04:58.842426829 +0000 UTC 
m=+263.652729074" watchObservedRunningTime="2025-12-04 22:04:58.844429845 +0000 UTC m=+263.654732060" Dec 04 22:04:58.869982 master-0 kubenswrapper[8606]: I1204 22:04:58.869889 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sdrkm" podStartSLOduration=179.149514852 podStartE2EDuration="3m21.869870745s" podCreationTimestamp="2025-12-04 22:01:37 +0000 UTC" firstStartedPulling="2025-12-04 22:04:35.55543074 +0000 UTC m=+240.365732955" lastFinishedPulling="2025-12-04 22:04:58.275786593 +0000 UTC m=+263.086088848" observedRunningTime="2025-12-04 22:04:58.867417487 +0000 UTC m=+263.677719712" watchObservedRunningTime="2025-12-04 22:04:58.869870745 +0000 UTC m=+263.680172960" Dec 04 22:04:58.888796 master-0 kubenswrapper[8606]: I1204 22:04:58.888690 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zt44t" podStartSLOduration=178.145259925 podStartE2EDuration="3m20.888668561s" podCreationTimestamp="2025-12-04 22:01:38 +0000 UTC" firstStartedPulling="2025-12-04 22:04:35.531205913 +0000 UTC m=+240.341508128" lastFinishedPulling="2025-12-04 22:04:58.274614549 +0000 UTC m=+263.084916764" observedRunningTime="2025-12-04 22:04:58.887432506 +0000 UTC m=+263.697734711" watchObservedRunningTime="2025-12-04 22:04:58.888668561 +0000 UTC m=+263.698970776" Dec 04 22:05:00.004729 master-0 kubenswrapper[8606]: I1204 22:05:00.004049 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:05:00.004729 master-0 kubenswrapper[8606]: I1204 22:05:00.004191 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:05:00.597669 master-0 kubenswrapper[8606]: I1204 22:05:00.597559 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:05:00.597669 master-0 kubenswrapper[8606]: I1204 22:05:00.597661 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:05:00.598089 master-0 kubenswrapper[8606]: I1204 22:05:00.597560 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:05:00.598089 master-0 kubenswrapper[8606]: I1204 22:05:00.597798 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Dec 04 22:05:00.700244 master-0 kubenswrapper[8606]: I1204 22:05:00.700112 8606 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-bm2pk container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:05:00.700244 master-0 kubenswrapper[8606]: I1204 22:05:00.700221 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" podUID="f893663c-7c1e-4eda-9839-99c1c0440304" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:05:00.700802 master-0 kubenswrapper[8606]: I1204 22:05:00.700293 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:05:00.701025 master-0 kubenswrapper[8606]: I1204 22:05:00.700969 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"980d371480e7b1b6a921dafc58f3636bd451634500e5c1c642030a39e001a8a8"} pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Dec 04 22:05:00.701122 master-0 kubenswrapper[8606]: I1204 22:05:00.701030 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" podUID="f893663c-7c1e-4eda-9839-99c1c0440304" containerName="authentication-operator" containerID="cri-o://980d371480e7b1b6a921dafc58f3636bd451634500e5c1c642030a39e001a8a8" gracePeriod=30 Dec 04 22:05:01.043024 master-0 kubenswrapper[8606]: I1204 22:05:01.042957 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zt44t" podUID="ce6002bb-4948-45ab-bb1d-ed65e86b6466" containerName="registry-server" probeResult="failure" output=< Dec 04 22:05:01.043024 master-0 kubenswrapper[8606]: timeout: failed to connect service ":50051" within 1s Dec 04 22:05:01.043024 master-0 kubenswrapper[8606]: > Dec 04 22:05:01.826284 master-0 kubenswrapper[8606]: I1204 22:05:01.826230 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-bm2pk_f893663c-7c1e-4eda-9839-99c1c0440304/authentication-operator/1.log" Dec 04 22:05:01.826803 master-0 kubenswrapper[8606]: I1204 22:05:01.826753 8606 generic.go:334] "Generic (PLEG): container finished" podID="f893663c-7c1e-4eda-9839-99c1c0440304" containerID="980d371480e7b1b6a921dafc58f3636bd451634500e5c1c642030a39e001a8a8" exitCode=255 Dec 04 22:05:01.826862 master-0 kubenswrapper[8606]: I1204 22:05:01.826817 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerDied","Data":"980d371480e7b1b6a921dafc58f3636bd451634500e5c1c642030a39e001a8a8"} Dec 04 22:05:01.826897 master-0 kubenswrapper[8606]: I1204 22:05:01.826864 8606 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerStarted","Data":"4325f17544835c85c6a52e1b1f681caef960ed59819852b48ea3b6353d61e1b5"} Dec 04 22:05:01.826897 master-0 kubenswrapper[8606]: I1204 22:05:01.826890 8606 scope.go:117] "RemoveContainer" containerID="49e5b6467d42b24a4142a36b3091700faf9ab3af4e0dd62b2e3ca1fd3da47a30" Dec 04 22:05:02.630170 master-0 kubenswrapper[8606]: I1204 22:05:02.630087 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:05:02.835661 master-0 kubenswrapper[8606]: I1204 22:05:02.835481 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-bm2pk_f893663c-7c1e-4eda-9839-99c1c0440304/authentication-operator/1.log" Dec 04 22:05:06.012387 master-0 kubenswrapper[8606]: I1204 22:05:06.012213 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:05:06.012387 master-0 kubenswrapper[8606]: I1204 22:05:06.012272 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:05:06.080588 master-0 kubenswrapper[8606]: I1204 22:05:06.080468 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:05:06.218723 master-0 kubenswrapper[8606]: I1204 22:05:06.218290 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:05:06.218723 master-0 kubenswrapper[8606]: I1204 22:05:06.218368 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:05:06.290153 master-0 kubenswrapper[8606]: I1204 22:05:06.290067 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:05:06.922395 master-0 kubenswrapper[8606]: I1204 22:05:06.922314 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:05:06.935004 master-0 kubenswrapper[8606]: I1204 22:05:06.934946 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:05:07.111318 master-0 kubenswrapper[8606]: I1204 22:05:07.111274 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 04 22:05:07.597328 master-0 kubenswrapper[8606]: I1204 22:05:07.597169 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:05:07.597328 master-0 kubenswrapper[8606]: I1204 22:05:07.597291 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:05:07.640272 master-0 kubenswrapper[8606]: I1204 22:05:07.640219 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:05:07.931839 master-0 kubenswrapper[8606]: I1204 22:05:07.931728 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 
22:05:08.093337 master-0 kubenswrapper[8606]: I1204 22:05:08.093256 8606 patch_prober.go:28] interesting pod/machine-config-daemon-ppnv8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 22:05:08.093644 master-0 kubenswrapper[8606]: I1204 22:05:08.093332 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" podUID="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 22:05:08.094033 master-0 kubenswrapper[8606]: I1204 22:05:08.093961 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:05:08.095964 master-0 kubenswrapper[8606]: I1204 22:05:08.095900 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"446458854f272c65918d3eef29e63c52aea4a45ba36f434208f291e2e3410da7"} pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 22:05:08.096176 master-0 kubenswrapper[8606]: I1204 22:05:08.096068 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" podUID="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" containerName="machine-config-daemon" containerID="cri-o://446458854f272c65918d3eef29e63c52aea4a45ba36f434208f291e2e3410da7" gracePeriod=600 Dec 04 22:05:10.072229 master-0 kubenswrapper[8606]: I1204 22:05:10.072145 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:05:10.141974 master-0 kubenswrapper[8606]: I1204 22:05:10.141878 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:05:10.896137 master-0 kubenswrapper[8606]: I1204 22:05:10.896033 8606 generic.go:334] "Generic (PLEG): container finished" podID="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" containerID="446458854f272c65918d3eef29e63c52aea4a45ba36f434208f291e2e3410da7" exitCode=0 Dec 04 22:05:10.896486 master-0 kubenswrapper[8606]: I1204 22:05:10.896159 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerDied","Data":"446458854f272c65918d3eef29e63c52aea4a45ba36f434208f291e2e3410da7"} Dec 04 22:05:11.510296 master-0 kubenswrapper[8606]: I1204 22:05:11.510216 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:05:12.915467 master-0 kubenswrapper[8606]: I1204 22:05:12.915350 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerStarted","Data":"1a4d2b917d0f536f861e86755d3bf6744689e0554629cdb4b05a8419c9269007"} Dec 04 22:05:13.799245 master-0 kubenswrapper[8606]: I1204 22:05:13.799156 8606 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 04 22:05:13.799704 master-0 kubenswrapper[8606]: I1204 22:05:13.799639 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" containerID="cri-o://70ec2f528f522213daf96bac275fda7cf7f15b026ed56e4b58dab19aaca3bd29" gracePeriod=30 Dec 04 22:05:13.799843 master-0 kubenswrapper[8606]: I1204 22:05:13.799744 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" containerID="cri-o://0a040d82f9bfd9a8d213b7bca90e959915daaafe371835a7acd200542911284e" gracePeriod=30 Dec 04 22:05:13.802713 master-0 kubenswrapper[8606]: I1204 22:05:13.802047 8606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:05:13.802833 master-0 kubenswrapper[8606]: E1204 22:05:13.802759 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9fbd90-66d5-4637-9821-22242aa6f6d7" containerName="installer" Dec 04 22:05:13.802833 master-0 kubenswrapper[8606]: I1204 22:05:13.802827 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9fbd90-66d5-4637-9821-22242aa6f6d7" containerName="installer" Dec 04 22:05:13.802958 master-0 kubenswrapper[8606]: E1204 22:05:13.802861 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.802958 master-0 kubenswrapper[8606]: I1204 22:05:13.802877 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.802958 master-0 kubenswrapper[8606]: E1204 22:05:13.802939 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b977a2cf-4e95-4456-957d-b1ba05c0d1ff" containerName="installer" Dec 04 22:05:13.802958 master-0 kubenswrapper[8606]: I1204 22:05:13.802956 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="b977a2cf-4e95-4456-957d-b1ba05c0d1ff" containerName="installer" Dec 04 22:05:13.803161 master-0 kubenswrapper[8606]: E1204 22:05:13.803024 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.803161 master-0 kubenswrapper[8606]: I1204 22:05:13.803042 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.803161 master-0 kubenswrapper[8606]: E1204 22:05:13.803062 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" containerName="installer" Dec 04 22:05:13.803161 master-0 kubenswrapper[8606]: I1204 22:05:13.803115 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" containerName="installer" Dec 04 22:05:13.803161 master-0 kubenswrapper[8606]: E1204 22:05:13.803136 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.803401 master-0 kubenswrapper[8606]: I1204 22:05:13.803151 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" 
containerName="kube-controller-manager" Dec 04 22:05:13.803401 master-0 kubenswrapper[8606]: E1204 22:05:13.803210 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.803401 master-0 kubenswrapper[8606]: I1204 22:05:13.803225 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.803401 master-0 kubenswrapper[8606]: E1204 22:05:13.803249 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 04 22:05:13.803401 master-0 kubenswrapper[8606]: I1204 22:05:13.803316 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: E1204 22:05:13.803412 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" containerName="installer" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.803436 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" containerName="installer" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: E1204 22:05:13.803580 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" containerName="installer" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.803601 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" containerName="installer" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.803954 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fca9b57-0b34-46d4-9f3a-dbd4acd630f6" containerName="installer" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.803986 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804047 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="b977a2cf-4e95-4456-957d-b1ba05c0d1ff" containerName="installer" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804069 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" containerName="installer" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804092 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804157 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804177 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804234 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" containerName="installer" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804253 8606 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804268 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.804333 master-0 kubenswrapper[8606]: I1204 22:05:13.804326 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9fbd90-66d5-4637-9821-22242aa6f6d7" containerName="installer" Dec 04 22:05:13.804985 master-0 kubenswrapper[8606]: E1204 22:05:13.804720 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.804985 master-0 kubenswrapper[8606]: I1204 22:05:13.804742 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 04 22:05:13.809025 master-0 kubenswrapper[8606]: I1204 22:05:13.808034 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:13.925447 master-0 kubenswrapper[8606]: I1204 22:05:13.925366 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"fad55397ac8e23f218f25cb714ea5b2b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:13.926252 master-0 kubenswrapper[8606]: I1204 22:05:13.925494 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"fad55397ac8e23f218f25cb714ea5b2b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:14.027089 master-0 kubenswrapper[8606]: I1204 22:05:14.027017 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"fad55397ac8e23f218f25cb714ea5b2b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:14.027415 master-0 kubenswrapper[8606]: I1204 22:05:14.027192 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"fad55397ac8e23f218f25cb714ea5b2b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:14.027685 master-0 kubenswrapper[8606]: I1204 22:05:14.027541 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"fad55397ac8e23f218f25cb714ea5b2b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:14.027908 master-0 kubenswrapper[8606]: I1204 22:05:14.027773 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-cert-dir\") pod 
\"kube-controller-manager-master-0\" (UID: \"fad55397ac8e23f218f25cb714ea5b2b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:14.183093 master-0 kubenswrapper[8606]: I1204 22:05:14.183035 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:14.202292 master-0 kubenswrapper[8606]: I1204 22:05:14.202244 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:05:14.230377 master-0 kubenswrapper[8606]: I1204 22:05:14.229189 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=7.229168761 podStartE2EDuration="7.229168761s" podCreationTimestamp="2025-12-04 22:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:05:14.227659078 +0000 UTC m=+279.037961373" watchObservedRunningTime="2025-12-04 22:05:14.229168761 +0000 UTC m=+279.039470986" Dec 04 22:05:14.373119 master-0 kubenswrapper[8606]: I1204 22:05:14.373055 8606 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="0a9e9530-6453-42c4-aac7-1814d6122b17" Dec 04 22:05:14.517401 master-0 kubenswrapper[8606]: I1204 22:05:14.517327 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:05:14.549818 master-0 kubenswrapper[8606]: I1204 22:05:14.549762 8606 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="0a9e9530-6453-42c4-aac7-1814d6122b17" Dec 04 22:05:14.638490 master-0 kubenswrapper[8606]: I1204 22:05:14.638434 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 04 22:05:14.638726 master-0 kubenswrapper[8606]: I1204 22:05:14.638535 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 04 22:05:14.638726 master-0 kubenswrapper[8606]: I1204 22:05:14.638564 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 04 22:05:14.638726 master-0 kubenswrapper[8606]: I1204 22:05:14.638608 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 04 22:05:14.638726 master-0 kubenswrapper[8606]: I1204 22:05:14.638689 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod 
\"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 04 22:05:14.638973 master-0 kubenswrapper[8606]: I1204 22:05:14.638936 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:05:14.639048 master-0 kubenswrapper[8606]: I1204 22:05:14.639013 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs" (OuterVolumeSpecName: "logs") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:05:14.639092 master-0 kubenswrapper[8606]: I1204 22:05:14.638998 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config" (OuterVolumeSpecName: "config") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:05:14.639092 master-0 kubenswrapper[8606]: I1204 22:05:14.639075 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:05:14.639191 master-0 kubenswrapper[8606]: I1204 22:05:14.638977 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets" (OuterVolumeSpecName: "secrets") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:05:14.740294 master-0 kubenswrapper[8606]: I1204 22:05:14.740212 8606 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:14.740294 master-0 kubenswrapper[8606]: I1204 22:05:14.740238 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:14.740294 master-0 kubenswrapper[8606]: I1204 22:05:14.740250 8606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:14.740294 master-0 kubenswrapper[8606]: I1204 22:05:14.740258 8606 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:14.740294 master-0 kubenswrapper[8606]: I1204 22:05:14.740267 8606 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:14.933213 master-0 kubenswrapper[8606]: I1204 22:05:14.933007 8606 generic.go:334] "Generic (PLEG): container finished" podID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" containerID="f70a0cabfa84fd6dac7eab4d978f050d5e781f995d7f4f93a12a51cc9706d0d9" exitCode=0 Dec 04 22:05:14.933213 master-0 kubenswrapper[8606]: I1204 22:05:14.933123 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0791dc66-67d9-42bd-b7c3-d45dc5513c3b","Type":"ContainerDied","Data":"f70a0cabfa84fd6dac7eab4d978f050d5e781f995d7f4f93a12a51cc9706d0d9"} Dec 04 22:05:14.937288 master-0 kubenswrapper[8606]: I1204 22:05:14.937044 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"90abdb91a6f0572c29e8db6a5253191316a65911b9cf466a77844b0b5c6a021d"} Dec 04 22:05:14.937288 master-0 kubenswrapper[8606]: I1204 22:05:14.937074 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5"} Dec 04 22:05:14.937288 master-0 kubenswrapper[8606]: I1204 22:05:14.937087 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"9ccb30a6243f7a894b4b3551e9274749c93d506bae1a70db70653ccadedbb5f2"} Dec 04 22:05:14.941617 master-0 kubenswrapper[8606]: I1204 22:05:14.941540 8606 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="0a040d82f9bfd9a8d213b7bca90e959915daaafe371835a7acd200542911284e" exitCode=0 Dec 04 22:05:14.941617 master-0 kubenswrapper[8606]: I1204 22:05:14.941602 8606 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" 
containerID="70ec2f528f522213daf96bac275fda7cf7f15b026ed56e4b58dab19aaca3bd29" exitCode=0 Dec 04 22:05:14.941784 master-0 kubenswrapper[8606]: I1204 22:05:14.941629 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7b0f16a53cc394a75f1d385a9c55dea4c65ab334eb9a3cd2bbdaa30b3396154" Dec 04 22:05:14.941784 master-0 kubenswrapper[8606]: I1204 22:05:14.941668 8606 scope.go:117] "RemoveContainer" containerID="e920fbcfee2c4b46f21096760788881d27cd16941b74072b602d714de3e9a0e4" Dec 04 22:05:14.941784 master-0 kubenswrapper[8606]: I1204 22:05:14.941684 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 04 22:05:15.403359 master-0 kubenswrapper[8606]: I1204 22:05:15.403280 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b47694fcc32464ab24d09c23d6efb57" path="/var/lib/kubelet/pods/8b47694fcc32464ab24d09c23d6efb57/volumes" Dec 04 22:05:15.403890 master-0 kubenswrapper[8606]: I1204 22:05:15.403800 8606 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Dec 04 22:05:15.431641 master-0 kubenswrapper[8606]: I1204 22:05:15.430559 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 04 22:05:15.431641 master-0 kubenswrapper[8606]: I1204 22:05:15.430644 8606 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="0a9e9530-6453-42c4-aac7-1814d6122b17" Dec 04 22:05:15.441860 master-0 kubenswrapper[8606]: I1204 22:05:15.441770 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 04 22:05:15.441860 master-0 kubenswrapper[8606]: I1204 22:05:15.441841 8606 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="0a9e9530-6453-42c4-aac7-1814d6122b17" Dec 04 22:05:15.955116 master-0 kubenswrapper[8606]: I1204 22:05:15.954995 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f"} Dec 04 22:05:15.956118 master-0 kubenswrapper[8606]: I1204 22:05:15.955145 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7"} Dec 04 22:05:16.358362 master-0 kubenswrapper[8606]: I1204 22:05:16.358323 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:05:16.386852 master-0 kubenswrapper[8606]: I1204 22:05:16.386773 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.386752652 podStartE2EDuration="2.386752652s" podCreationTimestamp="2025-12-04 22:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:05:15.990105093 +0000 UTC m=+280.800407348" watchObservedRunningTime="2025-12-04 22:05:16.386752652 +0000 UTC m=+281.197054887" Dec 04 22:05:16.468731 master-0 kubenswrapper[8606]: I1204 22:05:16.468676 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-var-lock\") pod \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " Dec 04 22:05:16.468731 master-0 kubenswrapper[8606]: I1204 22:05:16.468728 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kubelet-dir\") pod \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " Dec 04 22:05:16.469003 master-0 kubenswrapper[8606]: I1204 22:05:16.468766 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kube-api-access\") pod \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\" (UID: \"0791dc66-67d9-42bd-b7c3-d45dc5513c3b\") " Dec 04 22:05:16.469003 master-0 kubenswrapper[8606]: I1204 22:05:16.468805 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-var-lock" (OuterVolumeSpecName: "var-lock") pod "0791dc66-67d9-42bd-b7c3-d45dc5513c3b" (UID: "0791dc66-67d9-42bd-b7c3-d45dc5513c3b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:05:16.469003 master-0 kubenswrapper[8606]: I1204 22:05:16.468880 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0791dc66-67d9-42bd-b7c3-d45dc5513c3b" (UID: "0791dc66-67d9-42bd-b7c3-d45dc5513c3b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:05:16.469559 master-0 kubenswrapper[8606]: I1204 22:05:16.469328 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:16.469559 master-0 kubenswrapper[8606]: I1204 22:05:16.469378 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:16.472400 master-0 kubenswrapper[8606]: I1204 22:05:16.472337 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0791dc66-67d9-42bd-b7c3-d45dc5513c3b" (UID: "0791dc66-67d9-42bd-b7c3-d45dc5513c3b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:05:16.570815 master-0 kubenswrapper[8606]: I1204 22:05:16.570666 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0791dc66-67d9-42bd-b7c3-d45dc5513c3b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:16.969289 master-0 kubenswrapper[8606]: I1204 22:05:16.969133 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0791dc66-67d9-42bd-b7c3-d45dc5513c3b","Type":"ContainerDied","Data":"87f5905070547f303913fb037b80fa8791f83c49f484c86a7ba733227eaa9eb2"} Dec 04 22:05:16.969289 master-0 kubenswrapper[8606]: I1204 22:05:16.969196 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f5905070547f303913fb037b80fa8791f83c49f484c86a7ba733227eaa9eb2" Dec 04 22:05:16.969289 master-0 kubenswrapper[8606]: I1204 22:05:16.969198 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:05:24.183572 master-0 kubenswrapper[8606]: I1204 22:05:24.183476 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:24.184300 master-0 kubenswrapper[8606]: I1204 22:05:24.183592 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:24.184300 master-0 kubenswrapper[8606]: I1204 22:05:24.183616 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:24.184618 master-0 kubenswrapper[8606]: I1204 22:05:24.184566 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:24.188706 master-0 kubenswrapper[8606]: I1204 22:05:24.188654 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:24.191901 master-0 kubenswrapper[8606]: I1204 22:05:24.191859 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:25.032538 master-0 kubenswrapper[8606]: I1204 22:05:25.032454 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:25.034470 master-0 kubenswrapper[8606]: I1204 22:05:25.034415 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:05:34.257528 master-0 kubenswrapper[8606]: I1204 22:05:34.256620 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd"] Dec 04 22:05:34.257528 master-0 kubenswrapper[8606]: I1204 22:05:34.256988 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="kube-rbac-proxy" containerID="cri-o://ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663" gracePeriod=30 Dec 04 22:05:34.257528 master-0 kubenswrapper[8606]: I1204 22:05:34.257493 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="machine-approver-controller" containerID="cri-o://578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f" gracePeriod=30 Dec 04 22:05:34.263088 master-0 kubenswrapper[8606]: I1204 22:05:34.259805 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p"] Dec 04 22:05:34.263088 master-0 kubenswrapper[8606]: I1204 22:05:34.260145 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="cluster-cloud-controller-manager" 
containerID="cri-o://f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228" gracePeriod=30 Dec 04 22:05:34.263088 master-0 kubenswrapper[8606]: I1204 22:05:34.260319 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="kube-rbac-proxy" containerID="cri-o://0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82" gracePeriod=30 Dec 04 22:05:34.263088 master-0 kubenswrapper[8606]: I1204 22:05:34.260380 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="config-sync-controllers" containerID="cri-o://1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62" gracePeriod=30 Dec 04 22:05:34.526712 master-0 kubenswrapper[8606]: I1204 22:05:34.526043 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-f797d8546-4g7dd_c52111ac-30f6-47b7-a8ca-13659fbd71b4/machine-approver-controller/0.log" Dec 04 22:05:34.526712 master-0 kubenswrapper[8606]: I1204 22:05:34.526425 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:05:34.530838 master-0 kubenswrapper[8606]: I1204 22:05:34.530805 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:05:34.636714 master-0 kubenswrapper[8606]: I1204 22:05:34.636656 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcdst\" (UniqueName: \"kubernetes.io/projected/c52111ac-30f6-47b7-a8ca-13659fbd71b4-kube-api-access-rcdst\") pod \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " Dec 04 22:05:34.636714 master-0 kubenswrapper[8606]: I1204 22:05:34.636725 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-auth-proxy-config\") pod \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " Dec 04 22:05:34.636950 master-0 kubenswrapper[8606]: I1204 22:05:34.636751 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-auth-proxy-config\") pod \"3d178411-75a4-4ff8-9764-6f3e3944eca4\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " Dec 04 22:05:34.636950 master-0 kubenswrapper[8606]: I1204 22:05:34.636775 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c52111ac-30f6-47b7-a8ca-13659fbd71b4-machine-approver-tls\") pod \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " Dec 04 22:05:34.636950 master-0 kubenswrapper[8606]: I1204 22:05:34.636801 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3d178411-75a4-4ff8-9764-6f3e3944eca4-host-etc-kube\") pod 
\"3d178411-75a4-4ff8-9764-6f3e3944eca4\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " Dec 04 22:05:34.636950 master-0 kubenswrapper[8606]: I1204 22:05:34.636818 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwfwd\" (UniqueName: \"kubernetes.io/projected/3d178411-75a4-4ff8-9764-6f3e3944eca4-kube-api-access-jwfwd\") pod \"3d178411-75a4-4ff8-9764-6f3e3944eca4\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " Dec 04 22:05:34.636950 master-0 kubenswrapper[8606]: I1204 22:05:34.636865 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-images\") pod \"3d178411-75a4-4ff8-9764-6f3e3944eca4\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " Dec 04 22:05:34.636950 master-0 kubenswrapper[8606]: I1204 22:05:34.636899 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-config\") pod \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\" (UID: \"c52111ac-30f6-47b7-a8ca-13659fbd71b4\") " Dec 04 22:05:34.637110 master-0 kubenswrapper[8606]: I1204 22:05:34.636966 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d178411-75a4-4ff8-9764-6f3e3944eca4-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "3d178411-75a4-4ff8-9764-6f3e3944eca4" (UID: "3d178411-75a4-4ff8-9764-6f3e3944eca4"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:05:34.637148 master-0 kubenswrapper[8606]: I1204 22:05:34.637112 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d178411-75a4-4ff8-9764-6f3e3944eca4-cloud-controller-manager-operator-tls\") pod \"3d178411-75a4-4ff8-9764-6f3e3944eca4\" (UID: \"3d178411-75a4-4ff8-9764-6f3e3944eca4\") " Dec 04 22:05:34.637291 master-0 kubenswrapper[8606]: I1204 22:05:34.637259 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "3d178411-75a4-4ff8-9764-6f3e3944eca4" (UID: "3d178411-75a4-4ff8-9764-6f3e3944eca4"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:05:34.637337 master-0 kubenswrapper[8606]: I1204 22:05:34.637300 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-config" (OuterVolumeSpecName: "config") pod "c52111ac-30f6-47b7-a8ca-13659fbd71b4" (UID: "c52111ac-30f6-47b7-a8ca-13659fbd71b4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:05:34.637438 master-0 kubenswrapper[8606]: I1204 22:05:34.637416 8606 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:34.637438 master-0 kubenswrapper[8606]: I1204 22:05:34.637434 8606 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/3d178411-75a4-4ff8-9764-6f3e3944eca4-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:34.637550 master-0 kubenswrapper[8606]: I1204 22:05:34.637446 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:34.637653 master-0 kubenswrapper[8606]: I1204 22:05:34.637583 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-images" (OuterVolumeSpecName: "images") pod "3d178411-75a4-4ff8-9764-6f3e3944eca4" (UID: "3d178411-75a4-4ff8-9764-6f3e3944eca4"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:05:34.638321 master-0 kubenswrapper[8606]: I1204 22:05:34.638270 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "c52111ac-30f6-47b7-a8ca-13659fbd71b4" (UID: "c52111ac-30f6-47b7-a8ca-13659fbd71b4"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:05:34.640422 master-0 kubenswrapper[8606]: I1204 22:05:34.640369 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c52111ac-30f6-47b7-a8ca-13659fbd71b4-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "c52111ac-30f6-47b7-a8ca-13659fbd71b4" (UID: "c52111ac-30f6-47b7-a8ca-13659fbd71b4"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:05:34.640607 master-0 kubenswrapper[8606]: I1204 22:05:34.640579 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d178411-75a4-4ff8-9764-6f3e3944eca4-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "3d178411-75a4-4ff8-9764-6f3e3944eca4" (UID: "3d178411-75a4-4ff8-9764-6f3e3944eca4"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:05:34.640728 master-0 kubenswrapper[8606]: I1204 22:05:34.640663 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d178411-75a4-4ff8-9764-6f3e3944eca4-kube-api-access-jwfwd" (OuterVolumeSpecName: "kube-api-access-jwfwd") pod "3d178411-75a4-4ff8-9764-6f3e3944eca4" (UID: "3d178411-75a4-4ff8-9764-6f3e3944eca4"). InnerVolumeSpecName "kube-api-access-jwfwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:05:34.640862 master-0 kubenswrapper[8606]: I1204 22:05:34.640839 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52111ac-30f6-47b7-a8ca-13659fbd71b4-kube-api-access-rcdst" (OuterVolumeSpecName: "kube-api-access-rcdst") pod "c52111ac-30f6-47b7-a8ca-13659fbd71b4" (UID: "c52111ac-30f6-47b7-a8ca-13659fbd71b4"). InnerVolumeSpecName "kube-api-access-rcdst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:05:34.739413 master-0 kubenswrapper[8606]: I1204 22:05:34.739307 8606 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d178411-75a4-4ff8-9764-6f3e3944eca4-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:34.739413 master-0 kubenswrapper[8606]: I1204 22:05:34.739381 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcdst\" (UniqueName: \"kubernetes.io/projected/c52111ac-30f6-47b7-a8ca-13659fbd71b4-kube-api-access-rcdst\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:34.739413 master-0 kubenswrapper[8606]: I1204 22:05:34.739405 8606 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c52111ac-30f6-47b7-a8ca-13659fbd71b4-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:34.740307 master-0 kubenswrapper[8606]: I1204 22:05:34.740260 8606 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c52111ac-30f6-47b7-a8ca-13659fbd71b4-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:34.740362 master-0 kubenswrapper[8606]: I1204 22:05:34.740309 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwfwd\" (UniqueName: \"kubernetes.io/projected/3d178411-75a4-4ff8-9764-6f3e3944eca4-kube-api-access-jwfwd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:34.740406 master-0 kubenswrapper[8606]: I1204 22:05:34.740364 8606 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3d178411-75a4-4ff8-9764-6f3e3944eca4-images\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:35.112308 master-0 kubenswrapper[8606]: I1204 22:05:35.112205 8606 generic.go:334] "Generic (PLEG): container finished" podID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerID="0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82" exitCode=0 Dec 04 22:05:35.112308 master-0 kubenswrapper[8606]: I1204 22:05:35.112277 8606 generic.go:334] "Generic (PLEG): container finished" podID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerID="1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62" exitCode=0 Dec 04 22:05:35.112308 master-0 kubenswrapper[8606]: I1204 22:05:35.112295 8606 generic.go:334] "Generic (PLEG): container finished" podID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerID="f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228" exitCode=0 Dec 04 22:05:35.112902 master-0 kubenswrapper[8606]: I1204 22:05:35.112308 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" Dec 04 22:05:35.112902 master-0 kubenswrapper[8606]: I1204 22:05:35.112293 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" event={"ID":"3d178411-75a4-4ff8-9764-6f3e3944eca4","Type":"ContainerDied","Data":"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82"} Dec 04 22:05:35.112902 master-0 kubenswrapper[8606]: I1204 22:05:35.112475 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" event={"ID":"3d178411-75a4-4ff8-9764-6f3e3944eca4","Type":"ContainerDied","Data":"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62"} Dec 04 22:05:35.112902 master-0 kubenswrapper[8606]: I1204 22:05:35.112492 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" event={"ID":"3d178411-75a4-4ff8-9764-6f3e3944eca4","Type":"ContainerDied","Data":"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228"} Dec 04 22:05:35.112902 master-0 kubenswrapper[8606]: I1204 22:05:35.112526 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p" event={"ID":"3d178411-75a4-4ff8-9764-6f3e3944eca4","Type":"ContainerDied","Data":"118140813e360bb571639fba61c3203c7502dddd911daa90a8dc26b3b0128dfb"} Dec 04 22:05:35.112902 master-0 kubenswrapper[8606]: I1204 22:05:35.112541 8606 scope.go:117] "RemoveContainer" containerID="0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82" Dec 04 22:05:35.117227 master-0 kubenswrapper[8606]: I1204 22:05:35.117158 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-f797d8546-4g7dd_c52111ac-30f6-47b7-a8ca-13659fbd71b4/machine-approver-controller/0.log" Dec 04 22:05:35.118692 master-0 kubenswrapper[8606]: I1204 22:05:35.118637 8606 generic.go:334] "Generic (PLEG): container finished" podID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerID="578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f" exitCode=0 Dec 04 22:05:35.118692 master-0 kubenswrapper[8606]: I1204 22:05:35.118674 8606 generic.go:334] "Generic (PLEG): container finished" podID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerID="ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663" exitCode=0 Dec 04 22:05:35.118902 master-0 kubenswrapper[8606]: I1204 22:05:35.118705 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" event={"ID":"c52111ac-30f6-47b7-a8ca-13659fbd71b4","Type":"ContainerDied","Data":"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f"} Dec 04 22:05:35.118902 master-0 kubenswrapper[8606]: I1204 22:05:35.118764 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" event={"ID":"c52111ac-30f6-47b7-a8ca-13659fbd71b4","Type":"ContainerDied","Data":"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663"} Dec 04 22:05:35.118902 master-0 kubenswrapper[8606]: I1204 22:05:35.118785 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" event={"ID":"c52111ac-30f6-47b7-a8ca-13659fbd71b4","Type":"ContainerDied","Data":"6a357f5e42bd47d188514cbe5323e44289a00a8afb8894fb4fbbebb634c903b1"} Dec 04 22:05:35.118902 master-0 kubenswrapper[8606]: I1204 22:05:35.118871 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd" Dec 04 22:05:35.142732 master-0 kubenswrapper[8606]: I1204 22:05:35.142362 8606 scope.go:117] "RemoveContainer" containerID="1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62" Dec 04 22:05:35.164057 master-0 kubenswrapper[8606]: I1204 22:05:35.163963 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p"] Dec 04 22:05:35.167660 master-0 kubenswrapper[8606]: I1204 22:05:35.167598 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-nr72p"] Dec 04 22:05:35.180565 master-0 kubenswrapper[8606]: I1204 22:05:35.180489 8606 scope.go:117] "RemoveContainer" containerID="f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228" Dec 04 22:05:35.187031 master-0 kubenswrapper[8606]: I1204 22:05:35.186955 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd"] Dec 04 22:05:35.200116 master-0 kubenswrapper[8606]: I1204 22:05:35.200035 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-f797d8546-4g7dd"] Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.213928 8606 scope.go:117] "RemoveContainer" containerID="0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.214456 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82\": container with ID starting with 0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82 not found: ID does not exist" containerID="0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.214546 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82"} err="failed to get container status \"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82\": rpc error: code = NotFound desc = could not find container \"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82\": container with ID starting with 0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82 not found: ID does not exist" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.214705 8606 scope.go:117] "RemoveContainer" containerID="1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.216093 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62\": container with ID starting with 
1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62 not found: ID does not exist" containerID="1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.216130 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62"} err="failed to get container status \"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62\": rpc error: code = NotFound desc = could not find container \"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62\": container with ID starting with 1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62 not found: ID does not exist" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.216160 8606 scope.go:117] "RemoveContainer" containerID="f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.216529 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228\": container with ID starting with f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228 not found: ID does not exist" containerID="f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.216553 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228"} err="failed to get container status \"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228\": rpc error: code = NotFound desc = could not find container \"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228\": container with ID starting with f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228 not found: ID does not exist" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.216571 8606 scope.go:117] "RemoveContainer" containerID="0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.217968 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82"} err="failed to get container status \"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82\": rpc error: code = NotFound desc = could not find container \"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82\": container with ID starting with 0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82 not found: ID does not exist" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.218051 8606 scope.go:117] "RemoveContainer" containerID="1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.218821 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62"} err="failed to get container status \"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62\": rpc error: code = NotFound desc = could not find container \"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62\": container with ID starting with 
1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62 not found: ID does not exist" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.218868 8606 scope.go:117] "RemoveContainer" containerID="f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.221842 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228"} err="failed to get container status \"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228\": rpc error: code = NotFound desc = could not find container \"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228\": container with ID starting with f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228 not found: ID does not exist" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.221911 8606 scope.go:117] "RemoveContainer" containerID="0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222295 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4"] Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222367 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82"} err="failed to get container status \"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82\": rpc error: code = NotFound desc = could not find container \"0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82\": container with ID starting with 0def5e0a78039fbfe6dcc25e52cac4a9ae73b5c75589e5f34c96ac9118312b82 not found: ID does not exist" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222402 8606 scope.go:117] "RemoveContainer" containerID="1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.222541 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" containerName="installer" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222557 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" containerName="installer" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.222569 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="machine-approver-controller" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222577 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="machine-approver-controller" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.222593 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="kube-rbac-proxy" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222602 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="kube-rbac-proxy" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.222612 8606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="machine-approver-controller" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222622 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="machine-approver-controller" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.222637 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="config-sync-controllers" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222645 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="config-sync-controllers" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.222663 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="kube-rbac-proxy" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222673 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="kube-rbac-proxy" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: E1204 22:05:35.222682 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="cluster-cloud-controller-manager" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222690 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="cluster-cloud-controller-manager" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222799 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" containerName="installer" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222810 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="kube-rbac-proxy" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222827 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="kube-rbac-proxy" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222838 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="machine-approver-controller" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222848 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" containerName="machine-approver-controller" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222860 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="config-sync-controllers" Dec 04 22:05:35.222926 master-0 kubenswrapper[8606]: I1204 22:05:35.222873 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" containerName="cluster-cloud-controller-manager" Dec 04 22:05:35.225936 master-0 kubenswrapper[8606]: I1204 22:05:35.223806 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62"} err="failed to get container status \"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62\": rpc error: code = NotFound desc = 
could not find container \"1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62\": container with ID starting with 1d9dea52f1d01fe2753578f4ab3a512661b0e12bbec36e9286d1083e5d4ebc62 not found: ID does not exist" Dec 04 22:05:35.225936 master-0 kubenswrapper[8606]: I1204 22:05:35.223906 8606 scope.go:117] "RemoveContainer" containerID="f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228" Dec 04 22:05:35.225936 master-0 kubenswrapper[8606]: I1204 22:05:35.224451 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.225936 master-0 kubenswrapper[8606]: I1204 22:05:35.224459 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228"} err="failed to get container status \"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228\": rpc error: code = NotFound desc = could not find container \"f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228\": container with ID starting with f8d8cdc2a18f3b9ef4b19ea2ccb4c9abce7af52441f1788c031fcd673301f228 not found: ID does not exist" Dec 04 22:05:35.225936 master-0 kubenswrapper[8606]: I1204 22:05:35.224535 8606 scope.go:117] "RemoveContainer" containerID="578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f" Dec 04 22:05:35.226546 master-0 kubenswrapper[8606]: I1204 22:05:35.226446 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 04 22:05:35.226918 master-0 kubenswrapper[8606]: I1204 22:05:35.226856 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 04 22:05:35.228953 master-0 kubenswrapper[8606]: I1204 22:05:35.228878 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sf4xn" Dec 04 22:05:35.229100 master-0 kubenswrapper[8606]: I1204 22:05:35.229018 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 04 22:05:35.229275 master-0 kubenswrapper[8606]: I1204 22:05:35.229219 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:05:35.229567 master-0 kubenswrapper[8606]: I1204 22:05:35.229480 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:05:35.250004 master-0 kubenswrapper[8606]: I1204 22:05:35.249315 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.250004 master-0 kubenswrapper[8606]: I1204 22:05:35.249377 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.250004 master-0 kubenswrapper[8606]: I1204 22:05:35.249429 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.250004 master-0 kubenswrapper[8606]: I1204 22:05:35.249453 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/74197c50-9a41-40e8-9289-c7e6afbd3737-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.250004 master-0 kubenswrapper[8606]: I1204 22:05:35.249486 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq44d\" (UniqueName: \"kubernetes.io/projected/74197c50-9a41-40e8-9289-c7e6afbd3737-kube-api-access-hq44d\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.256224 master-0 kubenswrapper[8606]: I1204 22:05:35.256201 8606 scope.go:117] "RemoveContainer" containerID="ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6" Dec 04 22:05:35.288161 master-0 kubenswrapper[8606]: I1204 22:05:35.287298 8606 scope.go:117] "RemoveContainer" containerID="ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663" Dec 04 22:05:35.304161 master-0 kubenswrapper[8606]: I1204 22:05:35.303208 8606 scope.go:117] "RemoveContainer" containerID="578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f" Dec 04 22:05:35.304161 master-0 kubenswrapper[8606]: E1204 22:05:35.303942 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f\": container with ID starting with 578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f not found: ID does not exist" containerID="578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f" Dec 04 22:05:35.304161 master-0 kubenswrapper[8606]: I1204 22:05:35.304031 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f"} err="failed to get container status \"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f\": rpc error: code = NotFound desc = could not find container \"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f\": container with ID starting with 578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f not found: ID does not exist" Dec 04 22:05:35.304161 master-0 
kubenswrapper[8606]: I1204 22:05:35.304088 8606 scope.go:117] "RemoveContainer" containerID="ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6" Dec 04 22:05:35.304612 master-0 kubenswrapper[8606]: E1204 22:05:35.304570 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6\": container with ID starting with ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6 not found: ID does not exist" containerID="ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6" Dec 04 22:05:35.304721 master-0 kubenswrapper[8606]: I1204 22:05:35.304625 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6"} err="failed to get container status \"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6\": rpc error: code = NotFound desc = could not find container \"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6\": container with ID starting with ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6 not found: ID does not exist" Dec 04 22:05:35.304721 master-0 kubenswrapper[8606]: I1204 22:05:35.304658 8606 scope.go:117] "RemoveContainer" containerID="ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663" Dec 04 22:05:35.305221 master-0 kubenswrapper[8606]: E1204 22:05:35.305156 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663\": container with ID starting with ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663 not found: ID does not exist" containerID="ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663" Dec 04 22:05:35.305357 master-0 kubenswrapper[8606]: I1204 22:05:35.305204 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663"} err="failed to get container status \"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663\": rpc error: code = NotFound desc = could not find container \"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663\": container with ID starting with ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663 not found: ID does not exist" Dec 04 22:05:35.305357 master-0 kubenswrapper[8606]: I1204 22:05:35.305243 8606 scope.go:117] "RemoveContainer" containerID="578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f" Dec 04 22:05:35.305831 master-0 kubenswrapper[8606]: I1204 22:05:35.305765 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f"} err="failed to get container status \"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f\": rpc error: code = NotFound desc = could not find container \"578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f\": container with ID starting with 578eb14f84bae571466727337a02717a37ddcab4727b44c7ed4506c9a057372f not found: ID does not exist" Dec 04 22:05:35.305831 master-0 kubenswrapper[8606]: I1204 22:05:35.305810 8606 scope.go:117] "RemoveContainer" containerID="ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6" Dec 04 22:05:35.306137 master-0 kubenswrapper[8606]: 
I1204 22:05:35.306089 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6"} err="failed to get container status \"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6\": rpc error: code = NotFound desc = could not find container \"ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6\": container with ID starting with ecfecb64888d77e2865549d18214d9a15730a1a7f33ca7d9afe2ad4531e252b6 not found: ID does not exist" Dec 04 22:05:35.306137 master-0 kubenswrapper[8606]: I1204 22:05:35.306113 8606 scope.go:117] "RemoveContainer" containerID="ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663" Dec 04 22:05:35.306438 master-0 kubenswrapper[8606]: I1204 22:05:35.306381 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663"} err="failed to get container status \"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663\": rpc error: code = NotFound desc = could not find container \"ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663\": container with ID starting with ae8ab11f6752894c89325cbf5b46265050c9949a2b6c466a1dfeefcf41d62663 not found: ID does not exist" Dec 04 22:05:35.313908 master-0 kubenswrapper[8606]: I1204 22:05:35.313141 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx"] Dec 04 22:05:35.314270 master-0 kubenswrapper[8606]: I1204 22:05:35.314151 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.316888 master-0 kubenswrapper[8606]: I1204 22:05:35.316851 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 22:05:35.317364 master-0 kubenswrapper[8606]: I1204 22:05:35.317331 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 22:05:35.317571 master-0 kubenswrapper[8606]: I1204 22:05:35.317555 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 22:05:35.318797 master-0 kubenswrapper[8606]: I1204 22:05:35.317771 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-shf98" Dec 04 22:05:35.318797 master-0 kubenswrapper[8606]: I1204 22:05:35.318026 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 22:05:35.318797 master-0 kubenswrapper[8606]: I1204 22:05:35.318159 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 22:05:35.350856 master-0 kubenswrapper[8606]: I1204 22:05:35.350792 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.350856 master-0 kubenswrapper[8606]: I1204 22:05:35.350848 8606 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.351200 master-0 kubenswrapper[8606]: I1204 22:05:35.350883 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.351200 master-0 kubenswrapper[8606]: I1204 22:05:35.350978 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/74197c50-9a41-40e8-9289-c7e6afbd3737-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.351200 master-0 kubenswrapper[8606]: I1204 22:05:35.351052 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq44d\" (UniqueName: \"kubernetes.io/projected/74197c50-9a41-40e8-9289-c7e6afbd3737-kube-api-access-hq44d\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.351200 master-0 kubenswrapper[8606]: I1204 22:05:35.351194 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.351384 master-0 kubenswrapper[8606]: I1204 22:05:35.351254 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6s4\" (UniqueName: \"kubernetes.io/projected/5dac8e25-0f51-4c04-929c-060479689a9d-kube-api-access-ch6s4\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.351384 master-0 kubenswrapper[8606]: I1204 22:05:35.351289 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.351384 master-0 kubenswrapper[8606]: I1204 22:05:35.351298 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/74197c50-9a41-40e8-9289-c7e6afbd3737-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.351384 master-0 kubenswrapper[8606]: I1204 22:05:35.351339 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.353092 master-0 kubenswrapper[8606]: I1204 22:05:35.353027 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 04 22:05:35.354643 master-0 kubenswrapper[8606]: I1204 22:05:35.353490 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 04 22:05:35.354643 master-0 kubenswrapper[8606]: I1204 22:05:35.353699 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 04 22:05:35.364926 master-0 kubenswrapper[8606]: I1204 22:05:35.363129 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.364926 master-0 kubenswrapper[8606]: I1204 22:05:35.363154 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.367915 master-0 kubenswrapper[8606]: I1204 22:05:35.365594 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.379922 master-0 kubenswrapper[8606]: I1204 22:05:35.379841 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:05:35.395823 master-0 kubenswrapper[8606]: I1204 22:05:35.395758 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:05:35.398888 master-0 kubenswrapper[8606]: I1204 22:05:35.398840 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3d178411-75a4-4ff8-9764-6f3e3944eca4" path="/var/lib/kubelet/pods/3d178411-75a4-4ff8-9764-6f3e3944eca4/volumes" Dec 04 22:05:35.399623 master-0 kubenswrapper[8606]: I1204 22:05:35.399596 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52111ac-30f6-47b7-a8ca-13659fbd71b4" path="/var/lib/kubelet/pods/c52111ac-30f6-47b7-a8ca-13659fbd71b4/volumes" Dec 04 22:05:35.407939 master-0 kubenswrapper[8606]: I1204 22:05:35.407875 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq44d\" (UniqueName: \"kubernetes.io/projected/74197c50-9a41-40e8-9289-c7e6afbd3737-kube-api-access-hq44d\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.452874 master-0 kubenswrapper[8606]: I1204 22:05:35.452793 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6s4\" (UniqueName: \"kubernetes.io/projected/5dac8e25-0f51-4c04-929c-060479689a9d-kube-api-access-ch6s4\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.453136 master-0 kubenswrapper[8606]: I1204 22:05:35.452918 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.453136 master-0 kubenswrapper[8606]: I1204 22:05:35.452956 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.453136 master-0 kubenswrapper[8606]: I1204 22:05:35.453029 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.456668 master-0 kubenswrapper[8606]: I1204 22:05:35.456629 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 22:05:35.456806 master-0 kubenswrapper[8606]: I1204 22:05:35.456756 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 22:05:35.456876 master-0 kubenswrapper[8606]: I1204 22:05:35.456756 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 22:05:35.464741 master-0 kubenswrapper[8606]: I1204 22:05:35.464693 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: 
\"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.465192 master-0 kubenswrapper[8606]: I1204 22:05:35.465144 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.468788 master-0 kubenswrapper[8606]: I1204 22:05:35.468734 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.471265 master-0 kubenswrapper[8606]: I1204 22:05:35.471107 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 22:05:35.482039 master-0 kubenswrapper[8606]: I1204 22:05:35.481972 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 22:05:35.501445 master-0 kubenswrapper[8606]: I1204 22:05:35.501375 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6s4\" (UniqueName: \"kubernetes.io/projected/5dac8e25-0f51-4c04-929c-060479689a9d-kube-api-access-ch6s4\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.565016 master-0 kubenswrapper[8606]: I1204 22:05:35.564261 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sf4xn" Dec 04 22:05:35.573162 master-0 kubenswrapper[8606]: I1204 22:05:35.572899 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:05:35.594541 master-0 kubenswrapper[8606]: W1204 22:05:35.593776 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74197c50_9a41_40e8_9289_c7e6afbd3737.slice/crio-4fb8b7eb82d3c4d48b5f1cf2512cb01480c4c7ee72d7b27dc0bc0ec05cc4c756 WatchSource:0}: Error finding container 4fb8b7eb82d3c4d48b5f1cf2512cb01480c4c7ee72d7b27dc0bc0ec05cc4c756: Status 404 returned error can't find the container with id 4fb8b7eb82d3c4d48b5f1cf2512cb01480c4c7ee72d7b27dc0bc0ec05cc4c756 Dec 04 22:05:35.634519 master-0 kubenswrapper[8606]: I1204 22:05:35.634463 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-shf98" Dec 04 22:05:35.642305 master-0 kubenswrapper[8606]: I1204 22:05:35.642259 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:05:35.709891 master-0 kubenswrapper[8606]: W1204 22:05:35.700021 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dac8e25_0f51_4c04_929c_060479689a9d.slice/crio-09c1c58555576a7a5ad0c40263cb1b24f2532f6f0895e3889a790f41c5622cf5 WatchSource:0}: Error finding container 09c1c58555576a7a5ad0c40263cb1b24f2532f6f0895e3889a790f41c5622cf5: Status 404 returned error can't find the container with id 09c1c58555576a7a5ad0c40263cb1b24f2532f6f0895e3889a790f41c5622cf5 Dec 04 22:05:36.137543 master-0 kubenswrapper[8606]: I1204 22:05:36.137399 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"d4cd669f8e4bd3008a9642035eec139a13bd3a586fb003b3adb4d948956c28f6"} Dec 04 22:05:36.137543 master-0 kubenswrapper[8606]: I1204 22:05:36.137457 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"d7411bec11d115b15a691f3d3010646ecd1f289830d7539f76d0467f6cd83226"} Dec 04 22:05:36.137543 master-0 kubenswrapper[8606]: I1204 22:05:36.137469 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"4fb8b7eb82d3c4d48b5f1cf2512cb01480c4c7ee72d7b27dc0bc0ec05cc4c756"} Dec 04 22:05:36.141255 master-0 kubenswrapper[8606]: I1204 22:05:36.141212 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerStarted","Data":"82f29b76b48da3d841d1256de9fef86cdb6553d971418660ee7c3b3bf00fff6f"} Dec 04 22:05:36.141255 master-0 kubenswrapper[8606]: I1204 22:05:36.141244 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerStarted","Data":"09c1c58555576a7a5ad0c40263cb1b24f2532f6f0895e3889a790f41c5622cf5"} Dec 04 22:05:37.155955 master-0 kubenswrapper[8606]: I1204 22:05:37.155837 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"b1f6fb04eb6c1c9d551e263a6b6af6d08c8b7f2c8d5ec4566af25c8704b19d39"} Dec 04 22:05:37.158170 master-0 kubenswrapper[8606]: I1204 22:05:37.158110 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerStarted","Data":"028e28ae583843cf573a020510b23f29bb89888d81cbcae42e7af0446b2d3e61"} Dec 04 22:05:37.185141 master-0 kubenswrapper[8606]: I1204 22:05:37.185030 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" 
podStartSLOduration=2.185006784 podStartE2EDuration="2.185006784s" podCreationTimestamp="2025-12-04 22:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:05:37.182459169 +0000 UTC m=+301.992761404" watchObservedRunningTime="2025-12-04 22:05:37.185006784 +0000 UTC m=+301.995309009" Dec 04 22:05:37.208888 master-0 kubenswrapper[8606]: I1204 22:05:37.208753 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" podStartSLOduration=2.208720835 podStartE2EDuration="2.208720835s" podCreationTimestamp="2025-12-04 22:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:05:37.205585264 +0000 UTC m=+302.015887499" watchObservedRunningTime="2025-12-04 22:05:37.208720835 +0000 UTC m=+302.019023050" Dec 04 22:05:37.403690 master-0 kubenswrapper[8606]: I1204 22:05:37.403607 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68"] Dec 04 22:05:37.405065 master-0 kubenswrapper[8606]: I1204 22:05:37.405031 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.407196 master-0 kubenswrapper[8606]: I1204 22:05:37.407090 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-bktr2" Dec 04 22:05:37.407196 master-0 kubenswrapper[8606]: I1204 22:05:37.407184 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 22:05:37.419169 master-0 kubenswrapper[8606]: I1204 22:05:37.419112 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68"] Dec 04 22:05:37.480360 master-0 kubenswrapper[8606]: I1204 22:05:37.480261 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-kube-api-access-xkrvr\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.480791 master-0 kubenswrapper[8606]: I1204 22:05:37.480398 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.480791 master-0 kubenswrapper[8606]: I1204 22:05:37.480586 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.581772 master-0 
kubenswrapper[8606]: I1204 22:05:37.581715 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-kube-api-access-xkrvr\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.581981 master-0 kubenswrapper[8606]: I1204 22:05:37.581786 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.581981 master-0 kubenswrapper[8606]: I1204 22:05:37.581811 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.583260 master-0 kubenswrapper[8606]: I1204 22:05:37.583194 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.585991 master-0 kubenswrapper[8606]: I1204 22:05:37.585920 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.599488 master-0 kubenswrapper[8606]: I1204 22:05:37.599017 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-kube-api-access-xkrvr\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:37.727289 master-0 kubenswrapper[8606]: I1204 22:05:37.727168 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:05:38.217993 master-0 kubenswrapper[8606]: I1204 22:05:38.217929 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68"] Dec 04 22:05:38.228693 master-0 kubenswrapper[8606]: W1204 22:05:38.228652 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebfbb13d_c3f2_476d_bd89_cb8a13d2acee.slice/crio-8f9c23aa8f546cd0849cf585c0cd6540010999bcb6db0df49677dfec81935af9 WatchSource:0}: Error finding container 8f9c23aa8f546cd0849cf585c0cd6540010999bcb6db0df49677dfec81935af9: Status 404 returned error can't find the container with id 8f9c23aa8f546cd0849cf585c0cd6540010999bcb6db0df49677dfec81935af9 Dec 04 22:05:38.529124 master-0 kubenswrapper[8606]: I1204 22:05:38.529056 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx"] Dec 04 22:05:38.529775 master-0 kubenswrapper[8606]: I1204 22:05:38.529741 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:05:38.536282 master-0 kubenswrapper[8606]: I1204 22:05:38.536209 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 04 22:05:38.538479 master-0 kubenswrapper[8606]: I1204 22:05:38.538418 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4"] Dec 04 22:05:38.542675 master-0 kubenswrapper[8606]: I1204 22:05:38.539731 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" Dec 04 22:05:38.542675 master-0 kubenswrapper[8606]: I1204 22:05:38.541929 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5465c8b4db-8vm66"] Dec 04 22:05:38.543434 master-0 kubenswrapper[8606]: I1204 22:05:38.543387 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.544278 master-0 kubenswrapper[8606]: I1204 22:05:38.544220 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x"] Dec 04 22:05:38.545070 master-0 kubenswrapper[8606]: I1204 22:05:38.545034 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.545303 master-0 kubenswrapper[8606]: I1204 22:05:38.545231 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 22:05:38.551159 master-0 kubenswrapper[8606]: I1204 22:05:38.548603 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 22:05:38.551159 master-0 kubenswrapper[8606]: I1204 22:05:38.549060 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 22:05:38.551159 master-0 kubenswrapper[8606]: I1204 22:05:38.549867 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 22:05:38.551159 master-0 kubenswrapper[8606]: I1204 22:05:38.550400 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 22:05:38.551845 master-0 kubenswrapper[8606]: I1204 22:05:38.551165 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 22:05:38.551845 master-0 kubenswrapper[8606]: I1204 22:05:38.551314 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 22:05:38.563969 master-0 kubenswrapper[8606]: I1204 22:05:38.563846 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx"] Dec 04 22:05:38.580423 master-0 kubenswrapper[8606]: I1204 22:05:38.579796 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4"] Dec 04 22:05:38.586889 master-0 kubenswrapper[8606]: I1204 22:05:38.586817 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x"] Dec 04 22:05:38.713605 master-0 kubenswrapper[8606]: I1204 22:05:38.713456 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b966c210-5415-4fa5-88ab-c85aba979b28-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-mp4qx\" (UID: \"b966c210-5415-4fa5-88ab-c85aba979b28\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:05:38.713605 master-0 kubenswrapper[8606]: I1204 22:05:38.713570 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pctsn\" (UniqueName: \"kubernetes.io/projected/c178afcf-b713-4c74-b22b-6169ba3123f5-kube-api-access-pctsn\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.713605 master-0 kubenswrapper[8606]: I1204 22:05:38.713598 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-default-certificate\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.714172 master-0 kubenswrapper[8606]: I1204 22:05:38.713634 8606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6da420-9631-4bce-b238-96ab361e23e9-secret-volume\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.714172 master-0 kubenswrapper[8606]: I1204 22:05:38.713673 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c178afcf-b713-4c74-b22b-6169ba3123f5-service-ca-bundle\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.714172 master-0 kubenswrapper[8606]: I1204 22:05:38.713880 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r2fn\" (UniqueName: \"kubernetes.io/projected/bda1cb0d-26cf-4b94-b359-432492112888-kube-api-access-8r2fn\") pod \"network-check-source-85d8db45d4-5gbc4\" (UID: \"bda1cb0d-26cf-4b94-b359-432492112888\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" Dec 04 22:05:38.714172 master-0 kubenswrapper[8606]: I1204 22:05:38.713956 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-metrics-certs\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.714172 master-0 kubenswrapper[8606]: I1204 22:05:38.714023 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-stats-auth\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.714172 master-0 kubenswrapper[8606]: I1204 22:05:38.714096 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6da420-9631-4bce-b238-96ab361e23e9-config-volume\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.714172 master-0 kubenswrapper[8606]: I1204 22:05:38.714119 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbj6l\" (UniqueName: \"kubernetes.io/projected/da6da420-9631-4bce-b238-96ab361e23e9-kube-api-access-vbj6l\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.815928 master-0 kubenswrapper[8606]: I1204 22:05:38.815810 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b966c210-5415-4fa5-88ab-c85aba979b28-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-mp4qx\" (UID: \"b966c210-5415-4fa5-88ab-c85aba979b28\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:05:38.816221 master-0 kubenswrapper[8606]: I1204 
22:05:38.816165 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-default-certificate\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.816333 master-0 kubenswrapper[8606]: I1204 22:05:38.816284 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pctsn\" (UniqueName: \"kubernetes.io/projected/c178afcf-b713-4c74-b22b-6169ba3123f5-kube-api-access-pctsn\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.816478 master-0 kubenswrapper[8606]: I1204 22:05:38.816430 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6da420-9631-4bce-b238-96ab361e23e9-secret-volume\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.816804 master-0 kubenswrapper[8606]: I1204 22:05:38.816746 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c178afcf-b713-4c74-b22b-6169ba3123f5-service-ca-bundle\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.817014 master-0 kubenswrapper[8606]: I1204 22:05:38.816949 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2fn\" (UniqueName: \"kubernetes.io/projected/bda1cb0d-26cf-4b94-b359-432492112888-kube-api-access-8r2fn\") pod \"network-check-source-85d8db45d4-5gbc4\" (UID: \"bda1cb0d-26cf-4b94-b359-432492112888\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" Dec 04 22:05:38.817079 master-0 kubenswrapper[8606]: I1204 22:05:38.817020 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-metrics-certs\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.817143 master-0 kubenswrapper[8606]: I1204 22:05:38.817108 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-stats-auth\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.817254 master-0 kubenswrapper[8606]: I1204 22:05:38.817218 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6da420-9631-4bce-b238-96ab361e23e9-config-volume\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.817617 master-0 kubenswrapper[8606]: I1204 22:05:38.817549 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbj6l\" (UniqueName: 
\"kubernetes.io/projected/da6da420-9631-4bce-b238-96ab361e23e9-kube-api-access-vbj6l\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.818977 master-0 kubenswrapper[8606]: I1204 22:05:38.818920 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c178afcf-b713-4c74-b22b-6169ba3123f5-service-ca-bundle\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.821804 master-0 kubenswrapper[8606]: I1204 22:05:38.821690 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-metrics-certs\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.822419 master-0 kubenswrapper[8606]: I1204 22:05:38.822375 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b966c210-5415-4fa5-88ab-c85aba979b28-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-mp4qx\" (UID: \"b966c210-5415-4fa5-88ab-c85aba979b28\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:05:38.822642 master-0 kubenswrapper[8606]: I1204 22:05:38.822588 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6da420-9631-4bce-b238-96ab361e23e9-config-volume\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.822720 master-0 kubenswrapper[8606]: I1204 22:05:38.822411 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-default-certificate\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.823496 master-0 kubenswrapper[8606]: I1204 22:05:38.823438 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6da420-9631-4bce-b238-96ab361e23e9-secret-volume\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.825576 master-0 kubenswrapper[8606]: I1204 22:05:38.825482 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-stats-auth\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.846702 master-0 kubenswrapper[8606]: I1204 22:05:38.846619 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbj6l\" (UniqueName: \"kubernetes.io/projected/da6da420-9631-4bce-b238-96ab361e23e9-kube-api-access-vbj6l\") pod \"collect-profiles-29414760-r947x\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.847841 master-0 kubenswrapper[8606]: I1204 22:05:38.847797 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pctsn\" (UniqueName: \"kubernetes.io/projected/c178afcf-b713-4c74-b22b-6169ba3123f5-kube-api-access-pctsn\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.849071 master-0 kubenswrapper[8606]: I1204 22:05:38.849004 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2fn\" (UniqueName: \"kubernetes.io/projected/bda1cb0d-26cf-4b94-b359-432492112888-kube-api-access-8r2fn\") pod \"network-check-source-85d8db45d4-5gbc4\" (UID: \"bda1cb0d-26cf-4b94-b359-432492112888\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" Dec 04 22:05:38.867752 master-0 kubenswrapper[8606]: I1204 22:05:38.867645 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:05:38.888832 master-0 kubenswrapper[8606]: I1204 22:05:38.888709 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" Dec 04 22:05:38.940257 master-0 kubenswrapper[8606]: I1204 22:05:38.940168 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:38.957551 master-0 kubenswrapper[8606]: I1204 22:05:38.957412 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:38.997906 master-0 kubenswrapper[8606]: W1204 22:05:38.997807 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc178afcf_b713_4c74_b22b_6169ba3123f5.slice/crio-10dda04dd9f1ca247b35249d5e7333d86ebc7e3902573b98ac28839fb9bcb514 WatchSource:0}: Error finding container 10dda04dd9f1ca247b35249d5e7333d86ebc7e3902573b98ac28839fb9bcb514: Status 404 returned error can't find the container with id 10dda04dd9f1ca247b35249d5e7333d86ebc7e3902573b98ac28839fb9bcb514 Dec 04 22:05:39.001122 master-0 kubenswrapper[8606]: I1204 22:05:39.001084 8606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 22:05:39.201739 master-0 kubenswrapper[8606]: I1204 22:05:39.201664 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"10dda04dd9f1ca247b35249d5e7333d86ebc7e3902573b98ac28839fb9bcb514"} Dec 04 22:05:39.206063 master-0 kubenswrapper[8606]: I1204 22:05:39.206000 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerStarted","Data":"b50d3dde2385d41664f1a848281a1446b281ce5db83aba57c400f3d223be8bb9"} Dec 04 22:05:39.206119 master-0 kubenswrapper[8606]: I1204 22:05:39.206069 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" 
event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerStarted","Data":"989d170e7a52318dc012c9f9d9615ad932d427b1e6a54fccb0b6d83f4ebb7d45"} Dec 04 22:05:39.206119 master-0 kubenswrapper[8606]: I1204 22:05:39.206087 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerStarted","Data":"8f9c23aa8f546cd0849cf585c0cd6540010999bcb6db0df49677dfec81935af9"} Dec 04 22:05:39.331525 master-0 kubenswrapper[8606]: I1204 22:05:39.331337 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" podStartSLOduration=2.331312553 podStartE2EDuration="2.331312553s" podCreationTimestamp="2025-12-04 22:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:05:39.234216217 +0000 UTC m=+304.044518452" watchObservedRunningTime="2025-12-04 22:05:39.331312553 +0000 UTC m=+304.141614778" Dec 04 22:05:39.335135 master-0 kubenswrapper[8606]: I1204 22:05:39.335101 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx"] Dec 04 22:05:39.340042 master-0 kubenswrapper[8606]: W1204 22:05:39.339975 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb966c210_5415_4fa5_88ab_c85aba979b28.slice/crio-9ff695d5f754bc1ab4c8a5e3ad28cb942f185e2cf70cdd2d8a2eeb6d3f679b39 WatchSource:0}: Error finding container 9ff695d5f754bc1ab4c8a5e3ad28cb942f185e2cf70cdd2d8a2eeb6d3f679b39: Status 404 returned error can't find the container with id 9ff695d5f754bc1ab4c8a5e3ad28cb942f185e2cf70cdd2d8a2eeb6d3f679b39 Dec 04 22:05:39.407237 master-0 kubenswrapper[8606]: I1204 22:05:39.407170 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4"] Dec 04 22:05:39.420343 master-0 kubenswrapper[8606]: W1204 22:05:39.420270 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda1cb0d_26cf_4b94_b359_432492112888.slice/crio-773c5e8477795f70534d191a1a57bd8d7b1ab26d3d0db825f97bcaae1e3dd144 WatchSource:0}: Error finding container 773c5e8477795f70534d191a1a57bd8d7b1ab26d3d0db825f97bcaae1e3dd144: Status 404 returned error can't find the container with id 773c5e8477795f70534d191a1a57bd8d7b1ab26d3d0db825f97bcaae1e3dd144 Dec 04 22:05:39.452465 master-0 kubenswrapper[8606]: I1204 22:05:39.452339 8606 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 22:05:39.482445 master-0 kubenswrapper[8606]: I1204 22:05:39.482391 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x"] Dec 04 22:05:39.502619 master-0 kubenswrapper[8606]: W1204 22:05:39.502569 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda6da420_9631_4bce_b238_96ab361e23e9.slice/crio-27e285c843ede4a182c19163bdbdbd9ef215a975e244f28e8e13fb302c8fae17 WatchSource:0}: Error finding container 27e285c843ede4a182c19163bdbdbd9ef215a975e244f28e8e13fb302c8fae17: Status 404 returned error can't find the container with id 
27e285c843ede4a182c19163bdbdbd9ef215a975e244f28e8e13fb302c8fae17 Dec 04 22:05:40.212541 master-0 kubenswrapper[8606]: I1204 22:05:40.212458 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" event={"ID":"bda1cb0d-26cf-4b94-b359-432492112888","Type":"ContainerStarted","Data":"97e7fad06874576807015929933db6e964b960f7f73a618318b8ef08df129459"} Dec 04 22:05:40.212541 master-0 kubenswrapper[8606]: I1204 22:05:40.212541 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" event={"ID":"bda1cb0d-26cf-4b94-b359-432492112888","Type":"ContainerStarted","Data":"773c5e8477795f70534d191a1a57bd8d7b1ab26d3d0db825f97bcaae1e3dd144"} Dec 04 22:05:40.214490 master-0 kubenswrapper[8606]: I1204 22:05:40.214459 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" event={"ID":"da6da420-9631-4bce-b238-96ab361e23e9","Type":"ContainerStarted","Data":"d281c21cd4e6a5bdaa904d1cec96c25970563004aa9943074804eafd85bd5f1e"} Dec 04 22:05:40.214574 master-0 kubenswrapper[8606]: I1204 22:05:40.214491 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" event={"ID":"da6da420-9631-4bce-b238-96ab361e23e9","Type":"ContainerStarted","Data":"27e285c843ede4a182c19163bdbdbd9ef215a975e244f28e8e13fb302c8fae17"} Dec 04 22:05:40.215777 master-0 kubenswrapper[8606]: I1204 22:05:40.215719 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" event={"ID":"b966c210-5415-4fa5-88ab-c85aba979b28","Type":"ContainerStarted","Data":"9ff695d5f754bc1ab4c8a5e3ad28cb942f185e2cf70cdd2d8a2eeb6d3f679b39"} Dec 04 22:05:40.243116 master-0 kubenswrapper[8606]: I1204 22:05:40.242995 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" podStartSLOduration=370.242965521 podStartE2EDuration="6m10.242965521s" podCreationTimestamp="2025-12-04 21:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:05:40.231562799 +0000 UTC m=+305.041865014" watchObservedRunningTime="2025-12-04 22:05:40.242965521 +0000 UTC m=+305.053267736" Dec 04 22:05:40.257098 master-0 kubenswrapper[8606]: I1204 22:05:40.256988 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" podStartSLOduration=340.256953048 podStartE2EDuration="5m40.256953048s" podCreationTimestamp="2025-12-04 22:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:05:40.249893692 +0000 UTC m=+305.060195917" watchObservedRunningTime="2025-12-04 22:05:40.256953048 +0000 UTC m=+305.067255263" Dec 04 22:05:41.867165 master-0 kubenswrapper[8606]: I1204 22:05:41.866976 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wmm89"] Dec 04 22:05:41.868304 master-0 kubenswrapper[8606]: I1204 22:05:41.867670 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:41.870185 master-0 kubenswrapper[8606]: I1204 22:05:41.870119 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 22:05:41.870760 master-0 kubenswrapper[8606]: I1204 22:05:41.870700 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 22:05:41.870918 master-0 kubenswrapper[8606]: I1204 22:05:41.870769 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-h8rp7" Dec 04 22:05:41.890159 master-0 kubenswrapper[8606]: I1204 22:05:41.890099 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:41.890345 master-0 kubenswrapper[8606]: I1204 22:05:41.890198 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:41.890345 master-0 kubenswrapper[8606]: I1204 22:05:41.890255 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w8vs\" (UniqueName: \"kubernetes.io/projected/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-kube-api-access-2w8vs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:41.991435 master-0 kubenswrapper[8606]: I1204 22:05:41.991378 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:41.991435 master-0 kubenswrapper[8606]: I1204 22:05:41.991450 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:41.991783 master-0 kubenswrapper[8606]: I1204 22:05:41.991485 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w8vs\" (UniqueName: \"kubernetes.io/projected/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-kube-api-access-2w8vs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:41.995937 master-0 kubenswrapper[8606]: I1204 22:05:41.995870 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:42.008623 master-0 kubenswrapper[8606]: I1204 22:05:42.008568 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:42.016985 master-0 kubenswrapper[8606]: I1204 22:05:42.016942 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w8vs\" (UniqueName: \"kubernetes.io/projected/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-kube-api-access-2w8vs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:42.188046 master-0 kubenswrapper[8606]: I1204 22:05:42.187959 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:05:42.234263 master-0 kubenswrapper[8606]: I1204 22:05:42.234126 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" event={"ID":"b966c210-5415-4fa5-88ab-c85aba979b28","Type":"ContainerStarted","Data":"e64f5f283df42fbfd3b016ddfaa5b8ed71386c26b5f0eb7a21d4b6a37b395d52"} Dec 04 22:05:42.234921 master-0 kubenswrapper[8606]: I1204 22:05:42.234411 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:05:42.238761 master-0 kubenswrapper[8606]: I1204 22:05:42.238627 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"b6b73ce0e49791c4429e0dc83a179c1414cef4c0eb61371c4801b941d91131b9"} Dec 04 22:05:42.241166 master-0 kubenswrapper[8606]: I1204 22:05:42.241110 8606 generic.go:334] "Generic (PLEG): container finished" podID="da6da420-9631-4bce-b238-96ab361e23e9" containerID="d281c21cd4e6a5bdaa904d1cec96c25970563004aa9943074804eafd85bd5f1e" exitCode=0 Dec 04 22:05:42.241290 master-0 kubenswrapper[8606]: I1204 22:05:42.241170 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" event={"ID":"da6da420-9631-4bce-b238-96ab361e23e9","Type":"ContainerDied","Data":"d281c21cd4e6a5bdaa904d1cec96c25970563004aa9943074804eafd85bd5f1e"} Dec 04 22:05:42.241480 master-0 kubenswrapper[8606]: I1204 22:05:42.241435 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:05:42.245887 master-0 kubenswrapper[8606]: W1204 22:05:42.245834 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4d8477_c3b5_4e88_aaa9_222ad56d974c.slice/crio-4e849604a662b099203e2c576ae634e61f44af1ebad609cae720f6f60b5023c0 WatchSource:0}: Error finding container 4e849604a662b099203e2c576ae634e61f44af1ebad609cae720f6f60b5023c0: Status 404 returned error can't find the container with id 
4e849604a662b099203e2c576ae634e61f44af1ebad609cae720f6f60b5023c0 Dec 04 22:05:42.261106 master-0 kubenswrapper[8606]: I1204 22:05:42.259750 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" podStartSLOduration=270.976933935 podStartE2EDuration="4m33.259705968s" podCreationTimestamp="2025-12-04 22:01:09 +0000 UTC" firstStartedPulling="2025-12-04 22:05:39.343229271 +0000 UTC m=+304.153531486" lastFinishedPulling="2025-12-04 22:05:41.626001304 +0000 UTC m=+306.436303519" observedRunningTime="2025-12-04 22:05:42.257546614 +0000 UTC m=+307.067848919" watchObservedRunningTime="2025-12-04 22:05:42.259705968 +0000 UTC m=+307.070008183" Dec 04 22:05:42.296738 master-0 kubenswrapper[8606]: I1204 22:05:42.296653 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podStartSLOduration=270.676097424 podStartE2EDuration="4m33.296630153s" podCreationTimestamp="2025-12-04 22:01:09 +0000 UTC" firstStartedPulling="2025-12-04 22:05:39.000943743 +0000 UTC m=+303.811245948" lastFinishedPulling="2025-12-04 22:05:41.621476462 +0000 UTC m=+306.431778677" observedRunningTime="2025-12-04 22:05:42.291080721 +0000 UTC m=+307.101382956" watchObservedRunningTime="2025-12-04 22:05:42.296630153 +0000 UTC m=+307.106932388" Dec 04 22:05:42.941975 master-0 kubenswrapper[8606]: I1204 22:05:42.941677 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:42.946138 master-0 kubenswrapper[8606]: I1204 22:05:42.946046 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:42.946138 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:42.946138 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:42.946138 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:42.946583 master-0 kubenswrapper[8606]: I1204 22:05:42.946161 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:43.251471 master-0 kubenswrapper[8606]: I1204 22:05:43.251287 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wmm89" event={"ID":"eb4d8477-c3b5-4e88-aaa9-222ad56d974c","Type":"ContainerStarted","Data":"3250f93800f14f1984b89093aa1038684a73aea8a159904e7ccc7f265450fb5b"} Dec 04 22:05:43.251471 master-0 kubenswrapper[8606]: I1204 22:05:43.251390 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wmm89" event={"ID":"eb4d8477-c3b5-4e88-aaa9-222ad56d974c","Type":"ContainerStarted","Data":"4e849604a662b099203e2c576ae634e61f44af1ebad609cae720f6f60b5023c0"} Dec 04 22:05:43.274639 master-0 kubenswrapper[8606]: I1204 22:05:43.274535 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wmm89" podStartSLOduration=2.274481937 podStartE2EDuration="2.274481937s" podCreationTimestamp="2025-12-04 22:05:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:05:43.270769169 +0000 UTC m=+308.081071394" watchObservedRunningTime="2025-12-04 22:05:43.274481937 +0000 UTC m=+308.084784152" Dec 04 22:05:43.303753 master-0 kubenswrapper[8606]: I1204 22:05:43.303677 8606 scope.go:117] "RemoveContainer" containerID="70ec2f528f522213daf96bac275fda7cf7f15b026ed56e4b58dab19aaca3bd29" Dec 04 22:05:43.320555 master-0 kubenswrapper[8606]: I1204 22:05:43.320488 8606 scope.go:117] "RemoveContainer" containerID="d2ec9d7da1c0e81ac2a2563a5da4eba0b637698001afaf92060cbb9b07bcf2c4" Dec 04 22:05:43.548145 master-0 kubenswrapper[8606]: I1204 22:05:43.545020 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:43.618239 master-0 kubenswrapper[8606]: I1204 22:05:43.618174 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6da420-9631-4bce-b238-96ab361e23e9-config-volume\") pod \"da6da420-9631-4bce-b238-96ab361e23e9\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " Dec 04 22:05:43.618239 master-0 kubenswrapper[8606]: I1204 22:05:43.618252 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbj6l\" (UniqueName: \"kubernetes.io/projected/da6da420-9631-4bce-b238-96ab361e23e9-kube-api-access-vbj6l\") pod \"da6da420-9631-4bce-b238-96ab361e23e9\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " Dec 04 22:05:43.618700 master-0 kubenswrapper[8606]: I1204 22:05:43.618318 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6da420-9631-4bce-b238-96ab361e23e9-secret-volume\") pod \"da6da420-9631-4bce-b238-96ab361e23e9\" (UID: \"da6da420-9631-4bce-b238-96ab361e23e9\") " Dec 04 22:05:43.622524 master-0 kubenswrapper[8606]: I1204 22:05:43.619388 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da6da420-9631-4bce-b238-96ab361e23e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "da6da420-9631-4bce-b238-96ab361e23e9" (UID: "da6da420-9631-4bce-b238-96ab361e23e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:05:43.626538 master-0 kubenswrapper[8606]: I1204 22:05:43.622725 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da6da420-9631-4bce-b238-96ab361e23e9-kube-api-access-vbj6l" (OuterVolumeSpecName: "kube-api-access-vbj6l") pod "da6da420-9631-4bce-b238-96ab361e23e9" (UID: "da6da420-9631-4bce-b238-96ab361e23e9"). InnerVolumeSpecName "kube-api-access-vbj6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:05:43.631527 master-0 kubenswrapper[8606]: I1204 22:05:43.628609 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da6da420-9631-4bce-b238-96ab361e23e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da6da420-9631-4bce-b238-96ab361e23e9" (UID: "da6da420-9631-4bce-b238-96ab361e23e9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:05:43.719912 master-0 kubenswrapper[8606]: I1204 22:05:43.719843 8606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6da420-9631-4bce-b238-96ab361e23e9-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:43.719912 master-0 kubenswrapper[8606]: I1204 22:05:43.719896 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbj6l\" (UniqueName: \"kubernetes.io/projected/da6da420-9631-4bce-b238-96ab361e23e9-kube-api-access-vbj6l\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:43.719912 master-0 kubenswrapper[8606]: I1204 22:05:43.719909 8606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6da420-9631-4bce-b238-96ab361e23e9-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 22:05:43.857049 master-0 kubenswrapper[8606]: I1204 22:05:43.849249 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh"] Dec 04 22:05:43.857933 master-0 kubenswrapper[8606]: E1204 22:05:43.857892 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6da420-9631-4bce-b238-96ab361e23e9" containerName="collect-profiles" Dec 04 22:05:43.858049 master-0 kubenswrapper[8606]: I1204 22:05:43.858035 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6da420-9631-4bce-b238-96ab361e23e9" containerName="collect-profiles" Dec 04 22:05:43.858323 master-0 kubenswrapper[8606]: I1204 22:05:43.858307 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="da6da420-9631-4bce-b238-96ab361e23e9" containerName="collect-profiles" Dec 04 22:05:43.860141 master-0 kubenswrapper[8606]: I1204 22:05:43.860119 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:43.866797 master-0 kubenswrapper[8606]: I1204 22:05:43.866349 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh"] Dec 04 22:05:43.868028 master-0 kubenswrapper[8606]: I1204 22:05:43.867961 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 04 22:05:43.868378 master-0 kubenswrapper[8606]: I1204 22:05:43.868344 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 04 22:05:43.868461 master-0 kubenswrapper[8606]: I1204 22:05:43.868431 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8rl6s" Dec 04 22:05:43.869669 master-0 kubenswrapper[8606]: I1204 22:05:43.869652 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 04 22:05:43.926862 master-0 kubenswrapper[8606]: I1204 22:05:43.926352 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:43.926862 master-0 kubenswrapper[8606]: I1204 22:05:43.926416 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkg8s\" (UniqueName: \"kubernetes.io/projected/6684358b-d7a6-4396-9b4f-ea67d85e4517-kube-api-access-qkg8s\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:43.926862 master-0 kubenswrapper[8606]: I1204 22:05:43.926471 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:43.926862 master-0 kubenswrapper[8606]: I1204 22:05:43.926568 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:43.944126 master-0 kubenswrapper[8606]: I1204 22:05:43.944048 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:43.944126 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:43.944126 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:43.944126 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:43.944830 master-0 
kubenswrapper[8606]: I1204 22:05:43.944128 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:44.027976 master-0 kubenswrapper[8606]: I1204 22:05:44.027908 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.028194 master-0 kubenswrapper[8606]: I1204 22:05:44.028135 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.028194 master-0 kubenswrapper[8606]: I1204 22:05:44.028177 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkg8s\" (UniqueName: \"kubernetes.io/projected/6684358b-d7a6-4396-9b4f-ea67d85e4517-kube-api-access-qkg8s\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.028258 master-0 kubenswrapper[8606]: I1204 22:05:44.028221 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.029344 master-0 kubenswrapper[8606]: I1204 22:05:44.029302 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.031779 master-0 kubenswrapper[8606]: I1204 22:05:44.031712 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.034246 master-0 kubenswrapper[8606]: I1204 22:05:44.034205 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.048599 master-0 kubenswrapper[8606]: I1204 22:05:44.048494 8606 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qkg8s\" (UniqueName: \"kubernetes.io/projected/6684358b-d7a6-4396-9b4f-ea67d85e4517-kube-api-access-qkg8s\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.190356 master-0 kubenswrapper[8606]: I1204 22:05:44.190176 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:05:44.263084 master-0 kubenswrapper[8606]: I1204 22:05:44.262939 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" event={"ID":"da6da420-9631-4bce-b238-96ab361e23e9","Type":"ContainerDied","Data":"27e285c843ede4a182c19163bdbdbd9ef215a975e244f28e8e13fb302c8fae17"} Dec 04 22:05:44.263084 master-0 kubenswrapper[8606]: I1204 22:05:44.263065 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27e285c843ede4a182c19163bdbdbd9ef215a975e244f28e8e13fb302c8fae17" Dec 04 22:05:44.263366 master-0 kubenswrapper[8606]: I1204 22:05:44.263228 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:05:44.685173 master-0 kubenswrapper[8606]: I1204 22:05:44.685117 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh"] Dec 04 22:05:44.688924 master-0 kubenswrapper[8606]: W1204 22:05:44.688879 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6684358b_d7a6_4396_9b4f_ea67d85e4517.slice/crio-de08a4b22951aecb57c029d3a74e638dc3b7212560569cf21e54820113aad20f WatchSource:0}: Error finding container de08a4b22951aecb57c029d3a74e638dc3b7212560569cf21e54820113aad20f: Status 404 returned error can't find the container with id de08a4b22951aecb57c029d3a74e638dc3b7212560569cf21e54820113aad20f Dec 04 22:05:44.944630 master-0 kubenswrapper[8606]: I1204 22:05:44.944359 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:44.944630 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:44.944630 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:44.944630 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:44.944630 master-0 kubenswrapper[8606]: I1204 22:05:44.944488 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:45.275201 master-0 kubenswrapper[8606]: I1204 22:05:45.275130 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" event={"ID":"6684358b-d7a6-4396-9b4f-ea67d85e4517","Type":"ContainerStarted","Data":"de08a4b22951aecb57c029d3a74e638dc3b7212560569cf21e54820113aad20f"} Dec 04 22:05:45.945230 master-0 kubenswrapper[8606]: I1204 22:05:45.944995 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:45.945230 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:45.945230 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:45.945230 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:45.945230 master-0 kubenswrapper[8606]: I1204 22:05:45.945109 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:46.944938 master-0 kubenswrapper[8606]: I1204 22:05:46.944808 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:46.944938 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:46.944938 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:46.944938 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:46.946251 master-0 kubenswrapper[8606]: I1204 22:05:46.944960 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:47.945545 master-0 kubenswrapper[8606]: I1204 22:05:47.945415 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:47.945545 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:47.945545 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:47.945545 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:47.945545 master-0 kubenswrapper[8606]: I1204 22:05:47.945536 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:48.945070 master-0 kubenswrapper[8606]: I1204 22:05:48.944886 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:05:48.947804 master-0 kubenswrapper[8606]: I1204 22:05:48.947729 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:48.947804 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:48.947804 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:48.947804 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:48.948388 master-0 kubenswrapper[8606]: I1204 22:05:48.947830 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 
22:05:49.943932 master-0 kubenswrapper[8606]: I1204 22:05:49.943816 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:49.943932 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:49.943932 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:49.943932 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:49.943932 master-0 kubenswrapper[8606]: I1204 22:05:49.943900 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:50.943778 master-0 kubenswrapper[8606]: I1204 22:05:50.943657 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:50.943778 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:50.943778 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:50.943778 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:50.943778 master-0 kubenswrapper[8606]: I1204 22:05:50.943736 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:51.943257 master-0 kubenswrapper[8606]: I1204 22:05:51.943172 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:51.943257 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:51.943257 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:51.943257 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:51.943602 master-0 kubenswrapper[8606]: I1204 22:05:51.943268 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:52.944321 master-0 kubenswrapper[8606]: I1204 22:05:52.944228 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:52.944321 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:52.944321 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:52.944321 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:52.945486 master-0 kubenswrapper[8606]: I1204 22:05:52.945029 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
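
Context for the surrounding entries: the kubelet's startup probe for container "router" in pod openshift-ingress/router-default-5465c8b4db-8vm66 is polling an aggregated healthz endpoint roughly once per second and getting HTTP 500 back; the logged response body shows the backend-http and has-synced sub-checks failing ([-]) while process-running is ok ([+]), so each attempt is reported as probeResult="failure". The short Go sketch below only illustrates how such an aggregated healthz response can be fetched and read; the URL, port, and file name are assumptions for illustration and are not taken from this log.

// healthzcheck.go: fetch an aggregated healthz endpoint and report failing sub-checks.
// The endpoint below is a hypothetical example, not taken from this log.
package main

import (
	"bufio"
	"fmt"
	"net/http"
	"os"
	"strings"
	"time"
)

func main() {
	url := "http://127.0.0.1:1936/healthz" // assumed health endpoint for illustration
	client := &http.Client{Timeout: 2 * time.Second}

	resp, err := client.Get(url)
	if err != nil {
		fmt.Fprintln(os.Stderr, "probe error:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()

	// A kubelet HTTP probe counts any status outside 200-399 as a failure,
	// which is why the 500 responses above are logged as probe failures.
	fmt.Println("status:", resp.StatusCode)

	// The body lists one sub-check per line, "[+]name ok" or "[-]name failed".
	sc := bufio.NewScanner(resp.Body)
	for sc.Scan() {
		line := sc.Text()
		if strings.HasPrefix(line, "[-]") {
			fmt.Println("failing check:", line)
		}
	}
}

Against a healthy endpoint this prints status 200 and no failing checks; against the state logged above, the same kind of check would see status 500 plus the [-]backend-http and [-]has-synced lines.
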
Dec 04 22:05:53.944596 master-0 kubenswrapper[8606]: I1204 22:05:53.944486 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:53.944596 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:53.944596 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:53.944596 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:53.944596 master-0 kubenswrapper[8606]: I1204 22:05:53.944642 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:54.345003 master-0 kubenswrapper[8606]: I1204 22:05:54.344935 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" event={"ID":"6684358b-d7a6-4396-9b4f-ea67d85e4517","Type":"ContainerStarted","Data":"d806d9899e8fa078727408a39e9114b2d7cbb567d62907d0beaaea2425600e9f"} Dec 04 22:05:54.944261 master-0 kubenswrapper[8606]: I1204 22:05:54.944151 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:54.944261 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:54.944261 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:54.944261 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:54.944709 master-0 kubenswrapper[8606]: I1204 22:05:54.944291 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:55.357169 master-0 kubenswrapper[8606]: I1204 22:05:55.357086 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" event={"ID":"6684358b-d7a6-4396-9b4f-ea67d85e4517","Type":"ContainerStarted","Data":"f4a6e4c5c5359ab21e9030099147343225d4aeaef29fb463a8e6710e457570df"} Dec 04 22:05:55.383682 master-0 kubenswrapper[8606]: I1204 22:05:55.383599 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" podStartSLOduration=2.957835614 podStartE2EDuration="12.383581949s" podCreationTimestamp="2025-12-04 22:05:43 +0000 UTC" firstStartedPulling="2025-12-04 22:05:44.693193119 +0000 UTC m=+309.503495334" lastFinishedPulling="2025-12-04 22:05:54.118939464 +0000 UTC m=+318.929241669" observedRunningTime="2025-12-04 22:05:55.38222789 +0000 UTC m=+320.192530095" watchObservedRunningTime="2025-12-04 22:05:55.383581949 +0000 UTC m=+320.193884164" Dec 04 22:05:55.944386 master-0 kubenswrapper[8606]: I1204 22:05:55.944273 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:55.944386 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 
22:05:55.944386 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:55.944386 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:55.944984 master-0 kubenswrapper[8606]: I1204 22:05:55.944391 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:56.944688 master-0 kubenswrapper[8606]: I1204 22:05:56.944614 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:56.944688 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:56.944688 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:56.944688 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:56.945347 master-0 kubenswrapper[8606]: I1204 22:05:56.944732 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:56.965408 master-0 kubenswrapper[8606]: I1204 22:05:56.965362 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-p5qlk"] Dec 04 22:05:56.967161 master-0 kubenswrapper[8606]: I1204 22:05:56.967137 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:56.969751 master-0 kubenswrapper[8606]: I1204 22:05:56.969686 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 04 22:05:56.969867 master-0 kubenswrapper[8606]: I1204 22:05:56.969726 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-cmdvd" Dec 04 22:05:56.969867 master-0 kubenswrapper[8606]: I1204 22:05:56.969735 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 04 22:05:56.980163 master-0 kubenswrapper[8606]: I1204 22:05:56.980071 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq"] Dec 04 22:05:56.982061 master-0 kubenswrapper[8606]: I1204 22:05:56.981851 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:56.985728 master-0 kubenswrapper[8606]: I1204 22:05:56.985670 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-tm5gx" Dec 04 22:05:56.989634 master-0 kubenswrapper[8606]: I1204 22:05:56.985926 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 04 22:05:56.989634 master-0 kubenswrapper[8606]: I1204 22:05:56.986061 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 04 22:05:57.002748 master-0 kubenswrapper[8606]: I1204 22:05:56.999948 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq"] Dec 04 22:05:57.009412 master-0 kubenswrapper[8606]: I1204 22:05:57.009337 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-5857974f64-qqxk9"] Dec 04 22:05:57.014517 master-0 kubenswrapper[8606]: I1204 22:05:57.013445 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.018560 master-0 kubenswrapper[8606]: I1204 22:05:57.015813 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 04 22:05:57.018560 master-0 kubenswrapper[8606]: I1204 22:05:57.016023 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 04 22:05:57.018560 master-0 kubenswrapper[8606]: I1204 22:05:57.016132 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 04 22:05:57.018560 master-0 kubenswrapper[8606]: I1204 22:05:57.016148 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-5kj7n" Dec 04 22:05:57.024864 master-0 kubenswrapper[8606]: I1204 22:05:57.024804 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-5857974f64-qqxk9"] Dec 04 22:05:57.059139 master-0 kubenswrapper[8606]: I1204 22:05:57.059091 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.059357 master-0 kubenswrapper[8606]: I1204 22:05:57.059150 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.059357 master-0 kubenswrapper[8606]: I1204 22:05:57.059192 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-textfile\") pod \"node-exporter-p5qlk\" (UID: 
\"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.059441 master-0 kubenswrapper[8606]: I1204 22:05:57.059328 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.059487 master-0 kubenswrapper[8606]: I1204 22:05:57.059438 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.059558 master-0 kubenswrapper[8606]: I1204 22:05:57.059536 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smtnh\" (UniqueName: \"kubernetes.io/projected/17912746-74eb-4c78-8c1b-2f66e7ce4299-kube-api-access-smtnh\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.059623 master-0 kubenswrapper[8606]: I1204 22:05:57.059589 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.059678 master-0 kubenswrapper[8606]: I1204 22:05:57.059664 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-sys\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.060200 master-0 kubenswrapper[8606]: I1204 22:05:57.060151 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.060365 master-0 kubenswrapper[8606]: I1204 22:05:57.060336 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9z4k\" (UniqueName: \"kubernetes.io/projected/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-api-access-b9z4k\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.060446 master-0 kubenswrapper[8606]: I1204 22:05:57.060395 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-wtmp\") pod 
\"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.060446 master-0 kubenswrapper[8606]: I1204 22:05:57.060433 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.060587 master-0 kubenswrapper[8606]: I1204 22:05:57.060554 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.060780 master-0 kubenswrapper[8606]: I1204 22:05:57.060724 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-root\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.060832 master-0 kubenswrapper[8606]: I1204 22:05:57.060802 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.060890 master-0 kubenswrapper[8606]: I1204 22:05:57.060842 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.060925 master-0 kubenswrapper[8606]: I1204 22:05:57.060889 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.060994 master-0 kubenswrapper[8606]: I1204 22:05:57.060965 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jmv\" (UniqueName: \"kubernetes.io/projected/0a726f44-a509-46b3-a6d5-70afe3b55e9f-kube-api-access-k8jmv\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.162071 master-0 kubenswrapper[8606]: I1204 22:05:57.162007 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.162071 master-0 kubenswrapper[8606]: I1204 22:05:57.162076 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smtnh\" (UniqueName: \"kubernetes.io/projected/17912746-74eb-4c78-8c1b-2f66e7ce4299-kube-api-access-smtnh\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162109 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162137 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-sys\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162159 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162189 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9z4k\" (UniqueName: \"kubernetes.io/projected/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-api-access-b9z4k\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162213 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-wtmp\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162235 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162263 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162295 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-root\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.162334 master-0 kubenswrapper[8606]: I1204 22:05:57.162318 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.162802 master-0 kubenswrapper[8606]: I1204 22:05:57.162343 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.162802 master-0 kubenswrapper[8606]: I1204 22:05:57.162369 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.162802 master-0 kubenswrapper[8606]: I1204 22:05:57.162403 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jmv\" (UniqueName: \"kubernetes.io/projected/0a726f44-a509-46b3-a6d5-70afe3b55e9f-kube-api-access-k8jmv\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.162802 master-0 kubenswrapper[8606]: I1204 22:05:57.162449 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.162802 master-0 kubenswrapper[8606]: I1204 22:05:57.162478 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.163113 master-0 kubenswrapper[8606]: I1204 22:05:57.163048 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-wtmp\") pod \"node-exporter-p5qlk\" (UID: 
\"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.163665 master-0 kubenswrapper[8606]: I1204 22:05:57.163638 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.163775 master-0 kubenswrapper[8606]: I1204 22:05:57.163699 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.163866 master-0 kubenswrapper[8606]: I1204 22:05:57.163838 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.163926 master-0 kubenswrapper[8606]: I1204 22:05:57.163904 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-textfile\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.164377 master-0 kubenswrapper[8606]: I1204 22:05:57.164348 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-root\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.164449 master-0 kubenswrapper[8606]: I1204 22:05:57.164370 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.164585 master-0 kubenswrapper[8606]: I1204 22:05:57.164528 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-textfile\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.164728 master-0 kubenswrapper[8606]: I1204 22:05:57.164630 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-sys\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.165737 master-0 kubenswrapper[8606]: I1204 22:05:57.165012 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.166300 master-0 kubenswrapper[8606]: I1204 22:05:57.166247 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.167894 master-0 kubenswrapper[8606]: I1204 22:05:57.167162 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.167894 master-0 kubenswrapper[8606]: I1204 22:05:57.167206 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.167894 master-0 kubenswrapper[8606]: I1204 22:05:57.167669 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.168546 master-0 kubenswrapper[8606]: I1204 22:05:57.168521 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.168846 master-0 kubenswrapper[8606]: I1204 22:05:57.168735 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.171606 master-0 kubenswrapper[8606]: I1204 22:05:57.171551 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.181371 master-0 kubenswrapper[8606]: I1204 22:05:57.181320 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k8jmv\" (UniqueName: \"kubernetes.io/projected/0a726f44-a509-46b3-a6d5-70afe3b55e9f-kube-api-access-k8jmv\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.184949 master-0 kubenswrapper[8606]: I1204 22:05:57.184887 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smtnh\" (UniqueName: \"kubernetes.io/projected/17912746-74eb-4c78-8c1b-2f66e7ce4299-kube-api-access-smtnh\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.186769 master-0 kubenswrapper[8606]: I1204 22:05:57.186711 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9z4k\" (UniqueName: \"kubernetes.io/projected/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-api-access-b9z4k\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.292401 master-0 kubenswrapper[8606]: I1204 22:05:57.292347 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:05:57.342099 master-0 kubenswrapper[8606]: I1204 22:05:57.342068 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:05:57.365948 master-0 kubenswrapper[8606]: I1204 22:05:57.365881 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:05:57.372760 master-0 kubenswrapper[8606]: I1204 22:05:57.372555 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5qlk" event={"ID":"0a726f44-a509-46b3-a6d5-70afe3b55e9f","Type":"ContainerStarted","Data":"0c849ebda1ef05c2e7568afd8bbf5411d8e51e42f17fd972708d247af11d0983"} Dec 04 22:05:57.782525 master-0 kubenswrapper[8606]: I1204 22:05:57.782450 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq"] Dec 04 22:05:57.828411 master-0 kubenswrapper[8606]: I1204 22:05:57.828344 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-5857974f64-qqxk9"] Dec 04 22:05:57.835965 master-0 kubenswrapper[8606]: W1204 22:05:57.835915 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d84a7d3_46d1_48e3_83f3_f6b32f16cc76.slice/crio-857d5516010228f43de819bef01594930c5dd807f2fd4da2fb1509f797fa1774 WatchSource:0}: Error finding container 857d5516010228f43de819bef01594930c5dd807f2fd4da2fb1509f797fa1774: Status 404 returned error can't find the container with id 857d5516010228f43de819bef01594930c5dd807f2fd4da2fb1509f797fa1774 Dec 04 22:05:57.944314 master-0 kubenswrapper[8606]: I1204 22:05:57.944148 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:57.944314 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:57.944314 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:57.944314 master-0 
kubenswrapper[8606]: healthz check failed Dec 04 22:05:57.944314 master-0 kubenswrapper[8606]: I1204 22:05:57.944249 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:58.394482 master-0 kubenswrapper[8606]: I1204 22:05:58.394211 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" event={"ID":"17912746-74eb-4c78-8c1b-2f66e7ce4299","Type":"ContainerStarted","Data":"62bb8fd14dd9c2077e5122579d140ab222e13a81de3357a0f0b9c3f9b8580e24"} Dec 04 22:05:58.394482 master-0 kubenswrapper[8606]: I1204 22:05:58.394271 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" event={"ID":"17912746-74eb-4c78-8c1b-2f66e7ce4299","Type":"ContainerStarted","Data":"ab56dbe7257c4c7482b150a1ba0d82ac0c93f28c32d4b4b263e8fd93ae1aee0c"} Dec 04 22:05:58.394482 master-0 kubenswrapper[8606]: I1204 22:05:58.394286 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" event={"ID":"17912746-74eb-4c78-8c1b-2f66e7ce4299","Type":"ContainerStarted","Data":"73496f020ec19048256b7ee616b5604b8f6faef21ddc2795a2639ad6cafa0a2c"} Dec 04 22:05:58.396392 master-0 kubenswrapper[8606]: I1204 22:05:58.396349 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" event={"ID":"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76","Type":"ContainerStarted","Data":"857d5516010228f43de819bef01594930c5dd807f2fd4da2fb1509f797fa1774"} Dec 04 22:05:58.945660 master-0 kubenswrapper[8606]: I1204 22:05:58.944018 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:05:58.945660 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:58.945660 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:58.945660 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:58.945660 master-0 kubenswrapper[8606]: I1204 22:05:58.944130 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:05:59.406943 master-0 kubenswrapper[8606]: I1204 22:05:59.406861 8606 generic.go:334] "Generic (PLEG): container finished" podID="0a726f44-a509-46b3-a6d5-70afe3b55e9f" containerID="b1d3f0ea9fb633db12f795b3c197259244e72196814e421d282a1fe412cb79f2" exitCode=0 Dec 04 22:05:59.407153 master-0 kubenswrapper[8606]: I1204 22:05:59.406941 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5qlk" event={"ID":"0a726f44-a509-46b3-a6d5-70afe3b55e9f","Type":"ContainerDied","Data":"b1d3f0ea9fb633db12f795b3c197259244e72196814e421d282a1fe412cb79f2"} Dec 04 22:05:59.943256 master-0 kubenswrapper[8606]: I1204 22:05:59.943210 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 04 22:05:59.943256 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:05:59.943256 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:05:59.943256 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:05:59.943426 master-0 kubenswrapper[8606]: I1204 22:05:59.943287 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:00.414445 master-0 kubenswrapper[8606]: I1204 22:06:00.414262 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" event={"ID":"17912746-74eb-4c78-8c1b-2f66e7ce4299","Type":"ContainerStarted","Data":"81f6c807c7b5b7b7589f341741012990ba8bc408248c52e232edf6a36c144642"} Dec 04 22:06:00.416807 master-0 kubenswrapper[8606]: I1204 22:06:00.416763 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" event={"ID":"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76","Type":"ContainerStarted","Data":"b0b856c01858e3a541b23da67afa5b732b7e863db4e3256d48d200dfe4e813a1"} Dec 04 22:06:00.416874 master-0 kubenswrapper[8606]: I1204 22:06:00.416825 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" event={"ID":"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76","Type":"ContainerStarted","Data":"e45c3db8e761eb5b44659f9feeda0856ca624c4d5c1890015c38703f5a40670b"} Dec 04 22:06:00.416910 master-0 kubenswrapper[8606]: I1204 22:06:00.416870 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" event={"ID":"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76","Type":"ContainerStarted","Data":"f52cf2210341d466000caf1944f36a1a658725324f859c977eae23b9f624b896"} Dec 04 22:06:00.418894 master-0 kubenswrapper[8606]: I1204 22:06:00.418808 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5qlk" event={"ID":"0a726f44-a509-46b3-a6d5-70afe3b55e9f","Type":"ContainerStarted","Data":"545625b842711fdda5eaa303742dcd8f82ccc1ed17d0148b2d986d425a02efdb"} Dec 04 22:06:00.418979 master-0 kubenswrapper[8606]: I1204 22:06:00.418908 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5qlk" event={"ID":"0a726f44-a509-46b3-a6d5-70afe3b55e9f","Type":"ContainerStarted","Data":"36fa51be80da1679d1224935a888d5f59bb5a385358c5ced2fca2235368c4bfe"} Dec 04 22:06:00.453609 master-0 kubenswrapper[8606]: I1204 22:06:00.451469 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" podStartSLOduration=3.185456611 podStartE2EDuration="4.451441876s" podCreationTimestamp="2025-12-04 22:05:56 +0000 UTC" firstStartedPulling="2025-12-04 22:05:58.085754648 +0000 UTC m=+322.896056863" lastFinishedPulling="2025-12-04 22:05:59.351739873 +0000 UTC m=+324.162042128" observedRunningTime="2025-12-04 22:06:00.447708927 +0000 UTC m=+325.258011192" watchObservedRunningTime="2025-12-04 22:06:00.451441876 +0000 UTC m=+325.261744091" Dec 04 22:06:00.480981 master-0 kubenswrapper[8606]: I1204 22:06:00.480847 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-p5qlk" podStartSLOduration=3.320962507 podStartE2EDuration="4.480808161s" 
podCreationTimestamp="2025-12-04 22:05:56 +0000 UTC" firstStartedPulling="2025-12-04 22:05:57.351731783 +0000 UTC m=+322.162033998" lastFinishedPulling="2025-12-04 22:05:58.511577417 +0000 UTC m=+323.321879652" observedRunningTime="2025-12-04 22:06:00.475669751 +0000 UTC m=+325.285971986" watchObservedRunningTime="2025-12-04 22:06:00.480808161 +0000 UTC m=+325.291110396" Dec 04 22:06:00.511751 master-0 kubenswrapper[8606]: I1204 22:06:00.511371 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" podStartSLOduration=3.006687165 podStartE2EDuration="4.511349481s" podCreationTimestamp="2025-12-04 22:05:56 +0000 UTC" firstStartedPulling="2025-12-04 22:05:57.839806545 +0000 UTC m=+322.650108760" lastFinishedPulling="2025-12-04 22:05:59.344468841 +0000 UTC m=+324.154771076" observedRunningTime="2025-12-04 22:06:00.510138735 +0000 UTC m=+325.320440950" watchObservedRunningTime="2025-12-04 22:06:00.511349481 +0000 UTC m=+325.321651696" Dec 04 22:06:00.944791 master-0 kubenswrapper[8606]: I1204 22:06:00.944705 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:00.944791 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:00.944791 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:00.944791 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:00.945615 master-0 kubenswrapper[8606]: I1204 22:06:00.944793 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:01.945853 master-0 kubenswrapper[8606]: I1204 22:06:01.945781 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:01.945853 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:01.945853 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:01.945853 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:01.946568 master-0 kubenswrapper[8606]: I1204 22:06:01.945872 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:02.943305 master-0 kubenswrapper[8606]: I1204 22:06:02.943215 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:02.943305 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:02.943305 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:02.943305 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:02.943780 master-0 kubenswrapper[8606]: I1204 22:06:02.943309 8606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:03.244237 master-0 kubenswrapper[8606]: I1204 22:06:03.243541 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-55c77559c8-g74sm"] Dec 04 22:06:03.245002 master-0 kubenswrapper[8606]: I1204 22:06:03.244402 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.249083 master-0 kubenswrapper[8606]: I1204 22:06:03.248766 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 04 22:06:03.249083 master-0 kubenswrapper[8606]: I1204 22:06:03.249037 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-bvrgs" Dec 04 22:06:03.249313 master-0 kubenswrapper[8606]: I1204 22:06:03.249289 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 04 22:06:03.249475 master-0 kubenswrapper[8606]: I1204 22:06:03.249442 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 04 22:06:03.253169 master-0 kubenswrapper[8606]: I1204 22:06:03.253115 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 04 22:06:03.253644 master-0 kubenswrapper[8606]: I1204 22:06:03.253547 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-3h94rftr47kot" Dec 04 22:06:03.265864 master-0 kubenswrapper[8606]: I1204 22:06:03.265760 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55c77559c8-g74sm"] Dec 04 22:06:03.284719 master-0 kubenswrapper[8606]: I1204 22:06:03.284648 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.285005 master-0 kubenswrapper[8606]: I1204 22:06:03.284737 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.285005 master-0 kubenswrapper[8606]: I1204 22:06:03.284774 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.285005 master-0 kubenswrapper[8606]: I1204 22:06:03.284818 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.285005 master-0 kubenswrapper[8606]: I1204 22:06:03.284862 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.285005 master-0 kubenswrapper[8606]: I1204 22:06:03.284907 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.285005 master-0 kubenswrapper[8606]: I1204 22:06:03.284952 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w82st\" (UniqueName: \"kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.386633 master-0 kubenswrapper[8606]: I1204 22:06:03.386560 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.386633 master-0 kubenswrapper[8606]: I1204 22:06:03.386634 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.387165 master-0 kubenswrapper[8606]: I1204 22:06:03.386878 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.387165 master-0 kubenswrapper[8606]: I1204 22:06:03.387065 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.387165 master-0 kubenswrapper[8606]: I1204 22:06:03.387159 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.387352 master-0 kubenswrapper[8606]: I1204 22:06:03.387248 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w82st\" (UniqueName: \"kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.387352 master-0 kubenswrapper[8606]: I1204 22:06:03.387337 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.388015 master-0 kubenswrapper[8606]: I1204 22:06:03.387973 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.388254 master-0 kubenswrapper[8606]: I1204 22:06:03.388215 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.388579 master-0 kubenswrapper[8606]: I1204 22:06:03.388544 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.394052 master-0 kubenswrapper[8606]: I1204 22:06:03.393986 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.405441 master-0 kubenswrapper[8606]: I1204 22:06:03.405373 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.405634 master-0 kubenswrapper[8606]: I1204 22:06:03.405487 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") pod 
\"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.418377 master-0 kubenswrapper[8606]: I1204 22:06:03.418309 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w82st\" (UniqueName: \"kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.623939 master-0 kubenswrapper[8606]: I1204 22:06:03.623849 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:03.944831 master-0 kubenswrapper[8606]: I1204 22:06:03.944635 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:03.944831 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:03.944831 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:03.944831 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:03.944831 master-0 kubenswrapper[8606]: I1204 22:06:03.944735 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:04.069217 master-0 kubenswrapper[8606]: I1204 22:06:04.068980 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55c77559c8-g74sm"] Dec 04 22:06:04.074419 master-0 kubenswrapper[8606]: W1204 22:06:04.074328 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72679051_6a4b_4991_85c4_e5d2cbbc6ed7.slice/crio-8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe WatchSource:0}: Error finding container 8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe: Status 404 returned error can't find the container with id 8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe Dec 04 22:06:04.450441 master-0 kubenswrapper[8606]: I1204 22:06:04.450353 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" event={"ID":"72679051-6a4b-4991-85c4-e5d2cbbc6ed7","Type":"ContainerStarted","Data":"8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe"} Dec 04 22:06:04.944722 master-0 kubenswrapper[8606]: I1204 22:06:04.944625 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:04.944722 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:04.944722 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:04.944722 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:04.945381 master-0 kubenswrapper[8606]: I1204 22:06:04.944738 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:05.944773 master-0 kubenswrapper[8606]: I1204 22:06:05.944553 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:05.944773 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:05.944773 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:05.944773 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:05.944773 master-0 kubenswrapper[8606]: I1204 22:06:05.944687 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:06.479271 master-0 kubenswrapper[8606]: I1204 22:06:06.479206 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" event={"ID":"72679051-6a4b-4991-85c4-e5d2cbbc6ed7","Type":"ContainerStarted","Data":"8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c"} Dec 04 22:06:06.513585 master-0 kubenswrapper[8606]: I1204 22:06:06.513435 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" podStartSLOduration=2.01825702 podStartE2EDuration="3.513400538s" podCreationTimestamp="2025-12-04 22:06:03 +0000 UTC" firstStartedPulling="2025-12-04 22:06:04.080631727 +0000 UTC m=+328.890933972" lastFinishedPulling="2025-12-04 22:06:05.575775275 +0000 UTC m=+330.386077490" observedRunningTime="2025-12-04 22:06:06.506878188 +0000 UTC m=+331.317180433" watchObservedRunningTime="2025-12-04 22:06:06.513400538 +0000 UTC m=+331.323702793" Dec 04 22:06:06.944604 master-0 kubenswrapper[8606]: I1204 22:06:06.944482 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:06.944604 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:06.944604 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:06.944604 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:06.945976 master-0 kubenswrapper[8606]: I1204 22:06:06.944620 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:07.944200 master-0 kubenswrapper[8606]: I1204 22:06:07.944074 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:07.944200 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:07.944200 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:07.944200 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:07.944665 master-0 kubenswrapper[8606]: I1204 22:06:07.944197 8606 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:08.945232 master-0 kubenswrapper[8606]: I1204 22:06:08.945086 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:08.945232 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:08.945232 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:08.945232 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:08.947054 master-0 kubenswrapper[8606]: I1204 22:06:08.945230 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:09.945221 master-0 kubenswrapper[8606]: I1204 22:06:09.945097 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:09.945221 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:09.945221 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:09.945221 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:09.946402 master-0 kubenswrapper[8606]: I1204 22:06:09.945235 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:10.944493 master-0 kubenswrapper[8606]: I1204 22:06:10.944378 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:10.944493 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:10.944493 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:10.944493 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:10.944493 master-0 kubenswrapper[8606]: I1204 22:06:10.944474 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:11.944155 master-0 kubenswrapper[8606]: I1204 22:06:11.944078 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:11.944155 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:11.944155 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:11.944155 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:11.944743 master-0 kubenswrapper[8606]: I1204 22:06:11.944165 8606 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:12.947345 master-0 kubenswrapper[8606]: I1204 22:06:12.947229 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:12.947345 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:12.947345 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:12.947345 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:12.948021 master-0 kubenswrapper[8606]: I1204 22:06:12.947376 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:13.944848 master-0 kubenswrapper[8606]: I1204 22:06:13.944716 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:13.944848 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:13.944848 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:13.944848 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:13.945350 master-0 kubenswrapper[8606]: I1204 22:06:13.944868 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:14.943910 master-0 kubenswrapper[8606]: I1204 22:06:14.943805 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:14.943910 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:14.943910 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:14.943910 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:14.945168 master-0 kubenswrapper[8606]: I1204 22:06:14.943919 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:15.944662 master-0 kubenswrapper[8606]: I1204 22:06:15.944463 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:15.944662 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:15.944662 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:15.944662 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:15.944662 master-0 kubenswrapper[8606]: I1204 22:06:15.944616 8606 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:16.944973 master-0 kubenswrapper[8606]: I1204 22:06:16.944865 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:16.944973 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:16.944973 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:16.944973 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:16.945950 master-0 kubenswrapper[8606]: I1204 22:06:16.944980 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:17.945144 master-0 kubenswrapper[8606]: I1204 22:06:17.945002 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:17.945144 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:17.945144 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:17.945144 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:17.946159 master-0 kubenswrapper[8606]: I1204 22:06:17.945155 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:18.944790 master-0 kubenswrapper[8606]: I1204 22:06:18.944671 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:18.944790 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:18.944790 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:18.944790 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:18.944790 master-0 kubenswrapper[8606]: I1204 22:06:18.944793 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:19.944452 master-0 kubenswrapper[8606]: I1204 22:06:19.944243 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:19.944452 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:19.944452 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:19.944452 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:19.944452 master-0 kubenswrapper[8606]: I1204 22:06:19.944327 8606 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:20.943834 master-0 kubenswrapper[8606]: I1204 22:06:20.943693 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:20.943834 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:20.943834 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:20.943834 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:20.943834 master-0 kubenswrapper[8606]: I1204 22:06:20.943782 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:21.944291 master-0 kubenswrapper[8606]: I1204 22:06:21.944133 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:21.944291 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:21.944291 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:21.944291 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:21.945468 master-0 kubenswrapper[8606]: I1204 22:06:21.944309 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:22.944556 master-0 kubenswrapper[8606]: I1204 22:06:22.944444 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:22.944556 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:22.944556 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:22.944556 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:22.945555 master-0 kubenswrapper[8606]: I1204 22:06:22.944559 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:23.624448 master-0 kubenswrapper[8606]: I1204 22:06:23.624357 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:23.624448 master-0 kubenswrapper[8606]: I1204 22:06:23.624457 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:23.943760 master-0 kubenswrapper[8606]: I1204 22:06:23.943478 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:23.943760 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:23.943760 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:23.943760 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:23.943760 master-0 kubenswrapper[8606]: I1204 22:06:23.943588 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:24.944806 master-0 kubenswrapper[8606]: I1204 22:06:24.944730 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:24.944806 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:24.944806 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:24.944806 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:24.945477 master-0 kubenswrapper[8606]: I1204 22:06:24.944831 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:25.945130 master-0 kubenswrapper[8606]: I1204 22:06:25.944954 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:25.945130 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:25.945130 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:25.945130 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:25.945130 master-0 kubenswrapper[8606]: I1204 22:06:25.945067 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:26.945210 master-0 kubenswrapper[8606]: I1204 22:06:26.945071 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:26.945210 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:26.945210 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:26.945210 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:26.945210 master-0 kubenswrapper[8606]: I1204 22:06:26.945192 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:27.944155 master-0 kubenswrapper[8606]: I1204 22:06:27.943908 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:27.944155 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:27.944155 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:27.944155 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:27.944155 master-0 kubenswrapper[8606]: I1204 22:06:27.944054 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:28.946288 master-0 kubenswrapper[8606]: I1204 22:06:28.946190 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:28.946288 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:28.946288 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:28.946288 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:28.947275 master-0 kubenswrapper[8606]: I1204 22:06:28.946301 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:29.944994 master-0 kubenswrapper[8606]: I1204 22:06:29.944855 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:29.944994 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:29.944994 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:29.944994 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:29.944994 master-0 kubenswrapper[8606]: I1204 22:06:29.944971 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:30.943991 master-0 kubenswrapper[8606]: I1204 22:06:30.943879 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:30.943991 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:30.943991 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:30.943991 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:30.943991 master-0 kubenswrapper[8606]: I1204 22:06:30.943981 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:31.945863 master-0 kubenswrapper[8606]: I1204 22:06:31.945761 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:31.945863 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:31.945863 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:31.945863 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:31.946945 master-0 kubenswrapper[8606]: I1204 22:06:31.945883 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:32.944433 master-0 kubenswrapper[8606]: I1204 22:06:32.944338 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:32.944433 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:32.944433 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:32.944433 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:32.945132 master-0 kubenswrapper[8606]: I1204 22:06:32.944440 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:33.944253 master-0 kubenswrapper[8606]: I1204 22:06:33.944161 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:33.944253 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:33.944253 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:33.944253 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:33.945286 master-0 kubenswrapper[8606]: I1204 22:06:33.944254 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:34.943180 master-0 kubenswrapper[8606]: I1204 22:06:34.943111 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:34.943180 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:34.943180 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:34.943180 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:34.943624 master-0 kubenswrapper[8606]: I1204 22:06:34.943189 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:35.946933 master-0 kubenswrapper[8606]: I1204 22:06:35.945842 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:35.946933 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:35.946933 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:35.946933 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:35.947888 master-0 kubenswrapper[8606]: I1204 22:06:35.947766 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:36.945140 master-0 kubenswrapper[8606]: I1204 22:06:36.945061 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:36.945140 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:36.945140 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:36.945140 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:36.945816 master-0 kubenswrapper[8606]: I1204 22:06:36.945764 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:37.944560 master-0 kubenswrapper[8606]: I1204 22:06:37.944303 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:37.944560 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:37.944560 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:37.944560 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:37.944560 master-0 kubenswrapper[8606]: I1204 22:06:37.944417 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:38.944806 master-0 kubenswrapper[8606]: I1204 22:06:38.944720 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:38.944806 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:38.944806 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:38.944806 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:38.946118 master-0 kubenswrapper[8606]: I1204 22:06:38.945526 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:39.944250 master-0 kubenswrapper[8606]: I1204 22:06:39.944130 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:39.944250 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:39.944250 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:39.944250 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:39.944751 master-0 kubenswrapper[8606]: I1204 22:06:39.944275 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:40.945139 master-0 kubenswrapper[8606]: I1204 22:06:40.945020 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:40.945139 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:40.945139 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:40.945139 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:40.945139 master-0 kubenswrapper[8606]: I1204 22:06:40.945127 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:41.944793 master-0 kubenswrapper[8606]: I1204 22:06:41.944671 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:41.944793 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:41.944793 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:41.944793 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:41.944793 master-0 kubenswrapper[8606]: I1204 22:06:41.944779 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:42.944773 master-0 kubenswrapper[8606]: I1204 22:06:42.944679 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:42.944773 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:42.944773 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:42.944773 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:42.944773 master-0 kubenswrapper[8606]: I1204 22:06:42.944756 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:43.633319 master-0 kubenswrapper[8606]: I1204 22:06:43.633231 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:43.638444 master-0 kubenswrapper[8606]: I1204 22:06:43.638371 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:06:43.944289 master-0 kubenswrapper[8606]: I1204 22:06:43.944092 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:43.944289 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:43.944289 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:43.944289 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:43.944289 master-0 kubenswrapper[8606]: I1204 22:06:43.944210 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:44.270651 master-0 kubenswrapper[8606]: I1204 22:06:44.270612 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7cr8g"] Dec 04 22:06:44.271656 master-0 kubenswrapper[8606]: I1204 22:06:44.271639 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:06:44.274154 master-0 kubenswrapper[8606]: I1204 22:06:44.273958 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 22:06:44.274539 master-0 kubenswrapper[8606]: I1204 22:06:44.274521 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 22:06:44.274647 master-0 kubenswrapper[8606]: I1204 22:06:44.274611 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-wc4xq" Dec 04 22:06:44.274762 master-0 kubenswrapper[8606]: I1204 22:06:44.274528 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 22:06:44.287647 master-0 kubenswrapper[8606]: I1204 22:06:44.287579 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7cr8g"] Dec 04 22:06:44.372388 master-0 kubenswrapper[8606]: I1204 22:06:44.372299 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:06:44.372666 master-0 kubenswrapper[8606]: I1204 22:06:44.372537 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8gh2\" (UniqueName: \"kubernetes.io/projected/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-kube-api-access-n8gh2\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:06:44.474425 master-0 kubenswrapper[8606]: I1204 22:06:44.474311 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:06:44.474788 master-0 kubenswrapper[8606]: I1204 22:06:44.474541 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8gh2\" (UniqueName: \"kubernetes.io/projected/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-kube-api-access-n8gh2\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:06:44.485289 master-0 kubenswrapper[8606]: I1204 22:06:44.485216 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:06:44.501068 master-0 kubenswrapper[8606]: I1204 22:06:44.500993 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8gh2\" (UniqueName: \"kubernetes.io/projected/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-kube-api-access-n8gh2\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:06:44.600284 master-0 kubenswrapper[8606]: I1204 22:06:44.600087 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:06:44.772000 master-0 kubenswrapper[8606]: I1204 22:06:44.771953 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/1.log" Dec 04 22:06:44.774762 master-0 kubenswrapper[8606]: I1204 22:06:44.774743 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/0.log" Dec 04 22:06:44.774917 master-0 kubenswrapper[8606]: I1204 22:06:44.774892 8606 generic.go:334] "Generic (PLEG): container finished" podID="addddaac-a31a-4dbf-b78f-87225b11b463" containerID="6d94211dde773ea9f092db6ee8825549019f84f69e33c14265dafd1be2140e92" exitCode=1 Dec 04 22:06:44.775025 master-0 kubenswrapper[8606]: I1204 22:06:44.775004 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerDied","Data":"6d94211dde773ea9f092db6ee8825549019f84f69e33c14265dafd1be2140e92"} Dec 04 22:06:44.775129 master-0 kubenswrapper[8606]: I1204 22:06:44.775114 8606 scope.go:117] "RemoveContainer" containerID="f32a0325771ce40043e2990b6e044b2e673986f92037baf7df71e61135c7bd82" Dec 04 22:06:44.775945 master-0 kubenswrapper[8606]: I1204 22:06:44.775922 8606 scope.go:117] "RemoveContainer" containerID="6d94211dde773ea9f092db6ee8825549019f84f69e33c14265dafd1be2140e92" Dec 04 22:06:44.776310 master-0 kubenswrapper[8606]: E1204 22:06:44.776289 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" 
podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:06:44.947872 master-0 kubenswrapper[8606]: I1204 22:06:44.947811 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:44.947872 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:44.947872 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:44.947872 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:44.948173 master-0 kubenswrapper[8606]: I1204 22:06:44.947888 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:45.192006 master-0 kubenswrapper[8606]: I1204 22:06:45.191945 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7cr8g"] Dec 04 22:06:45.195918 master-0 kubenswrapper[8606]: W1204 22:06:45.195860 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod651c0fad_1577_4a7f_8718_ec2fd2f06c3e.slice/crio-28f1828f69f1c4d0d02aa7e938a400a27e1212d35ae3b194af92227cb1a24b54 WatchSource:0}: Error finding container 28f1828f69f1c4d0d02aa7e938a400a27e1212d35ae3b194af92227cb1a24b54: Status 404 returned error can't find the container with id 28f1828f69f1c4d0d02aa7e938a400a27e1212d35ae3b194af92227cb1a24b54 Dec 04 22:06:45.782818 master-0 kubenswrapper[8606]: I1204 22:06:45.782729 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/1.log" Dec 04 22:06:45.784909 master-0 kubenswrapper[8606]: I1204 22:06:45.784845 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7cr8g" event={"ID":"651c0fad-1577-4a7f-8718-ec2fd2f06c3e","Type":"ContainerStarted","Data":"2faf8b075190630ed17989d83d75b4c309209b6e0d5c61ff8a357ef81ff71f02"} Dec 04 22:06:45.784909 master-0 kubenswrapper[8606]: I1204 22:06:45.784880 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7cr8g" event={"ID":"651c0fad-1577-4a7f-8718-ec2fd2f06c3e","Type":"ContainerStarted","Data":"28f1828f69f1c4d0d02aa7e938a400a27e1212d35ae3b194af92227cb1a24b54"} Dec 04 22:06:45.807220 master-0 kubenswrapper[8606]: I1204 22:06:45.807134 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7cr8g" podStartSLOduration=1.807109617 podStartE2EDuration="1.807109617s" podCreationTimestamp="2025-12-04 22:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:06:45.802897843 +0000 UTC m=+370.613200058" watchObservedRunningTime="2025-12-04 22:06:45.807109617 +0000 UTC m=+370.617411832" Dec 04 22:06:45.944409 master-0 kubenswrapper[8606]: I1204 22:06:45.944259 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:45.944409 master-0 
kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:45.944409 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:45.944409 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:45.944409 master-0 kubenswrapper[8606]: I1204 22:06:45.944344 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:46.944676 master-0 kubenswrapper[8606]: I1204 22:06:46.944392 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:46.944676 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:46.944676 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:46.944676 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:46.945887 master-0 kubenswrapper[8606]: I1204 22:06:46.944719 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:47.945393 master-0 kubenswrapper[8606]: I1204 22:06:47.945262 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:47.945393 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:47.945393 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:47.945393 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:47.946403 master-0 kubenswrapper[8606]: I1204 22:06:47.945416 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:48.944902 master-0 kubenswrapper[8606]: I1204 22:06:48.944844 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:48.944902 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:48.944902 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:48.944902 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:48.945396 master-0 kubenswrapper[8606]: I1204 22:06:48.945361 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:49.945121 master-0 kubenswrapper[8606]: I1204 22:06:49.945061 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:49.945121 
master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:49.945121 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:49.945121 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:49.946580 master-0 kubenswrapper[8606]: I1204 22:06:49.946481 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:50.943963 master-0 kubenswrapper[8606]: I1204 22:06:50.943783 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:50.943963 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:50.943963 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:50.943963 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:50.944620 master-0 kubenswrapper[8606]: I1204 22:06:50.943994 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:51.943095 master-0 kubenswrapper[8606]: I1204 22:06:51.943028 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:51.943095 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:51.943095 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:51.943095 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:51.943685 master-0 kubenswrapper[8606]: I1204 22:06:51.943103 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:52.944227 master-0 kubenswrapper[8606]: I1204 22:06:52.944093 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:52.944227 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:52.944227 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:52.944227 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:52.944227 master-0 kubenswrapper[8606]: I1204 22:06:52.944197 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:53.944486 master-0 kubenswrapper[8606]: I1204 22:06:53.944397 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 
22:06:53.944486 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:53.944486 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:53.944486 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:53.945101 master-0 kubenswrapper[8606]: I1204 22:06:53.944538 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:54.945212 master-0 kubenswrapper[8606]: I1204 22:06:54.945124 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:54.945212 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:54.945212 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:54.945212 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:54.946592 master-0 kubenswrapper[8606]: I1204 22:06:54.945254 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:55.943578 master-0 kubenswrapper[8606]: I1204 22:06:55.943435 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:55.943578 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:55.943578 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:55.943578 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:55.943874 master-0 kubenswrapper[8606]: I1204 22:06:55.943583 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:56.944690 master-0 kubenswrapper[8606]: I1204 22:06:56.944589 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:56.944690 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:56.944690 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:56.944690 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:56.945829 master-0 kubenswrapper[8606]: I1204 22:06:56.945407 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:57.944694 master-0 kubenswrapper[8606]: I1204 22:06:57.944601 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld 
Dec 04 22:06:57.944694 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:57.944694 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:57.944694 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:57.945714 master-0 kubenswrapper[8606]: I1204 22:06:57.944730 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:58.391594 master-0 kubenswrapper[8606]: I1204 22:06:58.391519 8606 scope.go:117] "RemoveContainer" containerID="6d94211dde773ea9f092db6ee8825549019f84f69e33c14265dafd1be2140e92" Dec 04 22:06:58.900100 master-0 kubenswrapper[8606]: I1204 22:06:58.900048 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/1.log" Dec 04 22:06:58.900431 master-0 kubenswrapper[8606]: I1204 22:06:58.900395 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"b0ed4a9fe3a00c4a1d8cd842129319027df79a8ddd43dbd8b1904f3bc2dc1059"} Dec 04 22:06:58.944245 master-0 kubenswrapper[8606]: I1204 22:06:58.944173 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:58.944245 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:58.944245 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:58.944245 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:58.944568 master-0 kubenswrapper[8606]: I1204 22:06:58.944275 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:06:59.944367 master-0 kubenswrapper[8606]: I1204 22:06:59.944275 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:06:59.944367 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:06:59.944367 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:06:59.944367 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:06:59.945087 master-0 kubenswrapper[8606]: I1204 22:06:59.944401 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:00.944586 master-0 kubenswrapper[8606]: I1204 22:07:00.944478 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:00.944586 master-0 kubenswrapper[8606]: [-]has-synced failed: reason 
withheld Dec 04 22:07:00.944586 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:00.944586 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:00.944586 master-0 kubenswrapper[8606]: I1204 22:07:00.944583 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:01.945274 master-0 kubenswrapper[8606]: I1204 22:07:01.945194 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:01.945274 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:01.945274 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:01.945274 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:01.946625 master-0 kubenswrapper[8606]: I1204 22:07:01.945298 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:02.945054 master-0 kubenswrapper[8606]: I1204 22:07:02.944967 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:02.945054 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:02.945054 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:02.945054 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:02.946166 master-0 kubenswrapper[8606]: I1204 22:07:02.945065 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:03.944054 master-0 kubenswrapper[8606]: I1204 22:07:03.943963 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:03.944054 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:03.944054 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:03.944054 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:03.944425 master-0 kubenswrapper[8606]: I1204 22:07:03.944084 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:04.944188 master-0 kubenswrapper[8606]: I1204 22:07:04.944098 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:04.944188 master-0 kubenswrapper[8606]: [-]has-synced failed: 
reason withheld Dec 04 22:07:04.944188 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:04.944188 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:04.945361 master-0 kubenswrapper[8606]: I1204 22:07:04.944194 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:05.945346 master-0 kubenswrapper[8606]: I1204 22:07:05.945160 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:05.945346 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:05.945346 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:05.945346 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:05.947646 master-0 kubenswrapper[8606]: I1204 22:07:05.945429 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:06.945022 master-0 kubenswrapper[8606]: I1204 22:07:06.944924 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:06.945022 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:06.945022 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:06.945022 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:06.945022 master-0 kubenswrapper[8606]: I1204 22:07:06.945031 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:07.944872 master-0 kubenswrapper[8606]: I1204 22:07:07.944795 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:07.944872 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:07.944872 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:07.944872 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:07.946149 master-0 kubenswrapper[8606]: I1204 22:07:07.944905 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:08.945178 master-0 kubenswrapper[8606]: I1204 22:07:08.945080 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:08.945178 master-0 kubenswrapper[8606]: [-]has-synced 
failed: reason withheld Dec 04 22:07:08.945178 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:08.945178 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:08.946291 master-0 kubenswrapper[8606]: I1204 22:07:08.945187 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:09.944301 master-0 kubenswrapper[8606]: I1204 22:07:09.944104 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:09.944301 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:09.944301 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:09.944301 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:09.944301 master-0 kubenswrapper[8606]: I1204 22:07:09.944227 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:10.944875 master-0 kubenswrapper[8606]: I1204 22:07:10.944751 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:10.944875 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:10.944875 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:10.944875 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:10.946142 master-0 kubenswrapper[8606]: I1204 22:07:10.944887 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:11.944404 master-0 kubenswrapper[8606]: I1204 22:07:11.944297 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:11.944404 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:11.944404 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:11.944404 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:11.945622 master-0 kubenswrapper[8606]: I1204 22:07:11.944414 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:12.944754 master-0 kubenswrapper[8606]: I1204 22:07:12.944602 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:12.944754 master-0 kubenswrapper[8606]: 
[-]has-synced failed: reason withheld Dec 04 22:07:12.944754 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:12.944754 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:12.945848 master-0 kubenswrapper[8606]: I1204 22:07:12.944741 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:13.944370 master-0 kubenswrapper[8606]: I1204 22:07:13.944284 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:13.944370 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:13.944370 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:13.944370 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:13.946027 master-0 kubenswrapper[8606]: I1204 22:07:13.944410 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:14.945178 master-0 kubenswrapper[8606]: I1204 22:07:14.945059 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:14.945178 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:14.945178 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:14.945178 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:14.945178 master-0 kubenswrapper[8606]: I1204 22:07:14.945165 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:15.944742 master-0 kubenswrapper[8606]: I1204 22:07:15.944411 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:15.944742 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:15.944742 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:15.944742 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:15.944742 master-0 kubenswrapper[8606]: I1204 22:07:15.944512 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:16.943682 master-0 kubenswrapper[8606]: I1204 22:07:16.943613 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:16.943682 master-0 
kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:16.943682 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:16.943682 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:16.943972 master-0 kubenswrapper[8606]: I1204 22:07:16.943714 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:17.945128 master-0 kubenswrapper[8606]: I1204 22:07:17.945023 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:17.945128 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:17.945128 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:17.945128 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:17.946398 master-0 kubenswrapper[8606]: I1204 22:07:17.945150 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:18.944085 master-0 kubenswrapper[8606]: I1204 22:07:18.943975 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:18.944085 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:18.944085 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:18.944085 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:18.944085 master-0 kubenswrapper[8606]: I1204 22:07:18.944062 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:19.944185 master-0 kubenswrapper[8606]: I1204 22:07:19.944036 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:19.944185 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:19.944185 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:19.944185 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:19.944185 master-0 kubenswrapper[8606]: I1204 22:07:19.944154 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:20.943397 master-0 kubenswrapper[8606]: I1204 22:07:20.943244 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:20.943397 
master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:20.943397 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:20.943397 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:20.943397 master-0 kubenswrapper[8606]: I1204 22:07:20.943377 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:21.944419 master-0 kubenswrapper[8606]: I1204 22:07:21.944296 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:21.944419 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:21.944419 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:21.944419 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:21.944419 master-0 kubenswrapper[8606]: I1204 22:07:21.944394 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:22.945216 master-0 kubenswrapper[8606]: I1204 22:07:22.945108 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:22.945216 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:22.945216 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:22.945216 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:22.945216 master-0 kubenswrapper[8606]: I1204 22:07:22.945197 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:23.944621 master-0 kubenswrapper[8606]: I1204 22:07:23.944532 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:23.944621 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:23.944621 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:23.944621 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:23.944959 master-0 kubenswrapper[8606]: I1204 22:07:23.944623 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:24.944851 master-0 kubenswrapper[8606]: I1204 22:07:24.944727 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 
22:07:24.944851 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:24.944851 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:24.944851 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:24.945859 master-0 kubenswrapper[8606]: I1204 22:07:24.944859 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:25.944309 master-0 kubenswrapper[8606]: I1204 22:07:25.944122 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:25.944309 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:25.944309 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:25.944309 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:25.944309 master-0 kubenswrapper[8606]: I1204 22:07:25.944239 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:26.944386 master-0 kubenswrapper[8606]: I1204 22:07:26.944307 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:26.944386 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:26.944386 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:26.944386 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:26.945348 master-0 kubenswrapper[8606]: I1204 22:07:26.944411 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:27.944743 master-0 kubenswrapper[8606]: I1204 22:07:27.944566 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:27.944743 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:27.944743 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:27.944743 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:27.944743 master-0 kubenswrapper[8606]: I1204 22:07:27.944727 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:28.944482 master-0 kubenswrapper[8606]: I1204 22:07:28.944417 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld 
Dec 04 22:07:28.944482 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:28.944482 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:28.944482 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:28.945191 master-0 kubenswrapper[8606]: I1204 22:07:28.944519 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:29.944127 master-0 kubenswrapper[8606]: I1204 22:07:29.944030 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:29.944127 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:29.944127 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:29.944127 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:29.944667 master-0 kubenswrapper[8606]: I1204 22:07:29.944150 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:30.944935 master-0 kubenswrapper[8606]: I1204 22:07:30.944258 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:30.944935 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:30.944935 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:30.944935 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:30.946444 master-0 kubenswrapper[8606]: I1204 22:07:30.944947 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:31.945484 master-0 kubenswrapper[8606]: I1204 22:07:31.945415 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:31.945484 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:31.945484 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:31.945484 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:31.946676 master-0 kubenswrapper[8606]: I1204 22:07:31.946622 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:32.944240 master-0 kubenswrapper[8606]: I1204 22:07:32.944151 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Dec 04 22:07:32.944240 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:32.944240 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:32.944240 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:32.944708 master-0 kubenswrapper[8606]: I1204 22:07:32.944249 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:33.944321 master-0 kubenswrapper[8606]: I1204 22:07:33.944082 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:33.944321 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:33.944321 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:33.944321 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:33.944321 master-0 kubenswrapper[8606]: I1204 22:07:33.944176 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:34.944418 master-0 kubenswrapper[8606]: I1204 22:07:34.944337 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:34.944418 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:34.944418 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:34.944418 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:34.944418 master-0 kubenswrapper[8606]: I1204 22:07:34.944410 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:35.974544 master-0 kubenswrapper[8606]: I1204 22:07:35.944125 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:35.974544 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:35.974544 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:35.974544 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:35.974544 master-0 kubenswrapper[8606]: I1204 22:07:35.944231 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:36.944786 master-0 kubenswrapper[8606]: I1204 22:07:36.944707 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 04 22:07:36.944786 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:36.944786 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:36.944786 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:36.945484 master-0 kubenswrapper[8606]: I1204 22:07:36.944810 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:37.944757 master-0 kubenswrapper[8606]: I1204 22:07:37.944628 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:37.944757 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:37.944757 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:37.944757 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:37.945865 master-0 kubenswrapper[8606]: I1204 22:07:37.944781 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:38.944540 master-0 kubenswrapper[8606]: I1204 22:07:38.944391 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:38.944540 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:38.944540 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:38.944540 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:38.945626 master-0 kubenswrapper[8606]: I1204 22:07:38.944583 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:39.963074 master-0 kubenswrapper[8606]: I1204 22:07:39.962976 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:39.963074 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:39.963074 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:39.963074 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:39.963074 master-0 kubenswrapper[8606]: I1204 22:07:39.963065 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:40.945044 master-0 kubenswrapper[8606]: I1204 22:07:40.944965 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Dec 04 22:07:40.945044 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:40.945044 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:40.945044 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:40.945395 master-0 kubenswrapper[8606]: I1204 22:07:40.945068 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:41.945608 master-0 kubenswrapper[8606]: I1204 22:07:41.945465 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:07:41.945608 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:07:41.945608 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:07:41.945608 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:07:41.945608 master-0 kubenswrapper[8606]: I1204 22:07:41.945600 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:07:41.946703 master-0 kubenswrapper[8606]: I1204 22:07:41.945678 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:07:41.946703 master-0 kubenswrapper[8606]: I1204 22:07:41.946548 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"b6b73ce0e49791c4429e0dc83a179c1414cef4c0eb61371c4801b941d91131b9"} pod="openshift-ingress/router-default-5465c8b4db-8vm66" containerMessage="Container router failed startup probe, will be restarted" Dec 04 22:07:41.946703 master-0 kubenswrapper[8606]: I1204 22:07:41.946631 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" containerID="cri-o://b6b73ce0e49791c4429e0dc83a179c1414cef4c0eb61371c4801b941d91131b9" gracePeriod=3600 Dec 04 22:08:28.638830 master-0 kubenswrapper[8606]: I1204 22:08:28.638636 8606 generic.go:334] "Generic (PLEG): container finished" podID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerID="b6b73ce0e49791c4429e0dc83a179c1414cef4c0eb61371c4801b941d91131b9" exitCode=0 Dec 04 22:08:28.638830 master-0 kubenswrapper[8606]: I1204 22:08:28.638703 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerDied","Data":"b6b73ce0e49791c4429e0dc83a179c1414cef4c0eb61371c4801b941d91131b9"} Dec 04 22:08:28.638830 master-0 kubenswrapper[8606]: I1204 22:08:28.638740 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"ff93336d691c6b7a2b8fa61f8706675e304c74a0c7baba380e59deb94bba182a"} Dec 04 22:08:28.941853 master-0 kubenswrapper[8606]: I1204 22:08:28.941616 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:08:28.941853 master-0 kubenswrapper[8606]: I1204 22:08:28.941684 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:08:28.946282 master-0 kubenswrapper[8606]: I1204 22:08:28.946169 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:28.946282 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:28.946282 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:28.946282 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:28.946750 master-0 kubenswrapper[8606]: I1204 22:08:28.946304 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:29.944650 master-0 kubenswrapper[8606]: I1204 22:08:29.944573 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:29.944650 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:29.944650 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:29.944650 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:29.945562 master-0 kubenswrapper[8606]: I1204 22:08:29.944665 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:30.944036 master-0 kubenswrapper[8606]: I1204 22:08:30.943952 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:30.944036 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:30.944036 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:30.944036 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:30.944468 master-0 kubenswrapper[8606]: I1204 22:08:30.944057 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:31.942990 master-0 kubenswrapper[8606]: I1204 22:08:31.942923 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:31.942990 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:31.942990 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:31.942990 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:31.944317 master-0 kubenswrapper[8606]: 
I1204 22:08:31.944268 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:32.944442 master-0 kubenswrapper[8606]: I1204 22:08:32.944376 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:32.944442 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:32.944442 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:32.944442 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:32.945254 master-0 kubenswrapper[8606]: I1204 22:08:32.944480 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:33.943398 master-0 kubenswrapper[8606]: I1204 22:08:33.943308 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:33.943398 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:33.943398 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:33.943398 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:33.943837 master-0 kubenswrapper[8606]: I1204 22:08:33.943413 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:34.945188 master-0 kubenswrapper[8606]: I1204 22:08:34.945100 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:34.945188 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:34.945188 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:34.945188 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:34.946181 master-0 kubenswrapper[8606]: I1204 22:08:34.945208 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:35.946563 master-0 kubenswrapper[8606]: I1204 22:08:35.946420 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:35.946563 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:35.946563 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:35.946563 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:35.947377 master-0 
kubenswrapper[8606]: I1204 22:08:35.947339 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:36.944227 master-0 kubenswrapper[8606]: I1204 22:08:36.944115 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:36.944227 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:36.944227 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:36.944227 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:36.944227 master-0 kubenswrapper[8606]: I1204 22:08:36.944203 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:37.943972 master-0 kubenswrapper[8606]: I1204 22:08:37.943874 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:37.943972 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:37.943972 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:37.943972 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:37.943972 master-0 kubenswrapper[8606]: I1204 22:08:37.943938 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:38.944287 master-0 kubenswrapper[8606]: I1204 22:08:38.944189 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:38.944287 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:38.944287 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:38.944287 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:38.944287 master-0 kubenswrapper[8606]: I1204 22:08:38.944278 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:39.944101 master-0 kubenswrapper[8606]: I1204 22:08:39.944010 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:39.944101 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:39.944101 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:39.944101 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:39.945138 
master-0 kubenswrapper[8606]: I1204 22:08:39.944104 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:40.944402 master-0 kubenswrapper[8606]: I1204 22:08:40.944280 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:40.944402 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:40.944402 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:40.944402 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:40.944402 master-0 kubenswrapper[8606]: I1204 22:08:40.944381 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:41.942941 master-0 kubenswrapper[8606]: I1204 22:08:41.942836 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:41.942941 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:41.942941 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:41.942941 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:41.943463 master-0 kubenswrapper[8606]: I1204 22:08:41.942954 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:42.944166 master-0 kubenswrapper[8606]: I1204 22:08:42.944079 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:42.944166 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:42.944166 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:42.944166 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:42.945263 master-0 kubenswrapper[8606]: I1204 22:08:42.944166 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:43.943908 master-0 kubenswrapper[8606]: I1204 22:08:43.943830 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:43.943908 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:43.943908 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:43.943908 master-0 kubenswrapper[8606]: healthz check failed Dec 04 
22:08:43.944784 master-0 kubenswrapper[8606]: I1204 22:08:43.943911 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:44.943525 master-0 kubenswrapper[8606]: I1204 22:08:44.943404 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:44.943525 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:44.943525 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:44.943525 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:44.943525 master-0 kubenswrapper[8606]: I1204 22:08:44.943467 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:45.943972 master-0 kubenswrapper[8606]: I1204 22:08:45.943758 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:45.943972 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:45.943972 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:45.943972 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:45.943972 master-0 kubenswrapper[8606]: I1204 22:08:45.943898 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:46.943937 master-0 kubenswrapper[8606]: I1204 22:08:46.943817 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:46.943937 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:46.943937 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:46.943937 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:46.943937 master-0 kubenswrapper[8606]: I1204 22:08:46.943905 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:47.944415 master-0 kubenswrapper[8606]: I1204 22:08:47.944346 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:47.944415 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:47.944415 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:47.944415 master-0 kubenswrapper[8606]: healthz check failed 
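The records above and below show kubelet's startup probe for the router container receiving HTTP 500 from the container's health endpoint; the verbose body names the failing subchecks ([-]backend-http, [-]has-synced) and the passing one ([+]process-running) but withholds the reasons. The following is a minimal, hypothetical Go sketch for querying such a healthz-style endpoint directly and capturing the full response body that the kubelet log truncates. The URL and port (127.0.0.1:1936/healthz) are placeholder assumptions, not taken from this log, and would need to be replaced with the router pod's actual health endpoint (for example via a port-forward); this is an illustration, not part of the cluster's tooling.

```go
// healthz_probe.go: fetch a healthz-style endpoint and print the HTTP status
// and the full response body, so the per-check reasons that appear truncated
// in the kubelet records above ("reason withheld") can be inspected directly.
//
// The URL below is an assumption for illustration; substitute the router
// pod's real health endpoint before running.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"time"
)

func main() {
	url := "http://127.0.0.1:1936/healthz" // placeholder endpoint, adjust as needed
	client := &http.Client{Timeout: 5 * time.Second}

	resp, err := client.Get(url)
	if err != nil {
		log.Fatalf("probe request failed: %v", err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatalf("reading probe body failed: %v", err)
	}

	// A failing startup probe corresponds to a non-2xx status, such as the
	// repeated 500s recorded in this log.
	fmt.Printf("status: %d\n%s\n", resp.StatusCode, body)
}
```

Under those assumptions, a 2xx status with every subcheck marked [+] would correspond to the probe passing; the 500s in this log persist because the backend-http and has-synced checks have not yet reported healthy.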
Dec 04 22:08:47.945580 master-0 kubenswrapper[8606]: I1204 22:08:47.945490 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:48.943428 master-0 kubenswrapper[8606]: I1204 22:08:48.943345 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:48.943428 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:48.943428 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:48.943428 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:48.943835 master-0 kubenswrapper[8606]: I1204 22:08:48.943441 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:49.944379 master-0 kubenswrapper[8606]: I1204 22:08:49.944274 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:49.944379 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:49.944379 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:49.944379 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:49.944379 master-0 kubenswrapper[8606]: I1204 22:08:49.944358 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:50.945207 master-0 kubenswrapper[8606]: I1204 22:08:50.945085 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:50.945207 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:50.945207 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:50.945207 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:50.945207 master-0 kubenswrapper[8606]: I1204 22:08:50.945195 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:51.944976 master-0 kubenswrapper[8606]: I1204 22:08:51.944852 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:51.944976 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:51.944976 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:51.944976 master-0 kubenswrapper[8606]: healthz check 
failed Dec 04 22:08:51.944976 master-0 kubenswrapper[8606]: I1204 22:08:51.944929 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:52.944381 master-0 kubenswrapper[8606]: I1204 22:08:52.944236 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:52.944381 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:52.944381 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:52.944381 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:52.944381 master-0 kubenswrapper[8606]: I1204 22:08:52.944366 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:53.943395 master-0 kubenswrapper[8606]: I1204 22:08:53.943302 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:53.943395 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:53.943395 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:53.943395 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:53.944001 master-0 kubenswrapper[8606]: I1204 22:08:53.943420 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:54.945185 master-0 kubenswrapper[8606]: I1204 22:08:54.945077 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:54.945185 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:54.945185 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:54.945185 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:54.945185 master-0 kubenswrapper[8606]: I1204 22:08:54.945170 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:55.945210 master-0 kubenswrapper[8606]: I1204 22:08:55.944972 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:55.945210 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:55.945210 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:55.945210 master-0 kubenswrapper[8606]: healthz 
check failed Dec 04 22:08:55.945210 master-0 kubenswrapper[8606]: I1204 22:08:55.945133 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:56.944910 master-0 kubenswrapper[8606]: I1204 22:08:56.944769 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:56.944910 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:56.944910 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:56.944910 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:56.944910 master-0 kubenswrapper[8606]: I1204 22:08:56.944853 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:57.944322 master-0 kubenswrapper[8606]: I1204 22:08:57.944257 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:57.944322 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:57.944322 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:57.944322 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:57.944606 master-0 kubenswrapper[8606]: I1204 22:08:57.944351 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:58.944767 master-0 kubenswrapper[8606]: I1204 22:08:58.944697 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:58.944767 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:58.944767 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:58.944767 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:58.945765 master-0 kubenswrapper[8606]: I1204 22:08:58.944792 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:08:59.891706 master-0 kubenswrapper[8606]: I1204 22:08:59.891636 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/2.log" Dec 04 22:08:59.892903 master-0 kubenswrapper[8606]: I1204 22:08:59.892847 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/1.log" Dec 04 22:08:59.893493 
master-0 kubenswrapper[8606]: I1204 22:08:59.893431 8606 generic.go:334] "Generic (PLEG): container finished" podID="addddaac-a31a-4dbf-b78f-87225b11b463" containerID="b0ed4a9fe3a00c4a1d8cd842129319027df79a8ddd43dbd8b1904f3bc2dc1059" exitCode=1 Dec 04 22:08:59.893684 master-0 kubenswrapper[8606]: I1204 22:08:59.893488 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerDied","Data":"b0ed4a9fe3a00c4a1d8cd842129319027df79a8ddd43dbd8b1904f3bc2dc1059"} Dec 04 22:08:59.893684 master-0 kubenswrapper[8606]: I1204 22:08:59.893612 8606 scope.go:117] "RemoveContainer" containerID="6d94211dde773ea9f092db6ee8825549019f84f69e33c14265dafd1be2140e92" Dec 04 22:08:59.894321 master-0 kubenswrapper[8606]: I1204 22:08:59.894258 8606 scope.go:117] "RemoveContainer" containerID="b0ed4a9fe3a00c4a1d8cd842129319027df79a8ddd43dbd8b1904f3bc2dc1059" Dec 04 22:08:59.894954 master-0 kubenswrapper[8606]: E1204 22:08:59.894880 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:08:59.944515 master-0 kubenswrapper[8606]: I1204 22:08:59.944401 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:08:59.944515 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:08:59.944515 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:08:59.944515 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:08:59.944816 master-0 kubenswrapper[8606]: I1204 22:08:59.944555 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:00.905480 master-0 kubenswrapper[8606]: I1204 22:09:00.905375 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/2.log" Dec 04 22:09:00.944650 master-0 kubenswrapper[8606]: I1204 22:09:00.944566 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:00.944650 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:00.944650 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:00.944650 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:00.945773 master-0 kubenswrapper[8606]: I1204 22:09:00.944652 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:01.944038 master-0 
kubenswrapper[8606]: I1204 22:09:01.943953 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:01.944038 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:01.944038 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:01.944038 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:01.944541 master-0 kubenswrapper[8606]: I1204 22:09:01.944063 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:02.944383 master-0 kubenswrapper[8606]: I1204 22:09:02.944317 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:02.944383 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:02.944383 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:02.944383 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:02.945050 master-0 kubenswrapper[8606]: I1204 22:09:02.944432 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:03.944592 master-0 kubenswrapper[8606]: I1204 22:09:03.944481 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:03.944592 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:03.944592 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:03.944592 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:03.945538 master-0 kubenswrapper[8606]: I1204 22:09:03.944601 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:04.948644 master-0 kubenswrapper[8606]: I1204 22:09:04.948550 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:04.948644 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:04.948644 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:04.948644 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:04.949587 master-0 kubenswrapper[8606]: I1204 22:09:04.948671 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:05.944142 
master-0 kubenswrapper[8606]: I1204 22:09:05.943967 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:05.944142 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:05.944142 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:05.944142 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:05.944142 master-0 kubenswrapper[8606]: I1204 22:09:05.944085 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:06.944567 master-0 kubenswrapper[8606]: I1204 22:09:06.944422 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:06.944567 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:06.944567 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:06.944567 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:06.945944 master-0 kubenswrapper[8606]: I1204 22:09:06.945898 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:07.944438 master-0 kubenswrapper[8606]: I1204 22:09:07.944341 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:07.944438 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:07.944438 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:07.944438 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:07.945571 master-0 kubenswrapper[8606]: I1204 22:09:07.944449 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:08.944570 master-0 kubenswrapper[8606]: I1204 22:09:08.944458 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:08.944570 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:08.944570 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:08.944570 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:08.945569 master-0 kubenswrapper[8606]: I1204 22:09:08.944581 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 
22:09:09.944454 master-0 kubenswrapper[8606]: I1204 22:09:09.944380 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:09.944454 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:09.944454 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:09.944454 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:09.945401 master-0 kubenswrapper[8606]: I1204 22:09:09.944475 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:10.392698 master-0 kubenswrapper[8606]: I1204 22:09:10.392552 8606 scope.go:117] "RemoveContainer" containerID="b0ed4a9fe3a00c4a1d8cd842129319027df79a8ddd43dbd8b1904f3bc2dc1059" Dec 04 22:09:10.393066 master-0 kubenswrapper[8606]: E1204 22:09:10.393009 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:09:10.944607 master-0 kubenswrapper[8606]: I1204 22:09:10.944529 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:10.944607 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:10.944607 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:10.944607 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:10.945555 master-0 kubenswrapper[8606]: I1204 22:09:10.944635 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:11.943729 master-0 kubenswrapper[8606]: I1204 22:09:11.943638 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:11.943729 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:11.943729 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:11.943729 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:11.944194 master-0 kubenswrapper[8606]: I1204 22:09:11.943736 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:12.943976 master-0 kubenswrapper[8606]: I1204 22:09:12.943890 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:12.943976 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:12.943976 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:12.943976 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:12.945546 master-0 kubenswrapper[8606]: I1204 22:09:12.944016 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:13.944308 master-0 kubenswrapper[8606]: I1204 22:09:13.944239 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:13.944308 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:13.944308 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:13.944308 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:13.945302 master-0 kubenswrapper[8606]: I1204 22:09:13.944310 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:14.944364 master-0 kubenswrapper[8606]: I1204 22:09:14.944275 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:14.944364 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:14.944364 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:14.944364 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:14.945354 master-0 kubenswrapper[8606]: I1204 22:09:14.944378 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:15.944274 master-0 kubenswrapper[8606]: I1204 22:09:15.944168 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:15.944274 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:15.944274 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:15.944274 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:15.945808 master-0 kubenswrapper[8606]: I1204 22:09:15.944288 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:16.944150 master-0 kubenswrapper[8606]: I1204 22:09:16.944075 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:16.944150 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:16.944150 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:16.944150 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:16.944150 master-0 kubenswrapper[8606]: I1204 22:09:16.944143 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:17.945375 master-0 kubenswrapper[8606]: I1204 22:09:17.945287 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:17.945375 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:17.945375 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:17.945375 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:17.946439 master-0 kubenswrapper[8606]: I1204 22:09:17.945388 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:18.944291 master-0 kubenswrapper[8606]: I1204 22:09:18.944218 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:18.944291 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:18.944291 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:18.944291 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:18.944291 master-0 kubenswrapper[8606]: I1204 22:09:18.944280 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:19.944830 master-0 kubenswrapper[8606]: I1204 22:09:19.944657 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:19.944830 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:19.944830 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:19.944830 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:19.944830 master-0 kubenswrapper[8606]: I1204 22:09:19.944768 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:20.944461 master-0 kubenswrapper[8606]: I1204 22:09:20.944382 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:20.944461 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:20.944461 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:20.944461 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:20.945594 master-0 kubenswrapper[8606]: I1204 22:09:20.944838 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:21.944711 master-0 kubenswrapper[8606]: I1204 22:09:21.944623 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:21.944711 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:21.944711 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:21.944711 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:21.945473 master-0 kubenswrapper[8606]: I1204 22:09:21.944743 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:22.944178 master-0 kubenswrapper[8606]: I1204 22:09:22.944088 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:22.944178 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:22.944178 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:22.944178 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:22.944479 master-0 kubenswrapper[8606]: I1204 22:09:22.944201 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:23.392839 master-0 kubenswrapper[8606]: I1204 22:09:23.392773 8606 scope.go:117] "RemoveContainer" containerID="b0ed4a9fe3a00c4a1d8cd842129319027df79a8ddd43dbd8b1904f3bc2dc1059" Dec 04 22:09:23.944954 master-0 kubenswrapper[8606]: I1204 22:09:23.944796 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:23.944954 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:23.944954 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:23.944954 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:23.945399 master-0 kubenswrapper[8606]: I1204 22:09:23.944987 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 04 22:09:24.111732 master-0 kubenswrapper[8606]: I1204 22:09:24.111665 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/2.log" Dec 04 22:09:24.112337 master-0 kubenswrapper[8606]: I1204 22:09:24.112275 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271"} Dec 04 22:09:24.944656 master-0 kubenswrapper[8606]: I1204 22:09:24.944541 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:24.944656 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:24.944656 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:24.944656 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:24.945910 master-0 kubenswrapper[8606]: I1204 22:09:24.944651 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:25.945034 master-0 kubenswrapper[8606]: I1204 22:09:25.944928 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:25.945034 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:25.945034 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:25.945034 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:25.945034 master-0 kubenswrapper[8606]: I1204 22:09:25.945025 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:26.944830 master-0 kubenswrapper[8606]: I1204 22:09:26.944713 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:26.944830 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:26.944830 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:26.944830 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:26.945804 master-0 kubenswrapper[8606]: I1204 22:09:26.944838 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:27.944864 master-0 kubenswrapper[8606]: I1204 22:09:27.944761 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 
500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:27.944864 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:27.944864 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:27.944864 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:27.945921 master-0 kubenswrapper[8606]: I1204 22:09:27.944894 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:28.944822 master-0 kubenswrapper[8606]: I1204 22:09:28.944747 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:28.944822 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:28.944822 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:28.944822 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:28.946075 master-0 kubenswrapper[8606]: I1204 22:09:28.944855 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:29.944546 master-0 kubenswrapper[8606]: I1204 22:09:29.944419 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:29.944546 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:29.944546 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:29.944546 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:29.944963 master-0 kubenswrapper[8606]: I1204 22:09:29.944543 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:30.944670 master-0 kubenswrapper[8606]: I1204 22:09:30.944580 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:30.944670 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:30.944670 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:30.944670 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:30.945948 master-0 kubenswrapper[8606]: I1204 22:09:30.944689 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:31.944548 master-0 kubenswrapper[8606]: I1204 22:09:31.944405 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:31.944548 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:31.944548 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:31.944548 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:31.944548 master-0 kubenswrapper[8606]: I1204 22:09:31.944498 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:32.944551 master-0 kubenswrapper[8606]: I1204 22:09:32.944386 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:32.944551 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:32.944551 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:32.944551 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:32.945637 master-0 kubenswrapper[8606]: I1204 22:09:32.944547 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:33.944273 master-0 kubenswrapper[8606]: I1204 22:09:33.944184 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:33.944273 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:33.944273 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:33.944273 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:33.944755 master-0 kubenswrapper[8606]: I1204 22:09:33.944284 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:34.944182 master-0 kubenswrapper[8606]: I1204 22:09:34.944098 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:34.944182 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:34.944182 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:34.944182 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:34.944902 master-0 kubenswrapper[8606]: I1204 22:09:34.944212 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:35.945112 master-0 kubenswrapper[8606]: I1204 22:09:35.945017 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:35.945112 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:35.945112 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:35.945112 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:35.945836 master-0 kubenswrapper[8606]: I1204 22:09:35.945146 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:36.945467 master-0 kubenswrapper[8606]: I1204 22:09:36.945365 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:36.945467 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:36.945467 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:36.945467 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:36.946832 master-0 kubenswrapper[8606]: I1204 22:09:36.945495 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:37.943823 master-0 kubenswrapper[8606]: I1204 22:09:37.943715 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:37.943823 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:37.943823 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:37.943823 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:37.943823 master-0 kubenswrapper[8606]: I1204 22:09:37.943810 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:38.944083 master-0 kubenswrapper[8606]: I1204 22:09:38.943972 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:09:38.944083 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:09:38.944083 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:09:38.944083 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:09:38.944083 master-0 kubenswrapper[8606]: I1204 22:09:38.944071 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:09:39.945088 master-0 kubenswrapper[8606]: I1204 22:09:39.945012 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:09:39.945088 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:09:39.945088 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:09:39.945088 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:09:39.946051 master-0 kubenswrapper[8606]: I1204 22:09:39.945104 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
[... the identical patch_prober.go:28 / prober.go:107 startup-probe failure pair for pod/router-default-5465c8b4db-8vm66 (HTTP 500; [-]backend-http and [-]has-synced failing, [+]process-running ok) repeats once per second from 22:09:40 through 22:10:14; the repeated entries are elided ...]
pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:11.944883 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:11.944883 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:11.944883 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:11.944883 master-0 kubenswrapper[8606]: I1204 22:10:11.944863 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:12.944650 master-0 kubenswrapper[8606]: I1204 22:10:12.944553 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:12.944650 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:12.944650 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:12.944650 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:12.945925 master-0 kubenswrapper[8606]: I1204 22:10:12.944650 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:13.944259 master-0 kubenswrapper[8606]: I1204 22:10:13.944127 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:13.944259 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:13.944259 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:13.944259 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:13.944750 master-0 kubenswrapper[8606]: I1204 22:10:13.944256 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:14.943976 master-0 kubenswrapper[8606]: I1204 22:10:14.943890 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:14.943976 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:14.943976 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:14.943976 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:14.944785 master-0 kubenswrapper[8606]: I1204 22:10:14.944006 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:15.670208 master-0 kubenswrapper[8606]: I1204 22:10:15.670106 8606 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 04 22:10:15.671610 master-0 kubenswrapper[8606]: I1204 22:10:15.671568 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.680603 master-0 kubenswrapper[8606]: I1204 22:10:15.680532 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-dtj8j" Dec 04 22:10:15.680836 master-0 kubenswrapper[8606]: I1204 22:10:15.680735 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Dec 04 22:10:15.700017 master-0 kubenswrapper[8606]: I1204 22:10:15.699287 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 04 22:10:15.736298 master-0 kubenswrapper[8606]: I1204 22:10:15.736210 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e011b0a-89e2-47e3-9112-d46a828416b1-kube-api-access\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.737854 master-0 kubenswrapper[8606]: I1204 22:10:15.736367 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-var-lock\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.737854 master-0 kubenswrapper[8606]: I1204 22:10:15.736745 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.838394 master-0 kubenswrapper[8606]: I1204 22:10:15.838316 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e011b0a-89e2-47e3-9112-d46a828416b1-kube-api-access\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.838699 master-0 kubenswrapper[8606]: I1204 22:10:15.838431 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-var-lock\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.838699 master-0 kubenswrapper[8606]: I1204 22:10:15.838529 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.838862 master-0 kubenswrapper[8606]: I1204 22:10:15.838734 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-var-lock\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 
22:10:15.838933 master-0 kubenswrapper[8606]: I1204 22:10:15.838858 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.874395 master-0 kubenswrapper[8606]: I1204 22:10:15.874347 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e011b0a-89e2-47e3-9112-d46a828416b1-kube-api-access\") pod \"installer-2-master-0\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:15.943951 master-0 kubenswrapper[8606]: I1204 22:10:15.943821 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:15.943951 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:15.943951 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:15.943951 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:15.944615 master-0 kubenswrapper[8606]: I1204 22:10:15.943942 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:16.020719 master-0 kubenswrapper[8606]: I1204 22:10:16.020314 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 04 22:10:16.444073 master-0 kubenswrapper[8606]: I1204 22:10:16.443995 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 04 22:10:16.569215 master-0 kubenswrapper[8606]: I1204 22:10:16.569158 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"6e011b0a-89e2-47e3-9112-d46a828416b1","Type":"ContainerStarted","Data":"fcbb33183ef82fa1ce0f5d881f45fbf26ffc5cbcffaeaf0d3d41a2fb848e78fa"} Dec 04 22:10:16.944657 master-0 kubenswrapper[8606]: I1204 22:10:16.944569 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:16.944657 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:16.944657 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:16.944657 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:16.944657 master-0 kubenswrapper[8606]: I1204 22:10:16.944643 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:17.581882 master-0 kubenswrapper[8606]: I1204 22:10:17.581750 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"6e011b0a-89e2-47e3-9112-d46a828416b1","Type":"ContainerStarted","Data":"81f5bc53e7bd37d1c3167c411f68ef8d2e2f1eae21a167bd8c740d425e144c3a"} Dec 04 
22:10:17.610634 master-0 kubenswrapper[8606]: I1204 22:10:17.610474 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.610442613 podStartE2EDuration="2.610442613s" podCreationTimestamp="2025-12-04 22:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:10:17.609059465 +0000 UTC m=+582.419361760" watchObservedRunningTime="2025-12-04 22:10:17.610442613 +0000 UTC m=+582.420744878" Dec 04 22:10:17.943940 master-0 kubenswrapper[8606]: I1204 22:10:17.943750 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:17.943940 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:17.943940 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:17.943940 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:17.943940 master-0 kubenswrapper[8606]: I1204 22:10:17.943867 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:18.944332 master-0 kubenswrapper[8606]: I1204 22:10:18.944228 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:18.944332 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:18.944332 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:18.944332 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:18.945641 master-0 kubenswrapper[8606]: I1204 22:10:18.944348 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:19.943691 master-0 kubenswrapper[8606]: I1204 22:10:19.943611 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:10:19.943691 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:10:19.943691 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:10:19.943691 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:10:19.944104 master-0 kubenswrapper[8606]: I1204 22:10:19.943707 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:10:20.945576 master-0 kubenswrapper[8606]: I1204 22:10:20.945468 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld
Dec 04 22:10:20.945576 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:10:20.945576 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:10:20.945576 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:10:20.946338 master-0 kubenswrapper[8606]: I1204 22:10:20.945601 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
[... the same startup-probe failure pair repeats once per second from 22:10:21 through 22:10:26; the repeated entries are elided ...]
Dec 04 22:10:27.944567 master-0 kubenswrapper[8606]: I1204 22:10:27.944420 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:10:27.944567 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:10:27.944567 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:10:27.944567 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:10:27.945701 master-0 kubenswrapper[8606]: I1204 22:10:27.944579 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:10:27.945701 master-0 kubenswrapper[8606]: I1204 22:10:27.944675 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66"
Dec 04 22:10:27.945944 master-0 kubenswrapper[8606]: I1204 22:10:27.945869 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"ff93336d691c6b7a2b8fa61f8706675e304c74a0c7baba380e59deb94bba182a"} pod="openshift-ingress/router-default-5465c8b4db-8vm66" containerMessage="Container router failed startup probe, will be restarted"
Dec 04 22:10:27.945994 master-0 kubenswrapper[8606]: I1204 22:10:27.945943 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" containerID="cri-o://ff93336d691c6b7a2b8fa61f8706675e304c74a0c7baba380e59deb94bba182a" gracePeriod=3600
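At 22:10:27 the startup probe's failure threshold is exhausted, so the kubelet kills the router container with the pod's termination grace period (gracePeriod=3600 above) and lets it be restarted. One way to confirm such a restart from outside the node is to read the pod's container statuses through the API; the client-go sketch below is illustrative only (the kubeconfig path is an assumption) and is not part of the kubelet's own logic.

package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; adjust for your environment.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		log.Fatal(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	// Pod name and namespace taken from the log above.
	pod, err := client.CoreV1().Pods("openshift-ingress").Get(
		context.Background(), "router-default-5465c8b4db-8vm66", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}

	// A startup-probe restart shows up as an incremented RestartCount and a
	// terminated LastTerminationState on the "router" container status.
	for _, cs := range pod.Status.ContainerStatuses {
		fmt.Printf("container=%s restarts=%d\n", cs.Name, cs.RestartCount)
		if t := cs.LastTerminationState.Terminated; t != nil {
			fmt.Printf("  last termination: exitCode=%d reason=%q\n", t.ExitCode, t.Reason)
		}
	}
}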
Dec 04 22:10:48.286303 master-0 kubenswrapper[8606]: I1204 22:10:48.286187 8606 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"]
Dec 04 22:10:48.287323 master-0 kubenswrapper[8606]: I1204 22:10:48.286795 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcdctl" containerID="cri-o://c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865" gracePeriod=30
Dec 04 22:10:48.287323 master-0 kubenswrapper[8606]: I1204 22:10:48.286841 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-rev" containerID="cri-o://9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d" gracePeriod=30
Dec 04 22:10:48.287323 master-0 kubenswrapper[8606]: I1204 22:10:48.286908 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-readyz" containerID="cri-o://6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459" gracePeriod=30
Dec 04 22:10:48.287323 master-0 kubenswrapper[8606]: I1204 22:10:48.286914 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-metrics" containerID="cri-o://e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4" gracePeriod=30
Dec 04 22:10:48.287323 master-0 kubenswrapper[8606]: I1204 22:10:48.286984 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" containerID="cri-o://a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00" gracePeriod=30
Dec 04 22:10:48.289577 master-0 kubenswrapper[8606]: I1204 22:10:48.289479 8606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"]
Dec 04 22:10:48.289977 master-0 kubenswrapper[8606]: E1204 22:10:48.289919 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="setup"
Dec 04 22:10:48.289977 master-0 kubenswrapper[8606]: I1204 22:10:48.289953 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="setup"
Dec 04 22:10:48.289977 master-0 kubenswrapper[8606]: E1204 22:10:48.289976 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-rev"
Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: I1204 22:10:48.289989 8606 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-rev" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: E1204 22:10:48.290009 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-metrics" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: I1204 22:10:48.290022 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-metrics" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: E1204 22:10:48.290040 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-ensure-env-vars" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: I1204 22:10:48.290052 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-ensure-env-vars" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: E1204 22:10:48.290068 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: I1204 22:10:48.290081 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: E1204 22:10:48.290104 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-readyz" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: I1204 22:10:48.290116 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-readyz" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: E1204 22:10:48.290137 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcdctl" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: I1204 22:10:48.290149 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcdctl" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: E1204 22:10:48.290171 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-resources-copy" Dec 04 22:10:48.290315 master-0 kubenswrapper[8606]: I1204 22:10:48.290183 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-resources-copy" Dec 04 22:10:48.291893 master-0 kubenswrapper[8606]: I1204 22:10:48.290386 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-metrics" Dec 04 22:10:48.291893 master-0 kubenswrapper[8606]: I1204 22:10:48.290414 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" Dec 04 22:10:48.291893 master-0 kubenswrapper[8606]: I1204 22:10:48.290458 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-rev" Dec 04 22:10:48.291893 master-0 kubenswrapper[8606]: I1204 22:10:48.290478 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-readyz" Dec 04 22:10:48.291893 master-0 kubenswrapper[8606]: I1204 22:10:48.290522 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcdctl" Dec 04 
22:10:48.374472 master-0 kubenswrapper[8606]: I1204 22:10:48.374372 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.374740 master-0 kubenswrapper[8606]: I1204 22:10:48.374596 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.374740 master-0 kubenswrapper[8606]: I1204 22:10:48.374673 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.374906 master-0 kubenswrapper[8606]: I1204 22:10:48.374852 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.374972 master-0 kubenswrapper[8606]: I1204 22:10:48.374941 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.375073 master-0 kubenswrapper[8606]: I1204 22:10:48.375028 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.439663 master-0 kubenswrapper[8606]: I1204 22:10:48.439579 8606 patch_prober.go:28] interesting pod/etcd-master-0 container/etcd namespace/openshift-etcd: Readiness probe status=failure output="Get \"https://192.168.32.10:9980/readyz\": dial tcp 192.168.32.10:9980: connect: connection refused" start-of-body= Dec 04 22:10:48.439863 master-0 kubenswrapper[8606]: I1204 22:10:48.439690 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" probeResult="failure" output="Get \"https://192.168.32.10:9980/readyz\": dial tcp 192.168.32.10:9980: connect: connection refused" Dec 04 22:10:48.477051 master-0 kubenswrapper[8606]: I1204 22:10:48.476953 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477051 master-0 kubenswrapper[8606]: I1204 22:10:48.477021 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477351 master-0 kubenswrapper[8606]: I1204 22:10:48.477116 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477351 master-0 kubenswrapper[8606]: I1204 22:10:48.477155 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477351 master-0 kubenswrapper[8606]: I1204 22:10:48.477179 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477351 master-0 kubenswrapper[8606]: I1204 22:10:48.477205 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477351 master-0 kubenswrapper[8606]: I1204 22:10:48.477236 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477611 master-0 kubenswrapper[8606]: I1204 22:10:48.477350 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477611 master-0 kubenswrapper[8606]: I1204 22:10:48.477444 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477611 master-0 kubenswrapper[8606]: I1204 22:10:48.477536 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477801 master-0 kubenswrapper[8606]: I1204 22:10:48.477627 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.477801 master-0 kubenswrapper[8606]: I1204 22:10:48.477680 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:10:48.873677 master-0 kubenswrapper[8606]: I1204 22:10:48.873616 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-rev/0.log" Dec 04 22:10:48.875425 master-0 kubenswrapper[8606]: I1204 22:10:48.875348 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-metrics/0.log" Dec 04 22:10:48.878741 master-0 kubenswrapper[8606]: I1204 22:10:48.878689 8606 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d" exitCode=2 Dec 04 22:10:48.878887 master-0 kubenswrapper[8606]: I1204 22:10:48.878747 8606 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459" exitCode=0 Dec 04 22:10:48.878887 master-0 kubenswrapper[8606]: I1204 22:10:48.878767 8606 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4" exitCode=2 Dec 04 22:11:00.760084 master-0 kubenswrapper[8606]: E1204 22:11:00.759978 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 22:11:01.986815 master-0 kubenswrapper[8606]: I1204 22:11:01.986731 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:11:01.987764 master-0 kubenswrapper[8606]: I1204 22:11:01.986829 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5" exitCode=1 Dec 04 22:11:01.987764 master-0 kubenswrapper[8606]: I1204 22:11:01.986888 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerDied","Data":"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5"} Dec 04 22:11:01.987927 master-0 kubenswrapper[8606]: I1204 22:11:01.987769 8606 scope.go:117] "RemoveContainer" containerID="dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5" Dec 04 22:11:03.000777 master-0 kubenswrapper[8606]: I1204 22:11:03.000692 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:11:03.001742 master-0 kubenswrapper[8606]: I1204 22:11:03.000865 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa"} Dec 04 22:11:03.003726 master-0 kubenswrapper[8606]: I1204 22:11:03.003627 8606 generic.go:334] "Generic 
(PLEG): container finished" podID="6e011b0a-89e2-47e3-9112-d46a828416b1" containerID="81f5bc53e7bd37d1c3167c411f68ef8d2e2f1eae21a167bd8c740d425e144c3a" exitCode=0 Dec 04 22:11:03.003726 master-0 kubenswrapper[8606]: I1204 22:11:03.003713 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"6e011b0a-89e2-47e3-9112-d46a828416b1","Type":"ContainerDied","Data":"81f5bc53e7bd37d1c3167c411f68ef8d2e2f1eae21a167bd8c740d425e144c3a"} Dec 04 22:11:04.183694 master-0 kubenswrapper[8606]: I1204 22:11:04.183133 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:11:04.183694 master-0 kubenswrapper[8606]: I1204 22:11:04.183220 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:11:04.205745 master-0 kubenswrapper[8606]: I1204 22:11:04.205642 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:11:04.361373 master-0 kubenswrapper[8606]: I1204 22:11:04.361328 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 04 22:11:04.450549 master-0 kubenswrapper[8606]: I1204 22:11:04.450446 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e011b0a-89e2-47e3-9112-d46a828416b1-kube-api-access\") pod \"6e011b0a-89e2-47e3-9112-d46a828416b1\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " Dec 04 22:11:04.450812 master-0 kubenswrapper[8606]: I1204 22:11:04.450655 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-kubelet-dir\") pod \"6e011b0a-89e2-47e3-9112-d46a828416b1\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " Dec 04 22:11:04.450812 master-0 kubenswrapper[8606]: I1204 22:11:04.450719 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-var-lock\") pod \"6e011b0a-89e2-47e3-9112-d46a828416b1\" (UID: \"6e011b0a-89e2-47e3-9112-d46a828416b1\") " Dec 04 22:11:04.451012 master-0 kubenswrapper[8606]: I1204 22:11:04.450775 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6e011b0a-89e2-47e3-9112-d46a828416b1" (UID: "6e011b0a-89e2-47e3-9112-d46a828416b1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:11:04.451099 master-0 kubenswrapper[8606]: I1204 22:11:04.450984 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-var-lock" (OuterVolumeSpecName: "var-lock") pod "6e011b0a-89e2-47e3-9112-d46a828416b1" (UID: "6e011b0a-89e2-47e3-9112-d46a828416b1"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:11:04.452581 master-0 kubenswrapper[8606]: I1204 22:11:04.451702 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:04.452581 master-0 kubenswrapper[8606]: I1204 22:11:04.451756 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6e011b0a-89e2-47e3-9112-d46a828416b1-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:04.455865 master-0 kubenswrapper[8606]: I1204 22:11:04.455758 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e011b0a-89e2-47e3-9112-d46a828416b1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6e011b0a-89e2-47e3-9112-d46a828416b1" (UID: "6e011b0a-89e2-47e3-9112-d46a828416b1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:11:04.553560 master-0 kubenswrapper[8606]: I1204 22:11:04.553409 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6e011b0a-89e2-47e3-9112-d46a828416b1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:05.019135 master-0 kubenswrapper[8606]: I1204 22:11:05.019053 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 04 22:11:05.019135 master-0 kubenswrapper[8606]: I1204 22:11:05.019052 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"6e011b0a-89e2-47e3-9112-d46a828416b1","Type":"ContainerDied","Data":"fcbb33183ef82fa1ce0f5d881f45fbf26ffc5cbcffaeaf0d3d41a2fb848e78fa"} Dec 04 22:11:05.019135 master-0 kubenswrapper[8606]: I1204 22:11:05.019128 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcbb33183ef82fa1ce0f5d881f45fbf26ffc5cbcffaeaf0d3d41a2fb848e78fa" Dec 04 22:11:06.029664 master-0 kubenswrapper[8606]: I1204 22:11:06.029335 8606 generic.go:334] "Generic (PLEG): container finished" podID="5e09e2af7200e6f9be469dbfd9bb1127" containerID="229f04f01e031c096a8d66a3a3b9f5322d73a495869829416a90812a311e2aee" exitCode=1 Dec 04 22:11:06.029664 master-0 kubenswrapper[8606]: I1204 22:11:06.029414 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerDied","Data":"229f04f01e031c096a8d66a3a3b9f5322d73a495869829416a90812a311e2aee"} Dec 04 22:11:06.029664 master-0 kubenswrapper[8606]: I1204 22:11:06.029468 8606 scope.go:117] "RemoveContainer" containerID="d00575d56d81b17e8c0212c4cad634fe2f3afd13660a8de6afbe8a4381dd50d7" Dec 04 22:11:06.030593 master-0 kubenswrapper[8606]: I1204 22:11:06.030193 8606 scope.go:117] "RemoveContainer" containerID="229f04f01e031c096a8d66a3a3b9f5322d73a495869829416a90812a311e2aee" Dec 04 22:11:06.030593 master-0 kubenswrapper[8606]: E1204 22:11:06.030543 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(5e09e2af7200e6f9be469dbfd9bb1127)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="5e09e2af7200e6f9be469dbfd9bb1127" Dec 04 22:11:10.761321 
master-0 kubenswrapper[8606]: E1204 22:11:10.760777 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:11:14.054287 master-0 kubenswrapper[8606]: E1204 22:11:14.054201 8606 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc178afcf_b713_4c74_b22b_6169ba3123f5.slice/crio-ff93336d691c6b7a2b8fa61f8706675e304c74a0c7baba380e59deb94bba182a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc178afcf_b713_4c74_b22b_6169ba3123f5.slice/crio-conmon-ff93336d691c6b7a2b8fa61f8706675e304c74a0c7baba380e59deb94bba182a.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:11:14.104953 master-0 kubenswrapper[8606]: I1204 22:11:14.104857 8606 generic.go:334] "Generic (PLEG): container finished" podID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerID="ff93336d691c6b7a2b8fa61f8706675e304c74a0c7baba380e59deb94bba182a" exitCode=0 Dec 04 22:11:14.104953 master-0 kubenswrapper[8606]: I1204 22:11:14.104920 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerDied","Data":"ff93336d691c6b7a2b8fa61f8706675e304c74a0c7baba380e59deb94bba182a"} Dec 04 22:11:14.104953 master-0 kubenswrapper[8606]: I1204 22:11:14.104961 8606 scope.go:117] "RemoveContainer" containerID="b6b73ce0e49791c4429e0dc83a179c1414cef4c0eb61371c4801b941d91131b9" Dec 04 22:11:14.191211 master-0 kubenswrapper[8606]: I1204 22:11:14.191147 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:11:15.117562 master-0 kubenswrapper[8606]: I1204 22:11:15.117462 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"57c531c30874bf4998e2715db0baeccbcbac79537b43a4f58e8644a7f87789e1"} Dec 04 22:11:15.940994 master-0 kubenswrapper[8606]: I1204 22:11:15.940913 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:11:15.944348 master-0 kubenswrapper[8606]: I1204 22:11:15.944290 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:15.944348 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:15.944348 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:15.944348 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:15.944726 master-0 kubenswrapper[8606]: I1204 22:11:15.944369 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:16.945255 master-0 kubenswrapper[8606]: I1204 22:11:16.945124 8606 patch_prober.go:28] interesting 
pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:16.945255 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:16.945255 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:16.945255 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:16.946243 master-0 kubenswrapper[8606]: I1204 22:11:16.945286 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:17.944759 master-0 kubenswrapper[8606]: I1204 22:11:17.944681 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:17.944759 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:17.944759 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:17.944759 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:17.945188 master-0 kubenswrapper[8606]: I1204 22:11:17.944765 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:18.903223 master-0 kubenswrapper[8606]: I1204 22:11:18.903132 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-rev/0.log" Dec 04 22:11:18.904580 master-0 kubenswrapper[8606]: I1204 22:11:18.904487 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-metrics/0.log" Dec 04 22:11:18.905639 master-0 kubenswrapper[8606]: I1204 22:11:18.905601 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcdctl/0.log" Dec 04 22:11:18.907204 master-0 kubenswrapper[8606]: I1204 22:11:18.907154 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 04 22:11:18.941289 master-0 kubenswrapper[8606]: I1204 22:11:18.941202 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:11:18.944402 master-0 kubenswrapper[8606]: I1204 22:11:18.944343 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:18.944402 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:18.944402 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:18.944402 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:18.944714 master-0 kubenswrapper[8606]: I1204 22:11:18.944415 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:18.996439 master-0 kubenswrapper[8606]: I1204 22:11:18.996346 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 04 22:11:18.996439 master-0 kubenswrapper[8606]: I1204 22:11:18.996442 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 04 22:11:18.996876 master-0 kubenswrapper[8606]: I1204 22:11:18.996461 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:11:18.996876 master-0 kubenswrapper[8606]: I1204 22:11:18.996492 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 04 22:11:18.996876 master-0 kubenswrapper[8606]: I1204 22:11:18.996561 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:11:18.996876 master-0 kubenswrapper[8606]: I1204 22:11:18.996585 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir" (OuterVolumeSpecName: "data-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:11:18.996876 master-0 kubenswrapper[8606]: I1204 22:11:18.996736 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 04 22:11:18.996876 master-0 kubenswrapper[8606]: I1204 22:11:18.996783 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir" (OuterVolumeSpecName: "log-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:11:18.996876 master-0 kubenswrapper[8606]: I1204 22:11:18.996791 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 04 22:11:18.996876 master-0 kubenswrapper[8606]: I1204 22:11:18.996860 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 04 22:11:18.997440 master-0 kubenswrapper[8606]: I1204 22:11:18.996914 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:11:18.997440 master-0 kubenswrapper[8606]: I1204 22:11:18.996990 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "usr-local-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:11:18.997967 master-0 kubenswrapper[8606]: I1204 22:11:18.997904 8606 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:18.998059 master-0 kubenswrapper[8606]: I1204 22:11:18.997965 8606 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:18.998059 master-0 kubenswrapper[8606]: I1204 22:11:18.997995 8606 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:18.998059 master-0 kubenswrapper[8606]: I1204 22:11:18.998016 8606 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:18.998059 master-0 kubenswrapper[8606]: I1204 22:11:18.998033 8606 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:18.998059 master-0 kubenswrapper[8606]: I1204 22:11:18.998052 8606 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Dec 04 22:11:19.152179 master-0 kubenswrapper[8606]: I1204 22:11:19.152026 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-rev/0.log" Dec 04 22:11:19.153475 master-0 kubenswrapper[8606]: I1204 22:11:19.153413 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-metrics/0.log" Dec 04 22:11:19.154468 master-0 kubenswrapper[8606]: I1204 22:11:19.154421 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcdctl/0.log" Dec 04 22:11:19.155786 master-0 kubenswrapper[8606]: I1204 22:11:19.155724 8606 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00" exitCode=0 Dec 04 22:11:19.155786 master-0 kubenswrapper[8606]: I1204 22:11:19.155780 8606 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865" exitCode=137 Dec 04 22:11:19.155931 master-0 kubenswrapper[8606]: I1204 22:11:19.155826 8606 scope.go:117] "RemoveContainer" containerID="9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d" Dec 04 22:11:19.155931 master-0 kubenswrapper[8606]: I1204 22:11:19.155862 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 04 22:11:19.180812 master-0 kubenswrapper[8606]: I1204 22:11:19.180711 8606 scope.go:117] "RemoveContainer" containerID="6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459" Dec 04 22:11:19.213228 master-0 kubenswrapper[8606]: I1204 22:11:19.213056 8606 scope.go:117] "RemoveContainer" containerID="e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4" Dec 04 22:11:19.241154 master-0 kubenswrapper[8606]: I1204 22:11:19.240958 8606 scope.go:117] "RemoveContainer" containerID="a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00" Dec 04 22:11:19.267159 master-0 kubenswrapper[8606]: I1204 22:11:19.267075 8606 scope.go:117] "RemoveContainer" containerID="c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865" Dec 04 22:11:19.294649 master-0 kubenswrapper[8606]: I1204 22:11:19.294393 8606 scope.go:117] "RemoveContainer" containerID="5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd" Dec 04 22:11:19.325145 master-0 kubenswrapper[8606]: I1204 22:11:19.324787 8606 scope.go:117] "RemoveContainer" containerID="38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d" Dec 04 22:11:19.356715 master-0 kubenswrapper[8606]: I1204 22:11:19.356464 8606 scope.go:117] "RemoveContainer" containerID="e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c" Dec 04 22:11:19.387001 master-0 kubenswrapper[8606]: I1204 22:11:19.386803 8606 scope.go:117] "RemoveContainer" containerID="9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d" Dec 04 22:11:19.387349 master-0 kubenswrapper[8606]: E1204 22:11:19.387300 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d\": container with ID starting with 9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d not found: ID does not exist" containerID="9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d" Dec 04 22:11:19.387449 master-0 kubenswrapper[8606]: I1204 22:11:19.387360 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d"} err="failed to get container status \"9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d\": rpc error: code = NotFound desc = could not find container \"9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d\": container with ID starting with 9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d not found: ID does not exist" Dec 04 22:11:19.387449 master-0 kubenswrapper[8606]: I1204 22:11:19.387406 8606 scope.go:117] "RemoveContainer" containerID="6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459" Dec 04 22:11:19.388063 master-0 kubenswrapper[8606]: E1204 22:11:19.388013 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459\": container with ID starting with 6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459 not found: ID does not exist" containerID="6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459" Dec 04 22:11:19.388188 master-0 kubenswrapper[8606]: I1204 22:11:19.388060 8606 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459"} err="failed to get container status \"6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459\": rpc error: code = NotFound desc = could not find container \"6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459\": container with ID starting with 6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459 not found: ID does not exist" Dec 04 22:11:19.388188 master-0 kubenswrapper[8606]: I1204 22:11:19.388091 8606 scope.go:117] "RemoveContainer" containerID="e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4" Dec 04 22:11:19.388728 master-0 kubenswrapper[8606]: E1204 22:11:19.388437 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4\": container with ID starting with e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4 not found: ID does not exist" containerID="e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4" Dec 04 22:11:19.388728 master-0 kubenswrapper[8606]: I1204 22:11:19.388476 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4"} err="failed to get container status \"e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4\": rpc error: code = NotFound desc = could not find container \"e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4\": container with ID starting with e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4 not found: ID does not exist" Dec 04 22:11:19.388728 master-0 kubenswrapper[8606]: I1204 22:11:19.388538 8606 scope.go:117] "RemoveContainer" containerID="a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00" Dec 04 22:11:19.389409 master-0 kubenswrapper[8606]: E1204 22:11:19.389128 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00\": container with ID starting with a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00 not found: ID does not exist" containerID="a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00" Dec 04 22:11:19.389409 master-0 kubenswrapper[8606]: I1204 22:11:19.389167 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00"} err="failed to get container status \"a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00\": rpc error: code = NotFound desc = could not find container \"a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00\": container with ID starting with a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00 not found: ID does not exist" Dec 04 22:11:19.389409 master-0 kubenswrapper[8606]: I1204 22:11:19.389197 8606 scope.go:117] "RemoveContainer" containerID="c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865" Dec 04 22:11:19.389801 master-0 kubenswrapper[8606]: E1204 22:11:19.389749 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865\": container with ID starting with 
c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865 not found: ID does not exist" containerID="c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865" Dec 04 22:11:19.389890 master-0 kubenswrapper[8606]: I1204 22:11:19.389802 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865"} err="failed to get container status \"c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865\": rpc error: code = NotFound desc = could not find container \"c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865\": container with ID starting with c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865 not found: ID does not exist" Dec 04 22:11:19.389890 master-0 kubenswrapper[8606]: I1204 22:11:19.389840 8606 scope.go:117] "RemoveContainer" containerID="5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: E1204 22:11:19.390400 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd\": container with ID starting with 5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd not found: ID does not exist" containerID="5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.390454 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd"} err="failed to get container status \"5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd\": rpc error: code = NotFound desc = could not find container \"5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd\": container with ID starting with 5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd not found: ID does not exist" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.390492 8606 scope.go:117] "RemoveContainer" containerID="38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: E1204 22:11:19.391090 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d\": container with ID starting with 38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d not found: ID does not exist" containerID="38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.391162 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d"} err="failed to get container status \"38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d\": rpc error: code = NotFound desc = could not find container \"38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d\": container with ID starting with 38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d not found: ID does not exist" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.391212 8606 scope.go:117] "RemoveContainer" containerID="e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c" Dec 04 22:11:19.394567 master-0 
kubenswrapper[8606]: E1204 22:11:19.391724 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c\": container with ID starting with e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c not found: ID does not exist" containerID="e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.391758 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c"} err="failed to get container status \"e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c\": rpc error: code = NotFound desc = could not find container \"e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c\": container with ID starting with e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c not found: ID does not exist" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.391784 8606 scope.go:117] "RemoveContainer" containerID="9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.392238 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d"} err="failed to get container status \"9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d\": rpc error: code = NotFound desc = could not find container \"9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d\": container with ID starting with 9f8328e5c32599f0b86739f6ededc94f8fd989d3029a17c83c0820a8e233cb8d not found: ID does not exist" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.392268 8606 scope.go:117] "RemoveContainer" containerID="6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.394382 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459"} err="failed to get container status \"6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459\": rpc error: code = NotFound desc = could not find container \"6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459\": container with ID starting with 6da0eb90f54a5797c8d3275b6fd2177ce6f0a932abe6de324f8cbcb14505d459 not found: ID does not exist" Dec 04 22:11:19.394567 master-0 kubenswrapper[8606]: I1204 22:11:19.394425 8606 scope.go:117] "RemoveContainer" containerID="e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4" Dec 04 22:11:19.395348 master-0 kubenswrapper[8606]: I1204 22:11:19.394993 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4"} err="failed to get container status \"e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4\": rpc error: code = NotFound desc = could not find container \"e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4\": container with ID starting with e4944d73540d410491242b077fcd7f190f94c648b18e396a4150691083f0e3b4 not found: ID does not exist" Dec 04 22:11:19.395348 master-0 kubenswrapper[8606]: I1204 22:11:19.395022 8606 scope.go:117] "RemoveContainer" 
containerID="a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00" Dec 04 22:11:19.395498 master-0 kubenswrapper[8606]: I1204 22:11:19.395451 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00"} err="failed to get container status \"a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00\": rpc error: code = NotFound desc = could not find container \"a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00\": container with ID starting with a69fb28ccb56aed47a5f648d78ee05d771b75426db5b7670e6fb84ea1da05e00 not found: ID does not exist" Dec 04 22:11:19.395498 master-0 kubenswrapper[8606]: I1204 22:11:19.395487 8606 scope.go:117] "RemoveContainer" containerID="c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865" Dec 04 22:11:19.395880 master-0 kubenswrapper[8606]: I1204 22:11:19.395831 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865"} err="failed to get container status \"c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865\": rpc error: code = NotFound desc = could not find container \"c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865\": container with ID starting with c6670e5461b1353b2128b2dcf8a20b1bfdfbd7090dd8c3b50619738b6c5f6865 not found: ID does not exist" Dec 04 22:11:19.395880 master-0 kubenswrapper[8606]: I1204 22:11:19.395869 8606 scope.go:117] "RemoveContainer" containerID="5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd" Dec 04 22:11:19.396310 master-0 kubenswrapper[8606]: I1204 22:11:19.396258 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd"} err="failed to get container status \"5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd\": rpc error: code = NotFound desc = could not find container \"5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd\": container with ID starting with 5fa9168362870db45ee28341d521b1ed3e3b716d968bedca0faac2eb04c852cd not found: ID does not exist" Dec 04 22:11:19.396310 master-0 kubenswrapper[8606]: I1204 22:11:19.396299 8606 scope.go:117] "RemoveContainer" containerID="38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d" Dec 04 22:11:19.396709 master-0 kubenswrapper[8606]: I1204 22:11:19.396661 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d"} err="failed to get container status \"38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d\": rpc error: code = NotFound desc = could not find container \"38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d\": container with ID starting with 38b299f2266359898070e8e873b514708c2f57a523003566696531ac7aabe43d not found: ID does not exist" Dec 04 22:11:19.396709 master-0 kubenswrapper[8606]: I1204 22:11:19.396699 8606 scope.go:117] "RemoveContainer" containerID="e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c" Dec 04 22:11:19.397126 master-0 kubenswrapper[8606]: I1204 22:11:19.397070 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c"} err="failed to get container status 
\"e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c\": rpc error: code = NotFound desc = could not find container \"e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c\": container with ID starting with e6966d362ea6c46b4acc0dea1e8eeec75546f100e9c2ecf9804f2df40891964c not found: ID does not exist" Dec 04 22:11:19.405558 master-0 kubenswrapper[8606]: I1204 22:11:19.405431 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24e01603234fe8003f8aae8171b0065" path="/var/lib/kubelet/pods/c24e01603234fe8003f8aae8171b0065/volumes" Dec 04 22:11:19.944448 master-0 kubenswrapper[8606]: I1204 22:11:19.944342 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:19.944448 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:19.944448 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:19.944448 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:19.944448 master-0 kubenswrapper[8606]: I1204 22:11:19.944428 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:20.392805 master-0 kubenswrapper[8606]: I1204 22:11:20.392701 8606 scope.go:117] "RemoveContainer" containerID="229f04f01e031c096a8d66a3a3b9f5322d73a495869829416a90812a311e2aee" Dec 04 22:11:20.761900 master-0 kubenswrapper[8606]: E1204 22:11:20.761835 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:11:20.944106 master-0 kubenswrapper[8606]: I1204 22:11:20.943930 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:20.944106 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:20.944106 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:20.944106 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:20.944106 master-0 kubenswrapper[8606]: I1204 22:11:20.944051 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:21.178239 master-0 kubenswrapper[8606]: I1204 22:11:21.178040 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"b2de34afcf16d55af0ab629ff305e02bc4e8c470038e92112248dabc18c8bf30"} Dec 04 22:11:21.943618 master-0 kubenswrapper[8606]: I1204 22:11:21.943473 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 
22:11:21.943618 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:21.943618 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:21.943618 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:21.944166 master-0 kubenswrapper[8606]: I1204 22:11:21.943617 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:22.306500 master-0 kubenswrapper[8606]: E1204 22:11:22.306351 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.187e22b12a1b6cfa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:c24e01603234fe8003f8aae8171b0065,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:10:48.286809338 +0000 UTC m=+613.097111613,LastTimestamp:2025-12-04 22:10:48.286809338 +0000 UTC m=+613.097111613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:11:22.945714 master-0 kubenswrapper[8606]: I1204 22:11:22.945597 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:22.945714 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:22.945714 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:22.945714 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:22.946242 master-0 kubenswrapper[8606]: I1204 22:11:22.945724 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:23.945091 master-0 kubenswrapper[8606]: I1204 22:11:23.944944 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:23.945091 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:23.945091 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:23.945091 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:23.946132 master-0 kubenswrapper[8606]: I1204 22:11:23.945119 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:24.944616 master-0 kubenswrapper[8606]: I1204 22:11:24.944471 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:24.944616 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:24.944616 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:24.944616 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:24.945099 master-0 kubenswrapper[8606]: I1204 22:11:24.944608 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:25.211400 master-0 kubenswrapper[8606]: I1204 22:11:25.211265 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/3.log" Dec 04 22:11:25.212691 master-0 kubenswrapper[8606]: I1204 22:11:25.212630 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/2.log" Dec 04 22:11:25.213672 master-0 kubenswrapper[8606]: I1204 22:11:25.213606 8606 generic.go:334] "Generic (PLEG): container finished" podID="addddaac-a31a-4dbf-b78f-87225b11b463" containerID="8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271" exitCode=1 Dec 04 22:11:25.213752 master-0 kubenswrapper[8606]: I1204 22:11:25.213679 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerDied","Data":"8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271"} Dec 04 22:11:25.213800 master-0 kubenswrapper[8606]: I1204 22:11:25.213783 8606 scope.go:117] "RemoveContainer" containerID="b0ed4a9fe3a00c4a1d8cd842129319027df79a8ddd43dbd8b1904f3bc2dc1059" Dec 04 22:11:25.214736 master-0 kubenswrapper[8606]: I1204 22:11:25.214684 8606 scope.go:117] "RemoveContainer" containerID="8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271" Dec 04 22:11:25.215166 master-0 kubenswrapper[8606]: E1204 22:11:25.215113 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:11:25.945682 master-0 kubenswrapper[8606]: I1204 22:11:25.945608 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:25.945682 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:25.945682 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:25.945682 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:25.946400 master-0 kubenswrapper[8606]: I1204 22:11:25.945695 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:26.225492 master-0 
kubenswrapper[8606]: I1204 22:11:26.225324 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/3.log" Dec 04 22:11:26.944443 master-0 kubenswrapper[8606]: I1204 22:11:26.944321 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:26.944443 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:26.944443 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:26.944443 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:26.944443 master-0 kubenswrapper[8606]: I1204 22:11:26.944422 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:27.945057 master-0 kubenswrapper[8606]: I1204 22:11:27.944946 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:27.945057 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:27.945057 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:27.945057 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:27.945057 master-0 kubenswrapper[8606]: I1204 22:11:27.945049 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:28.944471 master-0 kubenswrapper[8606]: I1204 22:11:28.944362 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:28.944471 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:28.944471 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:28.944471 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:28.944471 master-0 kubenswrapper[8606]: I1204 22:11:28.944454 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:29.391992 master-0 kubenswrapper[8606]: I1204 22:11:29.391896 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 04 22:11:29.421263 master-0 kubenswrapper[8606]: I1204 22:11:29.421166 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:11:29.421263 master-0 kubenswrapper[8606]: I1204 22:11:29.421266 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:11:29.943959 master-0 kubenswrapper[8606]: I1204 22:11:29.943865 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:29.943959 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:29.943959 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:29.943959 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:29.944392 master-0 kubenswrapper[8606]: I1204 22:11:29.943965 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:30.762548 master-0 kubenswrapper[8606]: E1204 22:11:30.762430 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 22:11:30.943984 master-0 kubenswrapper[8606]: I1204 22:11:30.943869 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:30.943984 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:30.943984 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:30.943984 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:30.943984 master-0 kubenswrapper[8606]: I1204 22:11:30.943964 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:31.943920 master-0 kubenswrapper[8606]: I1204 22:11:31.943817 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:31.943920 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:31.943920 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:31.943920 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:31.943920 master-0 kubenswrapper[8606]: I1204 22:11:31.943918 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:32.944111 master-0 kubenswrapper[8606]: I1204 22:11:32.944029 8606 
patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:32.944111 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:32.944111 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:32.944111 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:32.944111 master-0 kubenswrapper[8606]: I1204 22:11:32.944117 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:33.943975 master-0 kubenswrapper[8606]: I1204 22:11:33.943885 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:33.943975 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:33.943975 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:33.943975 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:33.944700 master-0 kubenswrapper[8606]: I1204 22:11:33.943988 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:34.944478 master-0 kubenswrapper[8606]: I1204 22:11:34.944375 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:34.944478 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:34.944478 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:34.944478 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:34.944478 master-0 kubenswrapper[8606]: I1204 22:11:34.944472 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:35.391877 master-0 kubenswrapper[8606]: I1204 22:11:35.391795 8606 scope.go:117] "RemoveContainer" containerID="8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271" Dec 04 22:11:35.392273 master-0 kubenswrapper[8606]: E1204 22:11:35.392200 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:11:35.944527 master-0 kubenswrapper[8606]: I1204 22:11:35.944420 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:35.944527 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:35.944527 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:35.944527 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:35.945840 master-0 kubenswrapper[8606]: I1204 22:11:35.944576 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:36.944112 master-0 kubenswrapper[8606]: I1204 22:11:36.944031 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:36.944112 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:36.944112 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:36.944112 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:36.945299 master-0 kubenswrapper[8606]: I1204 22:11:36.944113 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:37.944084 master-0 kubenswrapper[8606]: I1204 22:11:37.943994 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:37.944084 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:37.944084 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:37.944084 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:37.944494 master-0 kubenswrapper[8606]: I1204 22:11:37.944095 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:38.335745 master-0 kubenswrapper[8606]: I1204 22:11:38.335626 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-nk92d_634c1df6-de4d-4e26-8c71-d39311cae0ce/approver/1.log" Dec 04 22:11:38.336862 master-0 kubenswrapper[8606]: I1204 22:11:38.336450 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-nk92d_634c1df6-de4d-4e26-8c71-d39311cae0ce/approver/0.log" Dec 04 22:11:38.337046 master-0 kubenswrapper[8606]: I1204 22:11:38.336974 8606 generic.go:334] "Generic (PLEG): container finished" podID="634c1df6-de4d-4e26-8c71-d39311cae0ce" containerID="a1cf3561aceffd368c0ca4d9cb40d000ada9182cc1eeba5246056ae555e8f11e" exitCode=1 Dec 04 22:11:38.337046 master-0 kubenswrapper[8606]: I1204 22:11:38.337031 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerDied","Data":"a1cf3561aceffd368c0ca4d9cb40d000ada9182cc1eeba5246056ae555e8f11e"} Dec 04 22:11:38.337261 master-0 
kubenswrapper[8606]: I1204 22:11:38.337088 8606 scope.go:117] "RemoveContainer" containerID="a679264390b031ae4f297359e8c908ad01e2a92651d2cb70742a5a02fd398618" Dec 04 22:11:38.337906 master-0 kubenswrapper[8606]: I1204 22:11:38.337855 8606 scope.go:117] "RemoveContainer" containerID="a1cf3561aceffd368c0ca4d9cb40d000ada9182cc1eeba5246056ae555e8f11e" Dec 04 22:11:38.338224 master-0 kubenswrapper[8606]: E1204 22:11:38.338171 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"approver\" with CrashLoopBackOff: \"back-off 10s restarting failed container=approver pod=network-node-identity-nk92d_openshift-network-node-identity(634c1df6-de4d-4e26-8c71-d39311cae0ce)\"" pod="openshift-network-node-identity/network-node-identity-nk92d" podUID="634c1df6-de4d-4e26-8c71-d39311cae0ce" Dec 04 22:11:38.944979 master-0 kubenswrapper[8606]: I1204 22:11:38.944895 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:38.944979 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:38.944979 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:38.944979 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:38.944979 master-0 kubenswrapper[8606]: I1204 22:11:38.944979 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:39.347974 master-0 kubenswrapper[8606]: I1204 22:11:39.347893 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-nk92d_634c1df6-de4d-4e26-8c71-d39311cae0ce/approver/1.log" Dec 04 22:11:39.944143 master-0 kubenswrapper[8606]: I1204 22:11:39.944044 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:39.944143 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:39.944143 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:39.944143 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:39.944761 master-0 kubenswrapper[8606]: I1204 22:11:39.944147 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:40.763561 master-0 kubenswrapper[8606]: E1204 22:11:40.763439 8606 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:11:40.764131 master-0 kubenswrapper[8606]: I1204 22:11:40.763626 8606 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 04 22:11:40.944365 master-0 kubenswrapper[8606]: I1204 22:11:40.944227 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:40.944365 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:40.944365 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:40.944365 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:40.944365 master-0 kubenswrapper[8606]: I1204 22:11:40.944353 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:41.944445 master-0 kubenswrapper[8606]: I1204 22:11:41.944366 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:41.944445 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:41.944445 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:41.944445 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:41.945544 master-0 kubenswrapper[8606]: I1204 22:11:41.944460 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:42.943987 master-0 kubenswrapper[8606]: I1204 22:11:42.943902 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:42.943987 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:42.943987 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:42.943987 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:42.944412 master-0 kubenswrapper[8606]: I1204 22:11:42.944032 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:43.488270 master-0 kubenswrapper[8606]: I1204 22:11:43.488176 8606 scope.go:117] "RemoveContainer" containerID="0a040d82f9bfd9a8d213b7bca90e959915daaafe371835a7acd200542911284e" Dec 04 22:11:43.945555 master-0 kubenswrapper[8606]: I1204 22:11:43.945391 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:43.945555 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:43.945555 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:43.945555 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:43.945555 master-0 kubenswrapper[8606]: I1204 22:11:43.945535 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 04 22:11:44.944453 master-0 kubenswrapper[8606]: I1204 22:11:44.944367 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:44.944453 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:44.944453 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:44.944453 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:44.945486 master-0 kubenswrapper[8606]: I1204 22:11:44.944476 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:45.943208 master-0 kubenswrapper[8606]: I1204 22:11:45.943124 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:45.943208 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:45.943208 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:45.943208 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:45.943580 master-0 kubenswrapper[8606]: I1204 22:11:45.943247 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:46.944311 master-0 kubenswrapper[8606]: I1204 22:11:46.944228 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:46.944311 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:46.944311 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:46.944311 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:46.945001 master-0 kubenswrapper[8606]: I1204 22:11:46.944339 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:47.945430 master-0 kubenswrapper[8606]: I1204 22:11:47.945150 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:47.945430 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:47.945430 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:47.945430 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:47.946436 master-0 kubenswrapper[8606]: I1204 22:11:47.945442 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:48.943787 master-0 kubenswrapper[8606]: I1204 22:11:48.943726 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:48.943787 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:48.943787 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:48.943787 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:48.944369 master-0 kubenswrapper[8606]: I1204 22:11:48.943815 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:49.392041 master-0 kubenswrapper[8606]: I1204 22:11:49.391976 8606 scope.go:117] "RemoveContainer" containerID="8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271" Dec 04 22:11:49.392824 master-0 kubenswrapper[8606]: I1204 22:11:49.392172 8606 scope.go:117] "RemoveContainer" containerID="a1cf3561aceffd368c0ca4d9cb40d000ada9182cc1eeba5246056ae555e8f11e" Dec 04 22:11:49.392824 master-0 kubenswrapper[8606]: E1204 22:11:49.392393 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:11:49.944196 master-0 kubenswrapper[8606]: I1204 22:11:49.944113 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:49.944196 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:49.944196 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:49.944196 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:49.944655 master-0 kubenswrapper[8606]: I1204 22:11:49.944199 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:50.437460 master-0 kubenswrapper[8606]: I1204 22:11:50.437343 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-nk92d_634c1df6-de4d-4e26-8c71-d39311cae0ce/approver/1.log" Dec 04 22:11:50.438447 master-0 kubenswrapper[8606]: I1204 22:11:50.438065 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerStarted","Data":"2b90cccc4060f63e3151c04577b704a9e40c2c1995c15db065507afb9359b261"} Dec 04 22:11:50.764197 master-0 kubenswrapper[8606]: E1204 22:11:50.763936 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="200ms" Dec 04 22:11:50.945020 master-0 kubenswrapper[8606]: I1204 22:11:50.944909 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:50.945020 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:50.945020 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:50.945020 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:50.945430 master-0 kubenswrapper[8606]: I1204 22:11:50.945026 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:51.944453 master-0 kubenswrapper[8606]: I1204 22:11:51.944367 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:51.944453 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:51.944453 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:51.944453 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:51.945486 master-0 kubenswrapper[8606]: I1204 22:11:51.944465 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:52.943737 master-0 kubenswrapper[8606]: I1204 22:11:52.943662 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:52.943737 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:52.943737 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:52.943737 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:52.944079 master-0 kubenswrapper[8606]: I1204 22:11:52.943757 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:53.945262 master-0 kubenswrapper[8606]: I1204 22:11:53.945193 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:53.945262 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:53.945262 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:53.945262 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:53.945995 master-0 kubenswrapper[8606]: I1204 22:11:53.945290 8606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:54.945074 master-0 kubenswrapper[8606]: I1204 22:11:54.944981 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:54.945074 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:54.945074 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:54.945074 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:54.945857 master-0 kubenswrapper[8606]: I1204 22:11:54.945089 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:55.944419 master-0 kubenswrapper[8606]: I1204 22:11:55.944351 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:55.944419 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:55.944419 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:55.944419 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:55.944759 master-0 kubenswrapper[8606]: I1204 22:11:55.944429 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:56.310291 master-0 kubenswrapper[8606]: E1204 22:11:56.310019 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{kube-controller-manager-master-0.187e2263646d1980 openshift-kube-controller-manager 9868 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-master-0,UID:fad55397ac8e23f218f25cb714ea5b2b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:05:14 +0000 UTC,LastTimestamp:2025-12-04 22:11:01.989627957 +0000 UTC m=+626.799930212,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:11:56.944271 master-0 kubenswrapper[8606]: I1204 22:11:56.944164 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:56.944271 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 
22:11:56.944271 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:56.944271 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:56.944271 master-0 kubenswrapper[8606]: I1204 22:11:56.944254 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:57.944382 master-0 kubenswrapper[8606]: I1204 22:11:57.944299 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:57.944382 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:57.944382 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:57.944382 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:57.945098 master-0 kubenswrapper[8606]: I1204 22:11:57.944415 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:58.943309 master-0 kubenswrapper[8606]: I1204 22:11:58.943193 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:58.943309 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:58.943309 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:58.943309 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:58.943309 master-0 kubenswrapper[8606]: I1204 22:11:58.943263 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:11:59.945435 master-0 kubenswrapper[8606]: I1204 22:11:59.945345 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:11:59.945435 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:11:59.945435 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:11:59.945435 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:11:59.946421 master-0 kubenswrapper[8606]: I1204 22:11:59.945442 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:00.391973 master-0 kubenswrapper[8606]: I1204 22:12:00.391878 8606 scope.go:117] "RemoveContainer" containerID="8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271" Dec 04 22:12:00.392444 master-0 kubenswrapper[8606]: E1204 22:12:00.392308 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 
40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:12:00.944606 master-0 kubenswrapper[8606]: I1204 22:12:00.944490 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:00.944606 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:00.944606 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:00.944606 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:00.944935 master-0 kubenswrapper[8606]: I1204 22:12:00.944626 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:00.965281 master-0 kubenswrapper[8606]: E1204 22:12:00.965052 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Dec 04 22:12:01.945468 master-0 kubenswrapper[8606]: I1204 22:12:01.945365 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:01.945468 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:01.945468 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:01.945468 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:01.945928 master-0 kubenswrapper[8606]: I1204 22:12:01.945468 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:01.990093 master-0 kubenswrapper[8606]: I1204 22:12:01.989983 8606 status_manager.go:851] "Failed to get status for pod" podUID="fad55397ac8e23f218f25cb714ea5b2b" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods kube-controller-manager-master-0)" Dec 04 22:12:02.945420 master-0 kubenswrapper[8606]: I1204 22:12:02.945177 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:02.945420 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:02.945420 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:02.945420 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:02.945420 master-0 kubenswrapper[8606]: I1204 22:12:02.945327 8606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:03.424759 master-0 kubenswrapper[8606]: E1204 22:12:03.424676 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 04 22:12:03.425843 master-0 kubenswrapper[8606]: I1204 22:12:03.425787 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 04 22:12:03.453490 master-0 kubenswrapper[8606]: W1204 22:12:03.452662 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d12e893528ad53a994f10901a644ea.slice/crio-f09554581756c0f7969c0e40141de26184513807dc2c20e4d5041730923d9d0c WatchSource:0}: Error finding container f09554581756c0f7969c0e40141de26184513807dc2c20e4d5041730923d9d0c: Status 404 returned error can't find the container with id f09554581756c0f7969c0e40141de26184513807dc2c20e4d5041730923d9d0c Dec 04 22:12:03.555783 master-0 kubenswrapper[8606]: I1204 22:12:03.555688 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"f09554581756c0f7969c0e40141de26184513807dc2c20e4d5041730923d9d0c"} Dec 04 22:12:03.945262 master-0 kubenswrapper[8606]: I1204 22:12:03.945132 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:03.945262 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:03.945262 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:03.945262 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:03.945262 master-0 kubenswrapper[8606]: I1204 22:12:03.945214 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:04.574079 master-0 kubenswrapper[8606]: I1204 22:12:04.574011 8606 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="553d3584b8fff905a7e34ad91d98d0f31e54579b68090cae0d50c0891bc22dd5" exitCode=0 Dec 04 22:12:04.577342 master-0 kubenswrapper[8606]: I1204 22:12:04.574123 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"553d3584b8fff905a7e34ad91d98d0f31e54579b68090cae0d50c0891bc22dd5"} Dec 04 22:12:04.577648 master-0 kubenswrapper[8606]: I1204 22:12:04.574633 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:12:04.577814 master-0 kubenswrapper[8606]: I1204 22:12:04.577783 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:12:04.943610 master-0 kubenswrapper[8606]: I1204 22:12:04.943374 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:04.943610 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:04.943610 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:04.943610 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:04.943610 master-0 kubenswrapper[8606]: I1204 22:12:04.943444 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:05.944430 master-0 kubenswrapper[8606]: I1204 22:12:05.944344 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:05.944430 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:05.944430 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:05.944430 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:05.945788 master-0 kubenswrapper[8606]: I1204 22:12:05.944453 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:06.944928 master-0 kubenswrapper[8606]: I1204 22:12:06.944791 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:06.944928 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:06.944928 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:06.944928 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:06.945995 master-0 kubenswrapper[8606]: I1204 22:12:06.944959 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:07.943925 master-0 kubenswrapper[8606]: I1204 22:12:07.943838 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:07.943925 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:07.943925 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:07.943925 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:07.944274 master-0 kubenswrapper[8606]: I1204 22:12:07.943958 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:08.945376 master-0 kubenswrapper[8606]: I1204 22:12:08.945282 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:08.945376 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:08.945376 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:08.945376 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:08.946192 master-0 kubenswrapper[8606]: I1204 22:12:08.945404 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:09.944342 master-0 kubenswrapper[8606]: I1204 22:12:09.944238 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:09.944342 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:09.944342 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:09.944342 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:09.944863 master-0 kubenswrapper[8606]: I1204 22:12:09.944344 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:10.944782 master-0 kubenswrapper[8606]: I1204 22:12:10.944683 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:10.944782 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:10.944782 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:10.944782 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:10.944782 master-0 kubenswrapper[8606]: I1204 22:12:10.944775 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:11.366723 master-0 kubenswrapper[8606]: E1204 22:12:11.366558 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Dec 04 22:12:11.944347 master-0 kubenswrapper[8606]: I1204 22:12:11.944252 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:11.944347 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:11.944347 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:11.944347 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:11.944347 master-0 kubenswrapper[8606]: I1204 22:12:11.944339 8606 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:12.391453 master-0 kubenswrapper[8606]: I1204 22:12:12.391370 8606 scope.go:117] "RemoveContainer" containerID="8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271" Dec 04 22:12:12.645347 master-0 kubenswrapper[8606]: I1204 22:12:12.645130 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/3.log" Dec 04 22:12:12.645902 master-0 kubenswrapper[8606]: I1204 22:12:12.645817 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6"} Dec 04 22:12:12.944990 master-0 kubenswrapper[8606]: I1204 22:12:12.944798 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:12.944990 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:12.944990 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:12.944990 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:12.944990 master-0 kubenswrapper[8606]: I1204 22:12:12.944876 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:13.944744 master-0 kubenswrapper[8606]: I1204 22:12:13.944664 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:13.944744 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:13.944744 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:13.944744 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:13.945031 master-0 kubenswrapper[8606]: I1204 22:12:13.944766 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:14.236636 master-0 kubenswrapper[8606]: E1204 22:12:14.236399 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:12:04Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:12:04Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:12:04Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:12:04Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:12:14.944368 master-0 kubenswrapper[8606]: I1204 22:12:14.944271 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:14.944368 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:14.944368 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:14.944368 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:14.944788 master-0 kubenswrapper[8606]: I1204 22:12:14.944385 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:15.943588 master-0 kubenswrapper[8606]: I1204 22:12:15.943540 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:15.943588 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:15.943588 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:15.943588 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:15.946767 master-0 kubenswrapper[8606]: I1204 22:12:15.946691 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:16.945890 master-0 kubenswrapper[8606]: I1204 22:12:16.945791 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:16.945890 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:16.945890 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:16.945890 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:16.946667 master-0 kubenswrapper[8606]: I1204 22:12:16.945956 8606 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:17.944087 master-0 kubenswrapper[8606]: I1204 22:12:17.943960 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:17.944087 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:17.944087 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:17.944087 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:17.944719 master-0 kubenswrapper[8606]: I1204 22:12:17.944638 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:18.944961 master-0 kubenswrapper[8606]: I1204 22:12:18.944895 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:18.944961 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:18.944961 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:18.944961 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:18.945714 master-0 kubenswrapper[8606]: I1204 22:12:18.944983 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:19.943811 master-0 kubenswrapper[8606]: I1204 22:12:19.943646 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:19.943811 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:19.943811 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:19.943811 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:19.943811 master-0 kubenswrapper[8606]: I1204 22:12:19.943778 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:20.945138 master-0 kubenswrapper[8606]: I1204 22:12:20.945072 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:20.945138 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:20.945138 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:20.945138 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:20.946385 master-0 kubenswrapper[8606]: I1204 22:12:20.945789 
8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:21.944625 master-0 kubenswrapper[8606]: I1204 22:12:21.944538 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:21.944625 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:21.944625 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:21.944625 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:21.944960 master-0 kubenswrapper[8606]: I1204 22:12:21.944650 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:22.167872 master-0 kubenswrapper[8606]: E1204 22:12:22.167764 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Dec 04 22:12:22.944856 master-0 kubenswrapper[8606]: I1204 22:12:22.944761 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:22.944856 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:22.944856 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:22.944856 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:22.944856 master-0 kubenswrapper[8606]: I1204 22:12:22.944849 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:23.944561 master-0 kubenswrapper[8606]: I1204 22:12:23.944428 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:23.944561 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:23.944561 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:23.944561 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:23.945546 master-0 kubenswrapper[8606]: I1204 22:12:23.944602 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:24.237663 master-0 kubenswrapper[8606]: E1204 22:12:24.237474 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:12:24.944545 master-0 kubenswrapper[8606]: I1204 22:12:24.944424 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:24.944545 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:24.944545 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:24.944545 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:24.945493 master-0 kubenswrapper[8606]: I1204 22:12:24.944577 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:25.943941 master-0 kubenswrapper[8606]: I1204 22:12:25.943838 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:25.943941 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:25.943941 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:25.943941 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:25.944498 master-0 kubenswrapper[8606]: I1204 22:12:25.943937 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:26.944388 master-0 kubenswrapper[8606]: I1204 22:12:26.944284 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:26.944388 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:26.944388 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:26.944388 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:26.945453 master-0 kubenswrapper[8606]: I1204 22:12:26.944392 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:27.944305 master-0 kubenswrapper[8606]: I1204 22:12:27.944213 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:27.944305 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:27.944305 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:27.944305 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:27.945615 master-0 kubenswrapper[8606]: I1204 22:12:27.944305 8606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:28.944464 master-0 kubenswrapper[8606]: I1204 22:12:28.944345 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:28.944464 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:28.944464 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:28.944464 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:28.945768 master-0 kubenswrapper[8606]: I1204 22:12:28.944462 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:29.944417 master-0 kubenswrapper[8606]: I1204 22:12:29.944338 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:29.944417 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:29.944417 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:29.944417 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:29.944798 master-0 kubenswrapper[8606]: I1204 22:12:29.944425 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:30.313717 master-0 kubenswrapper[8606]: E1204 22:12:30.313457 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{kube-controller-manager-master-0.187e226374d89078 openshift-kube-controller-manager 9878 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-master-0,UID:fad55397ac8e23f218f25cb714ea5b2b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:05:14 +0000 UTC,LastTimestamp:2025-12-04 22:11:02.313654819 +0000 UTC m=+627.123957074,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:12:30.943703 master-0 kubenswrapper[8606]: I1204 22:12:30.943627 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:30.943703 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:30.943703 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:30.943703 master-0 kubenswrapper[8606]: healthz 
check failed
Dec 04 22:12:30.943950 master-0 kubenswrapper[8606]: I1204 22:12:30.943747 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:12:31.795366 master-0 kubenswrapper[8606]: I1204 22:12:31.795322 8606 generic.go:334] "Generic (PLEG): container finished" podID="c6a5d14d-0409-4024-b0a8-200fa2594185" containerID="5553ac89bf95798ab2decbb87aaa4e3e8d835fcf542a4ada2023ac05d60471d4" exitCode=0
Dec 04 22:12:31.795964 master-0 kubenswrapper[8606]: I1204 22:12:31.795419 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerDied","Data":"5553ac89bf95798ab2decbb87aaa4e3e8d835fcf542a4ada2023ac05d60471d4"}
Dec 04 22:12:31.796018 master-0 kubenswrapper[8606]: I1204 22:12:31.795985 8606 scope.go:117] "RemoveContainer" containerID="bad146c5ce315f7f5070081135587df7a077e864def57e2c38a773560069cf17"
Dec 04 22:12:31.796850 master-0 kubenswrapper[8606]: I1204 22:12:31.796769 8606 scope.go:117] "RemoveContainer" containerID="5553ac89bf95798ab2decbb87aaa4e3e8d835fcf542a4ada2023ac05d60471d4"
Dec 04 22:12:31.797190 master-0 kubenswrapper[8606]: E1204 22:12:31.797125 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-f797b99b6-m9m4h_openshift-marketplace(c6a5d14d-0409-4024-b0a8-200fa2594185)\"" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" podUID="c6a5d14d-0409-4024-b0a8-200fa2594185"
Dec 04 22:12:31.944051 master-0 kubenswrapper[8606]: I1204 22:12:31.943968 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:12:31.944051 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:12:31.944051 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:12:31.944051 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:12:31.944439 master-0 kubenswrapper[8606]: I1204 22:12:31.944060 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:12:32.268266 master-0 kubenswrapper[8606]: I1204 22:12:32.268199 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h"
Dec 04 22:12:32.268266 master-0 kubenswrapper[8606]: I1204 22:12:32.268271 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h"
Dec 04 22:12:32.807248 master-0 kubenswrapper[8606]: I1204 22:12:32.807175 8606 scope.go:117] "RemoveContainer" containerID="5553ac89bf95798ab2decbb87aaa4e3e8d835fcf542a4ada2023ac05d60471d4"
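The pod_workers.go records above and below show the kubelet holding the crashed marketplace-operator container in CrashLoopBackOff rather than restarting it at once ("back-off 10s"); the same message appears later in this log at 20s for csi-snapshot-controller and at 10s for the catalogd and operator-controller managers. The delay doubles after each failed restart, and the commonly documented kubelet default is a 10s base capped at five minutes. A minimal sketch of that schedule, for orientation only; the base and cap below are those documented defaults, not values taken from this journal:

    from datetime import timedelta

    def crashloop_delays(base=timedelta(seconds=10), cap=timedelta(minutes=5)):
        """Yield the successive CrashLoopBackOff restart delays.

        Assumes the commonly documented kubelet behaviour: the delay starts at
        10s, doubles after every failed restart, and is capped at five minutes.
        Neither value is stated anywhere in this journal.
        """
        delay = base
        while True:
            yield min(delay, cap)
            delay *= 2

    # The first two steps, 10s and 20s, match the back-off values visible in
    # this log for marketplace-operator and csi-snapshot-controller.
    for attempt, delay in zip(range(1, 7), crashloop_delays()):
        print(f"restart {attempt}: wait {int(delay.total_seconds())}s")

Run as-is this prints 10s, 20s, 40s, 80s, 160s and then the 300s cap.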
\"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-f797b99b6-m9m4h_openshift-marketplace(c6a5d14d-0409-4024-b0a8-200fa2594185)\"" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" podUID="c6a5d14d-0409-4024-b0a8-200fa2594185" Dec 04 22:12:32.943998 master-0 kubenswrapper[8606]: I1204 22:12:32.943943 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:32.943998 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:32.943998 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:32.943998 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:32.944538 master-0 kubenswrapper[8606]: I1204 22:12:32.944465 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:33.769570 master-0 kubenswrapper[8606]: E1204 22:12:33.769461 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 04 22:12:33.944475 master-0 kubenswrapper[8606]: I1204 22:12:33.944391 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:33.944475 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:33.944475 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:33.944475 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:33.945480 master-0 kubenswrapper[8606]: I1204 22:12:33.944494 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:34.238073 master-0 kubenswrapper[8606]: E1204 22:12:34.237906 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:12:34.949249 master-0 kubenswrapper[8606]: I1204 22:12:34.949171 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:34.949249 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:34.949249 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:34.949249 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:34.949997 master-0 kubenswrapper[8606]: I1204 22:12:34.949290 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" 
podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:35.944426 master-0 kubenswrapper[8606]: I1204 22:12:35.944306 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:35.944426 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:35.944426 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:35.944426 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:35.944885 master-0 kubenswrapper[8606]: I1204 22:12:35.944420 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:36.943292 master-0 kubenswrapper[8606]: I1204 22:12:36.943217 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:36.943292 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:36.943292 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:36.943292 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:36.944289 master-0 kubenswrapper[8606]: I1204 22:12:36.943292 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:37.944826 master-0 kubenswrapper[8606]: I1204 22:12:37.944668 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:37.944826 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:37.944826 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:37.944826 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:37.945834 master-0 kubenswrapper[8606]: I1204 22:12:37.944833 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:38.581559 master-0 kubenswrapper[8606]: E1204 22:12:38.581447 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 04 22:12:38.944637 master-0 kubenswrapper[8606]: I1204 22:12:38.944555 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:38.944637 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:38.944637 master-0 kubenswrapper[8606]: 
[+]process-running ok Dec 04 22:12:38.944637 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:38.945541 master-0 kubenswrapper[8606]: I1204 22:12:38.944664 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:39.867542 master-0 kubenswrapper[8606]: I1204 22:12:39.867413 8606 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="ff5921da732d05f72d82c0539e6f4661b6512a68939a31fe1f83fa7bbd8cf1d4" exitCode=0 Dec 04 22:12:39.867542 master-0 kubenswrapper[8606]: I1204 22:12:39.867484 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"ff5921da732d05f72d82c0539e6f4661b6512a68939a31fe1f83fa7bbd8cf1d4"} Dec 04 22:12:39.868315 master-0 kubenswrapper[8606]: I1204 22:12:39.867938 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:12:39.868315 master-0 kubenswrapper[8606]: I1204 22:12:39.867962 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:12:39.944461 master-0 kubenswrapper[8606]: I1204 22:12:39.944385 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:39.944461 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:39.944461 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:39.944461 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:39.944831 master-0 kubenswrapper[8606]: I1204 22:12:39.944469 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:40.944828 master-0 kubenswrapper[8606]: I1204 22:12:40.944701 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:40.944828 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:40.944828 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:40.944828 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:40.945487 master-0 kubenswrapper[8606]: I1204 22:12:40.944858 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:41.944661 master-0 kubenswrapper[8606]: I1204 22:12:41.944562 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:41.944661 master-0 
kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:41.944661 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:41.944661 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:41.944661 master-0 kubenswrapper[8606]: I1204 22:12:41.944646 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:42.943942 master-0 kubenswrapper[8606]: I1204 22:12:42.943839 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:42.943942 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:42.943942 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:42.943942 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:42.943942 master-0 kubenswrapper[8606]: I1204 22:12:42.943925 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:43.943668 master-0 kubenswrapper[8606]: I1204 22:12:43.943617 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:43.943668 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:43.943668 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:43.943668 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:43.944254 master-0 kubenswrapper[8606]: I1204 22:12:43.943688 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:44.239159 master-0 kubenswrapper[8606]: E1204 22:12:44.238938 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:12:44.944104 master-0 kubenswrapper[8606]: I1204 22:12:44.943992 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:44.944104 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:44.944104 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:44.944104 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:44.944104 master-0 kubenswrapper[8606]: I1204 22:12:44.944093 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 
04 22:12:45.392104 master-0 kubenswrapper[8606]: I1204 22:12:45.392018 8606 scope.go:117] "RemoveContainer" containerID="5553ac89bf95798ab2decbb87aaa4e3e8d835fcf542a4ada2023ac05d60471d4" Dec 04 22:12:45.915112 master-0 kubenswrapper[8606]: I1204 22:12:45.915037 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerStarted","Data":"4e5f4d666e715187131125caa7b8db325dd82e37d31be42d4f697d2f2db4f71e"} Dec 04 22:12:45.916152 master-0 kubenswrapper[8606]: I1204 22:12:45.916094 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:12:45.917602 master-0 kubenswrapper[8606]: I1204 22:12:45.917552 8606 patch_prober.go:28] interesting pod/marketplace-operator-f797b99b6-m9m4h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" start-of-body= Dec 04 22:12:45.917749 master-0 kubenswrapper[8606]: I1204 22:12:45.917605 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" podUID="c6a5d14d-0409-4024-b0a8-200fa2594185" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.5:8080/healthz\": dial tcp 10.128.0.5:8080: connect: connection refused" Dec 04 22:12:45.944738 master-0 kubenswrapper[8606]: I1204 22:12:45.944699 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:45.944738 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:45.944738 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:45.944738 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:45.945362 master-0 kubenswrapper[8606]: I1204 22:12:45.944768 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:46.931291 master-0 kubenswrapper[8606]: I1204 22:12:46.931206 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:12:46.944194 master-0 kubenswrapper[8606]: I1204 22:12:46.944119 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:46.944194 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:46.944194 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:46.944194 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:46.944492 master-0 kubenswrapper[8606]: I1204 22:12:46.944209 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:46.970940 master-0 
kubenswrapper[8606]: E1204 22:12:46.970810 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 04 22:12:47.944799 master-0 kubenswrapper[8606]: I1204 22:12:47.944682 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:47.944799 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:47.944799 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:47.944799 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:47.945475 master-0 kubenswrapper[8606]: I1204 22:12:47.945403 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:48.944069 master-0 kubenswrapper[8606]: I1204 22:12:48.943942 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:48.944069 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:48.944069 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:48.944069 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:48.944069 master-0 kubenswrapper[8606]: I1204 22:12:48.944046 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:49.944920 master-0 kubenswrapper[8606]: I1204 22:12:49.944791 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:49.944920 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:49.944920 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:49.944920 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:49.944920 master-0 kubenswrapper[8606]: I1204 22:12:49.944893 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:50.944546 master-0 kubenswrapper[8606]: I1204 22:12:50.944427 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:50.944546 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:50.944546 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:50.944546 master-0 kubenswrapper[8606]: 
healthz check failed Dec 04 22:12:50.945753 master-0 kubenswrapper[8606]: I1204 22:12:50.944589 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:51.944771 master-0 kubenswrapper[8606]: I1204 22:12:51.944622 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:51.944771 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:51.944771 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:51.944771 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:51.945835 master-0 kubenswrapper[8606]: I1204 22:12:51.944813 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:52.944070 master-0 kubenswrapper[8606]: I1204 22:12:52.943943 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:52.944070 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:52.944070 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:52.944070 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:52.944467 master-0 kubenswrapper[8606]: I1204 22:12:52.944078 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:52.972242 master-0 kubenswrapper[8606]: I1204 22:12:52.972169 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/2.log" Dec 04 22:12:52.972988 master-0 kubenswrapper[8606]: I1204 22:12:52.972929 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/1.log" Dec 04 22:12:52.973086 master-0 kubenswrapper[8606]: I1204 22:12:52.973013 8606 generic.go:334] "Generic (PLEG): container finished" podID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" containerID="db1b72940eea381ffedc858f1def5527cda55e491f0234248167dff13f171166" exitCode=1 Dec 04 22:12:52.973086 master-0 kubenswrapper[8606]: I1204 22:12:52.973062 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerDied","Data":"db1b72940eea381ffedc858f1def5527cda55e491f0234248167dff13f171166"} Dec 04 22:12:52.973218 master-0 kubenswrapper[8606]: I1204 22:12:52.973114 8606 scope.go:117] "RemoveContainer" containerID="87db242867379a05a8e5e70ee5c05b6acdb8e3f23756e86287e28e91fa037302" Dec 04 22:12:52.973820 master-0 
kubenswrapper[8606]: I1204 22:12:52.973769 8606 scope.go:117] "RemoveContainer" containerID="db1b72940eea381ffedc858f1def5527cda55e491f0234248167dff13f171166" Dec 04 22:12:52.974182 master-0 kubenswrapper[8606]: E1204 22:12:52.974131 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:12:53.943970 master-0 kubenswrapper[8606]: I1204 22:12:53.943839 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:53.943970 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:53.943970 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:53.943970 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:53.944850 master-0 kubenswrapper[8606]: I1204 22:12:53.943985 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:53.995731 master-0 kubenswrapper[8606]: I1204 22:12:53.995629 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/2.log" Dec 04 22:12:53.998946 master-0 kubenswrapper[8606]: I1204 22:12:53.998879 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/1.log" Dec 04 22:12:53.999910 master-0 kubenswrapper[8606]: I1204 22:12:53.999852 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/0.log" Dec 04 22:12:54.000682 master-0 kubenswrapper[8606]: I1204 22:12:54.000584 8606 generic.go:334] "Generic (PLEG): container finished" podID="fb0274dc-fac1-41f9-b3e5-77253d851fdf" containerID="818c6aaeff094c3713e384dec4d55a28c1f228a4e98ba130afd94743d45d288f" exitCode=1 Dec 04 22:12:54.000825 master-0 kubenswrapper[8606]: I1204 22:12:54.000688 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerDied","Data":"818c6aaeff094c3713e384dec4d55a28c1f228a4e98ba130afd94743d45d288f"} Dec 04 22:12:54.000825 master-0 kubenswrapper[8606]: I1204 22:12:54.000777 8606 scope.go:117] "RemoveContainer" containerID="5b7837bb8d893076191e798bbe6f7756d536495c527346610e4cc8ec29e29fe5" Dec 04 22:12:54.001859 master-0 kubenswrapper[8606]: I1204 22:12:54.001793 8606 scope.go:117] "RemoveContainer" containerID="818c6aaeff094c3713e384dec4d55a28c1f228a4e98ba130afd94743d45d288f" Dec 04 22:12:54.002392 master-0 kubenswrapper[8606]: E1204 22:12:54.002322 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=catalogd-controller-manager-7cc89f4c4c-v7zfw_openshift-catalogd(fb0274dc-fac1-41f9-b3e5-77253d851fdf)\"" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" podUID="fb0274dc-fac1-41f9-b3e5-77253d851fdf" Dec 04 22:12:54.241289 master-0 kubenswrapper[8606]: E1204 22:12:54.239193 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:12:54.241289 master-0 kubenswrapper[8606]: E1204 22:12:54.239231 8606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 22:12:54.944913 master-0 kubenswrapper[8606]: I1204 22:12:54.944770 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:54.944913 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:54.944913 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:54.944913 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:54.944913 master-0 kubenswrapper[8606]: I1204 22:12:54.944849 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:55.011105 master-0 kubenswrapper[8606]: I1204 22:12:55.011034 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/1.log" Dec 04 22:12:55.944613 master-0 kubenswrapper[8606]: I1204 22:12:55.944490 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:55.944613 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:55.944613 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:55.944613 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:55.945161 master-0 kubenswrapper[8606]: I1204 22:12:55.944654 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:56.022099 master-0 kubenswrapper[8606]: I1204 22:12:56.022053 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/config-sync-controllers/0.log" Dec 04 22:12:56.023324 master-0 kubenswrapper[8606]: I1204 22:12:56.023244 8606 generic.go:334] "Generic (PLEG): container finished" podID="74197c50-9a41-40e8-9289-c7e6afbd3737" containerID="d4cd669f8e4bd3008a9642035eec139a13bd3a586fb003b3adb4d948956c28f6" exitCode=1 Dec 04 22:12:56.023387 master-0 kubenswrapper[8606]: I1204 22:12:56.023343 8606 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerDied","Data":"d4cd669f8e4bd3008a9642035eec139a13bd3a586fb003b3adb4d948956c28f6"} Dec 04 22:12:56.024120 master-0 kubenswrapper[8606]: I1204 22:12:56.024072 8606 scope.go:117] "RemoveContainer" containerID="d4cd669f8e4bd3008a9642035eec139a13bd3a586fb003b3adb4d948956c28f6" Dec 04 22:12:56.756171 master-0 kubenswrapper[8606]: I1204 22:12:56.756095 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:12:56.756738 master-0 kubenswrapper[8606]: I1204 22:12:56.756612 8606 scope.go:117] "RemoveContainer" containerID="818c6aaeff094c3713e384dec4d55a28c1f228a4e98ba130afd94743d45d288f" Dec 04 22:12:56.756857 master-0 kubenswrapper[8606]: E1204 22:12:56.756821 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=catalogd-controller-manager-7cc89f4c4c-v7zfw_openshift-catalogd(fb0274dc-fac1-41f9-b3e5-77253d851fdf)\"" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" podUID="fb0274dc-fac1-41f9-b3e5-77253d851fdf" Dec 04 22:12:56.943649 master-0 kubenswrapper[8606]: I1204 22:12:56.943566 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:56.943649 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:56.943649 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:56.943649 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:56.943974 master-0 kubenswrapper[8606]: I1204 22:12:56.943641 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:57.036382 master-0 kubenswrapper[8606]: I1204 22:12:57.036307 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/config-sync-controllers/0.log" Dec 04 22:12:57.037289 master-0 kubenswrapper[8606]: I1204 22:12:57.037229 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/cluster-cloud-controller-manager/0.log" Dec 04 22:12:57.037388 master-0 kubenswrapper[8606]: I1204 22:12:57.037308 8606 generic.go:334] "Generic (PLEG): container finished" podID="74197c50-9a41-40e8-9289-c7e6afbd3737" containerID="d7411bec11d115b15a691f3d3010646ecd1f289830d7539f76d0467f6cd83226" exitCode=1 Dec 04 22:12:57.037388 master-0 kubenswrapper[8606]: I1204 22:12:57.037357 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerDied","Data":"d7411bec11d115b15a691f3d3010646ecd1f289830d7539f76d0467f6cd83226"} Dec 04 22:12:57.037584 
master-0 kubenswrapper[8606]: I1204 22:12:57.037401 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"21c5db260fc9c003b5979ff05e774d8d4b66aafbdc7ee070b9faa8bb51459bea"} Dec 04 22:12:57.038271 master-0 kubenswrapper[8606]: I1204 22:12:57.038213 8606 scope.go:117] "RemoveContainer" containerID="d7411bec11d115b15a691f3d3010646ecd1f289830d7539f76d0467f6cd83226" Dec 04 22:12:57.945526 master-0 kubenswrapper[8606]: I1204 22:12:57.945373 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:57.945526 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:57.945526 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:57.945526 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:57.946007 master-0 kubenswrapper[8606]: I1204 22:12:57.945574 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:58.047767 master-0 kubenswrapper[8606]: I1204 22:12:58.047688 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/config-sync-controllers/0.log" Dec 04 22:12:58.048628 master-0 kubenswrapper[8606]: I1204 22:12:58.048141 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/cluster-cloud-controller-manager/0.log" Dec 04 22:12:58.048628 master-0 kubenswrapper[8606]: I1204 22:12:58.048202 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"a43528b1d37a8cf8c729e8c6970f63fadaafd2a2a5d053a62faf1bb166b17472"} Dec 04 22:12:58.945255 master-0 kubenswrapper[8606]: I1204 22:12:58.945101 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:58.945255 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:58.945255 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:58.945255 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:58.945255 master-0 kubenswrapper[8606]: I1204 22:12:58.945185 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:12:59.945062 master-0 kubenswrapper[8606]: I1204 22:12:59.944957 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:12:59.945062 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:12:59.945062 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:12:59.945062 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:12:59.945062 master-0 kubenswrapper[8606]: I1204 22:12:59.945050 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:00.944974 master-0 kubenswrapper[8606]: I1204 22:13:00.944850 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:00.944974 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:00.944974 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:00.944974 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:00.944974 master-0 kubenswrapper[8606]: I1204 22:13:00.944944 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:01.945642 master-0 kubenswrapper[8606]: I1204 22:13:01.945495 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:01.945642 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:01.945642 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:01.945642 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:01.946774 master-0 kubenswrapper[8606]: I1204 22:13:01.945648 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:01.992136 master-0 kubenswrapper[8606]: I1204 22:13:01.992020 8606 status_manager.go:851] "Failed to get status for pod" podUID="5e09e2af7200e6f9be469dbfd9bb1127" pod="kube-system/bootstrap-kube-scheduler-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods bootstrap-kube-scheduler-master-0)" Dec 04 22:13:02.111570 master-0 kubenswrapper[8606]: I1204 22:13:02.109569 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-nxbjw_ce6b5a46-172b-4575-ba22-ff3c6ea4207f/manager/1.log" Dec 04 22:13:02.116655 master-0 kubenswrapper[8606]: I1204 22:13:02.116598 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-nxbjw_ce6b5a46-172b-4575-ba22-ff3c6ea4207f/manager/0.log" Dec 04 22:13:02.116775 master-0 kubenswrapper[8606]: I1204 22:13:02.116685 8606 generic.go:334] "Generic (PLEG): container finished" 
podID="ce6b5a46-172b-4575-ba22-ff3c6ea4207f" containerID="fe4d171382cafc8367d04ba38562e53607a288ace82dafd0c07ed366d2a1ef56" exitCode=1 Dec 04 22:13:02.116775 master-0 kubenswrapper[8606]: I1204 22:13:02.116735 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerDied","Data":"fe4d171382cafc8367d04ba38562e53607a288ace82dafd0c07ed366d2a1ef56"} Dec 04 22:13:02.116921 master-0 kubenswrapper[8606]: I1204 22:13:02.116793 8606 scope.go:117] "RemoveContainer" containerID="037f05faa0b4635e20f5127ded6c5b63a2893aa9715387918fd80e11092dcfbb" Dec 04 22:13:02.118895 master-0 kubenswrapper[8606]: I1204 22:13:02.117930 8606 scope.go:117] "RemoveContainer" containerID="fe4d171382cafc8367d04ba38562e53607a288ace82dafd0c07ed366d2a1ef56" Dec 04 22:13:02.118895 master-0 kubenswrapper[8606]: E1204 22:13:02.118372 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-7cbd59c7f8-nxbjw_openshift-operator-controller(ce6b5a46-172b-4575-ba22-ff3c6ea4207f)\"" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" podUID="ce6b5a46-172b-4575-ba22-ff3c6ea4207f" Dec 04 22:13:02.945764 master-0 kubenswrapper[8606]: I1204 22:13:02.945595 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:02.945764 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:02.945764 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:02.945764 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:02.945764 master-0 kubenswrapper[8606]: I1204 22:13:02.945734 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:03.127783 master-0 kubenswrapper[8606]: I1204 22:13:03.127694 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-nxbjw_ce6b5a46-172b-4575-ba22-ff3c6ea4207f/manager/1.log" Dec 04 22:13:03.372721 master-0 kubenswrapper[8606]: E1204 22:13:03.372567 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:13:03.943551 master-0 kubenswrapper[8606]: I1204 22:13:03.943461 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:03.943551 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:03.943551 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:03.943551 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:03.943551 master-0 
kubenswrapper[8606]: I1204 22:13:03.943538 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:04.317334 master-0 kubenswrapper[8606]: E1204 22:13:04.317163 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{kube-controller-manager-master-0.187e2263759927eb openshift-kube-controller-manager 9880 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-master-0,UID:fad55397ac8e23f218f25cb714ea5b2b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:05:14 +0000 UTC,LastTimestamp:2025-12-04 22:11:02.32941988 +0000 UTC m=+627.139722135,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:13:04.944209 master-0 kubenswrapper[8606]: I1204 22:13:04.944150 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:04.944209 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:04.944209 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:04.944209 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:04.944515 master-0 kubenswrapper[8606]: I1204 22:13:04.944211 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:05.392005 master-0 kubenswrapper[8606]: I1204 22:13:05.391900 8606 scope.go:117] "RemoveContainer" containerID="db1b72940eea381ffedc858f1def5527cda55e491f0234248167dff13f171166" Dec 04 22:13:05.392996 master-0 kubenswrapper[8606]: E1204 22:13:05.392123 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:13:05.944040 master-0 kubenswrapper[8606]: I1204 22:13:05.943978 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:05.944040 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:05.944040 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:05.944040 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:05.944485 master-0 kubenswrapper[8606]: I1204 
22:13:05.944067 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:06.756756 master-0 kubenswrapper[8606]: I1204 22:13:06.756639 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:13:06.757929 master-0 kubenswrapper[8606]: I1204 22:13:06.757892 8606 scope.go:117] "RemoveContainer" containerID="818c6aaeff094c3713e384dec4d55a28c1f228a4e98ba130afd94743d45d288f" Dec 04 22:13:06.944343 master-0 kubenswrapper[8606]: I1204 22:13:06.944253 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:06.944343 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:06.944343 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:06.944343 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:06.944876 master-0 kubenswrapper[8606]: I1204 22:13:06.944353 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:06.956272 master-0 kubenswrapper[8606]: I1204 22:13:06.956178 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:13:06.956272 master-0 kubenswrapper[8606]: I1204 22:13:06.956253 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:13:06.957592 master-0 kubenswrapper[8606]: I1204 22:13:06.956971 8606 scope.go:117] "RemoveContainer" containerID="fe4d171382cafc8367d04ba38562e53607a288ace82dafd0c07ed366d2a1ef56" Dec 04 22:13:06.957592 master-0 kubenswrapper[8606]: E1204 22:13:06.957238 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-7cbd59c7f8-nxbjw_openshift-operator-controller(ce6b5a46-172b-4575-ba22-ff3c6ea4207f)\"" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" podUID="ce6b5a46-172b-4575-ba22-ff3c6ea4207f" Dec 04 22:13:07.166556 master-0 kubenswrapper[8606]: I1204 22:13:07.166463 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/1.log" Dec 04 22:13:07.167488 master-0 kubenswrapper[8606]: I1204 22:13:07.167430 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerStarted","Data":"c8721bb45eb3cc953ffc024aeb4fb1727d1b4358f45d475db67be3cd695e49da"} Dec 04 22:13:07.167852 master-0 kubenswrapper[8606]: I1204 22:13:07.167789 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:13:07.943853 master-0 kubenswrapper[8606]: I1204 22:13:07.943758 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:07.943853 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:07.943853 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:07.943853 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:07.944892 master-0 kubenswrapper[8606]: I1204 22:13:07.943864 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:08.944852 master-0 kubenswrapper[8606]: I1204 22:13:08.944728 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:08.944852 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:08.944852 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:08.944852 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:08.946317 master-0 kubenswrapper[8606]: I1204 22:13:08.944872 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:09.944081 master-0 kubenswrapper[8606]: I1204 22:13:09.943984 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:09.944081 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:09.944081 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:09.944081 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:09.944577 master-0 kubenswrapper[8606]: I1204 22:13:09.944092 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:10.944569 master-0 kubenswrapper[8606]: I1204 22:13:10.944452 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:10.944569 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:10.944569 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:10.944569 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:10.945756 master-0 kubenswrapper[8606]: I1204 22:13:10.944632 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:11.944925 master-0 kubenswrapper[8606]: I1204 22:13:11.944843 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:11.944925 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:11.944925 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:11.944925 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:11.946110 master-0 kubenswrapper[8606]: I1204 22:13:11.944926 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:12.944843 master-0 kubenswrapper[8606]: I1204 22:13:12.944724 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:12.944843 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:12.944843 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:12.944843 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:12.946166 master-0 kubenswrapper[8606]: I1204 22:13:12.944850 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:13.871485 master-0 kubenswrapper[8606]: E1204 22:13:13.871370 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 04 22:13:13.944271 master-0 kubenswrapper[8606]: I1204 22:13:13.944191 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:13.944271 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:13.944271 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:13.944271 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:13.944897 master-0 kubenswrapper[8606]: I1204 22:13:13.944295 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:14.230360 master-0 kubenswrapper[8606]: I1204 22:13:14.230291 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"d9582fc250da782a33466d4d52e589af275376f87d9bdf03fa1cb11c7d23524e"} Dec 04 22:13:14.232496 master-0 kubenswrapper[8606]: I1204 22:13:14.230134 8606 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" 
containerID="d9582fc250da782a33466d4d52e589af275376f87d9bdf03fa1cb11c7d23524e" exitCode=0 Dec 04 22:13:14.232496 master-0 kubenswrapper[8606]: I1204 22:13:14.230632 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:13:14.232496 master-0 kubenswrapper[8606]: I1204 22:13:14.230671 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:13:14.944845 master-0 kubenswrapper[8606]: I1204 22:13:14.944774 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:13:14.944845 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:13:14.944845 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:13:14.944845 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:13:14.944845 master-0 kubenswrapper[8606]: I1204 22:13:14.944841 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:13:14.945378 master-0 kubenswrapper[8606]: I1204 22:13:14.944897 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:13:14.945851 master-0 kubenswrapper[8606]: I1204 22:13:14.945796 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"57c531c30874bf4998e2715db0baeccbcbac79537b43a4f58e8644a7f87789e1"} pod="openshift-ingress/router-default-5465c8b4db-8vm66" containerMessage="Container router failed startup probe, will be restarted" Dec 04 22:13:14.946002 master-0 kubenswrapper[8606]: I1204 22:13:14.945859 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" containerID="cri-o://57c531c30874bf4998e2715db0baeccbcbac79537b43a4f58e8644a7f87789e1" gracePeriod=3600 Dec 04 22:13:15.242645 master-0 kubenswrapper[8606]: I1204 22:13:15.242550 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:13:15.243183 master-0 kubenswrapper[8606]: I1204 22:13:15.243162 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="90abdb91a6f0572c29e8db6a5253191316a65911b9cf466a77844b0b5c6a021d" exitCode=0 Dec 04 22:13:15.243287 master-0 kubenswrapper[8606]: I1204 22:13:15.243245 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerDied","Data":"90abdb91a6f0572c29e8db6a5253191316a65911b9cf466a77844b0b5c6a021d"} Dec 04 22:13:15.243848 master-0 kubenswrapper[8606]: I1204 22:13:15.243815 8606 scope.go:117] "RemoveContainer" containerID="90abdb91a6f0572c29e8db6a5253191316a65911b9cf466a77844b0b5c6a021d" Dec 04 22:13:16.260535 master-0 kubenswrapper[8606]: I1204 22:13:16.260302 8606 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:13:16.261378 master-0 kubenswrapper[8606]: I1204 22:13:16.260547 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"ce4dc99537b0548b822be5daff148472917250cef8953dfac3f2e8319903724f"} Dec 04 22:13:16.760028 master-0 kubenswrapper[8606]: I1204 22:13:16.759932 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:13:19.392722 master-0 kubenswrapper[8606]: I1204 22:13:19.392637 8606 scope.go:117] "RemoveContainer" containerID="db1b72940eea381ffedc858f1def5527cda55e491f0234248167dff13f171166" Dec 04 22:13:20.292146 master-0 kubenswrapper[8606]: I1204 22:13:20.292077 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/2.log" Dec 04 22:13:20.292431 master-0 kubenswrapper[8606]: I1204 22:13:20.292165 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"3f6074b8213057dcfc74cc30ffdd750130f6000fde9639fd232853d1c9ba798d"} Dec 04 22:13:20.374392 master-0 kubenswrapper[8606]: E1204 22:13:20.374239 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:13:20.392143 master-0 kubenswrapper[8606]: I1204 22:13:20.392065 8606 scope.go:117] "RemoveContainer" containerID="fe4d171382cafc8367d04ba38562e53607a288ace82dafd0c07ed366d2a1ef56" Dec 04 22:13:21.302044 master-0 kubenswrapper[8606]: I1204 22:13:21.301968 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-nxbjw_ce6b5a46-172b-4575-ba22-ff3c6ea4207f/manager/1.log" Dec 04 22:13:21.302898 master-0 kubenswrapper[8606]: I1204 22:13:21.302497 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerStarted","Data":"9bd36bdfa3dbe917fb415f401e0843138d225eb2cdd038a07c1fc4862acaf2a9"} Dec 04 22:13:21.302898 master-0 kubenswrapper[8606]: I1204 22:13:21.302795 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:13:23.318427 master-0 kubenswrapper[8606]: I1204 22:13:23.318372 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-74d9cbffbc-nzqgx_5dac8e25-0f51-4c04-929c-060479689a9d/machine-approver-controller/0.log" Dec 04 22:13:23.319286 master-0 kubenswrapper[8606]: I1204 22:13:23.319249 8606 generic.go:334] "Generic (PLEG): container finished" podID="5dac8e25-0f51-4c04-929c-060479689a9d" 
containerID="028e28ae583843cf573a020510b23f29bb89888d81cbcae42e7af0446b2d3e61" exitCode=255 Dec 04 22:13:23.319391 master-0 kubenswrapper[8606]: I1204 22:13:23.319317 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerDied","Data":"028e28ae583843cf573a020510b23f29bb89888d81cbcae42e7af0446b2d3e61"} Dec 04 22:13:23.320245 master-0 kubenswrapper[8606]: I1204 22:13:23.320226 8606 scope.go:117] "RemoveContainer" containerID="028e28ae583843cf573a020510b23f29bb89888d81cbcae42e7af0446b2d3e61" Dec 04 22:13:24.185244 master-0 kubenswrapper[8606]: I1204 22:13:24.183881 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:13:24.185244 master-0 kubenswrapper[8606]: I1204 22:13:24.184035 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:13:24.331100 master-0 kubenswrapper[8606]: I1204 22:13:24.331037 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-74d9cbffbc-nzqgx_5dac8e25-0f51-4c04-929c-060479689a9d/machine-approver-controller/0.log" Dec 04 22:13:24.332964 master-0 kubenswrapper[8606]: I1204 22:13:24.332918 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerStarted","Data":"5d2fefeec2561a7c75cacf5399d1c0370782912b0bc1f1c8faf916fa302e41f1"} Dec 04 22:13:26.960953 master-0 kubenswrapper[8606]: I1204 22:13:26.960823 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:13:27.184386 master-0 kubenswrapper[8606]: I1204 22:13:27.184211 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:13:27.184386 master-0 kubenswrapper[8606]: I1204 22:13:27.184326 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:13:27.359689 master-0 kubenswrapper[8606]: I1204 22:13:27.359581 8606 generic.go:334] "Generic (PLEG): container finished" podID="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" containerID="7d1cd57ab4d6b616f66a493d64b829158d3913e17906249bebc8057db4a21035" exitCode=0 Dec 04 22:13:27.359689 master-0 kubenswrapper[8606]: I1204 22:13:27.359658 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerDied","Data":"7d1cd57ab4d6b616f66a493d64b829158d3913e17906249bebc8057db4a21035"} Dec 04 22:13:27.360314 master-0 kubenswrapper[8606]: I1204 22:13:27.359765 8606 
scope.go:117] "RemoveContainer" containerID="8235211a3898b8961786603441645f7da3fef63f8a04f95fcc274a44a7765453" Dec 04 22:13:27.360972 master-0 kubenswrapper[8606]: I1204 22:13:27.360897 8606 scope.go:117] "RemoveContainer" containerID="7d1cd57ab4d6b616f66a493d64b829158d3913e17906249bebc8057db4a21035" Dec 04 22:13:27.361304 master-0 kubenswrapper[8606]: E1204 22:13:27.361228 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-cluster-manager pod=ovnkube-control-plane-5df5548d54-gjjxs_openshift-ovn-kubernetes(3f6d05b8-b7b4-4b2d-ace0-d1f59035d161)\"" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" podUID="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" Dec 04 22:13:27.363315 master-0 kubenswrapper[8606]: I1204 22:13:27.362570 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/1.log" Dec 04 22:13:27.364259 master-0 kubenswrapper[8606]: I1204 22:13:27.364225 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/0.log" Dec 04 22:13:27.364335 master-0 kubenswrapper[8606]: I1204 22:13:27.364297 8606 generic.go:334] "Generic (PLEG): container finished" podID="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" containerID="669bc75f8a8081f19feaad95999690146addc9aad7304006d9c07b88b80a73be" exitCode=1 Dec 04 22:13:27.364418 master-0 kubenswrapper[8606]: I1204 22:13:27.364363 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerDied","Data":"669bc75f8a8081f19feaad95999690146addc9aad7304006d9c07b88b80a73be"} Dec 04 22:13:27.365580 master-0 kubenswrapper[8606]: I1204 22:13:27.365543 8606 scope.go:117] "RemoveContainer" containerID="669bc75f8a8081f19feaad95999690146addc9aad7304006d9c07b88b80a73be" Dec 04 22:13:27.365921 master-0 kubenswrapper[8606]: E1204 22:13:27.365893 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-78f758c7b9-44srj_openshift-machine-api(a3899a38-39b8-4b48-81e5-4d8854ecc8ab)\"" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" podUID="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" Dec 04 22:13:27.368371 master-0 kubenswrapper[8606]: I1204 22:13:27.368337 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-5f49d774cd-5m4l9_5598683a-cd32-486d-8839-205829d55cc2/cluster-autoscaler-operator/0.log" Dec 04 22:13:27.369247 master-0 kubenswrapper[8606]: I1204 22:13:27.369171 8606 generic.go:334] "Generic (PLEG): container finished" podID="5598683a-cd32-486d-8839-205829d55cc2" containerID="922ec5ad22c0f758ca0c6af6881b85724b616bc9bf1514cfd7b12d47fc0ff553" exitCode=255 Dec 04 22:13:27.369400 master-0 kubenswrapper[8606]: I1204 22:13:27.369315 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" 
event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerDied","Data":"922ec5ad22c0f758ca0c6af6881b85724b616bc9bf1514cfd7b12d47fc0ff553"} Dec 04 22:13:27.370572 master-0 kubenswrapper[8606]: I1204 22:13:27.370494 8606 scope.go:117] "RemoveContainer" containerID="922ec5ad22c0f758ca0c6af6881b85724b616bc9bf1514cfd7b12d47fc0ff553" Dec 04 22:13:27.372419 master-0 kubenswrapper[8606]: I1204 22:13:27.372356 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-88d48b57d-pp4fd_74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/machine-api-operator/0.log" Dec 04 22:13:27.373082 master-0 kubenswrapper[8606]: I1204 22:13:27.373043 8606 generic.go:334] "Generic (PLEG): container finished" podID="74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9" containerID="6d31ad2a1f5237b4355ed2f39e4e13656076c4a85f80a08d5d712a1a6ab75238" exitCode=255 Dec 04 22:13:27.373211 master-0 kubenswrapper[8606]: I1204 22:13:27.373131 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerDied","Data":"6d31ad2a1f5237b4355ed2f39e4e13656076c4a85f80a08d5d712a1a6ab75238"} Dec 04 22:13:27.373825 master-0 kubenswrapper[8606]: I1204 22:13:27.373797 8606 scope.go:117] "RemoveContainer" containerID="6d31ad2a1f5237b4355ed2f39e4e13656076c4a85f80a08d5d712a1a6ab75238" Dec 04 22:13:27.377906 master-0 kubenswrapper[8606]: I1204 22:13:27.377858 8606 generic.go:334] "Generic (PLEG): container finished" podID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerID="1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8" exitCode=0 Dec 04 22:13:27.378020 master-0 kubenswrapper[8606]: I1204 22:13:27.377924 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerDied","Data":"1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8"} Dec 04 22:13:27.378645 master-0 kubenswrapper[8606]: I1204 22:13:27.378607 8606 scope.go:117] "RemoveContainer" containerID="1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8" Dec 04 22:13:27.378996 master-0 kubenswrapper[8606]: E1204 22:13:27.378961 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=controller-manager pod=controller-manager-86785576d9-t7jrz_openshift-controller-manager(c3863c74-8f22-4c67-bef5-2d0d39df4abd)\"" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" Dec 04 22:13:27.424070 master-0 kubenswrapper[8606]: I1204 22:13:27.424003 8606 scope.go:117] "RemoveContainer" containerID="93d74a7e351d1bb38ca66b99396fddaa338eac5fd2201ea238d97a8b16a1e1a0" Dec 04 22:13:27.484074 master-0 kubenswrapper[8606]: I1204 22:13:27.484012 8606 scope.go:117] "RemoveContainer" containerID="ab6c10b9a3e0637d5c7a14c6df7c632b34ad06eac467a51eec2ac60a0a5a71c4" Dec 04 22:13:28.390684 master-0 kubenswrapper[8606]: I1204 22:13:28.390630 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-88d48b57d-pp4fd_74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/machine-api-operator/0.log" Dec 04 22:13:28.392356 master-0 kubenswrapper[8606]: I1204 22:13:28.391255 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerStarted","Data":"1bc5ab124190b2c59d84b58250cc263ddcafcb9537dd2db02384165b00676c7f"} Dec 04 22:13:28.399050 master-0 kubenswrapper[8606]: I1204 22:13:28.398999 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/1.log" Dec 04 22:13:28.402824 master-0 kubenswrapper[8606]: I1204 22:13:28.402785 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-5f49d774cd-5m4l9_5598683a-cd32-486d-8839-205829d55cc2/cluster-autoscaler-operator/0.log" Dec 04 22:13:28.403714 master-0 kubenswrapper[8606]: I1204 22:13:28.403646 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerStarted","Data":"92528b95e13ec8f264bec256aad90296e21be58deea24ba77dc8ff80b36c0304"} Dec 04 22:13:31.610412 master-0 kubenswrapper[8606]: I1204 22:13:31.610320 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:13:31.610412 master-0 kubenswrapper[8606]: I1204 22:13:31.610412 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:13:31.611550 master-0 kubenswrapper[8606]: I1204 22:13:31.611097 8606 scope.go:117] "RemoveContainer" containerID="1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8" Dec 04 22:13:31.611550 master-0 kubenswrapper[8606]: E1204 22:13:31.611457 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=controller-manager pod=controller-manager-86785576d9-t7jrz_openshift-controller-manager(c3863c74-8f22-4c67-bef5-2d0d39df4abd)\"" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" Dec 04 22:13:35.462287 master-0 kubenswrapper[8606]: I1204 22:13:35.462192 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/1.log" Dec 04 22:13:35.463337 master-0 kubenswrapper[8606]: I1204 22:13:35.462985 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/0.log" Dec 04 22:13:35.463337 master-0 kubenswrapper[8606]: I1204 22:13:35.463040 8606 generic.go:334] "Generic (PLEG): container finished" podID="f1534e25-7add-46a1-8f4e-0065c232aa4e" containerID="893b1713daa62af184411a3e5a2cfc2bd5735ca25c31751314839bd533678913" exitCode=1 Dec 04 22:13:35.463337 master-0 kubenswrapper[8606]: I1204 22:13:35.463106 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerDied","Data":"893b1713daa62af184411a3e5a2cfc2bd5735ca25c31751314839bd533678913"} Dec 04 22:13:35.463337 master-0 kubenswrapper[8606]: I1204 22:13:35.463231 8606 
scope.go:117] "RemoveContainer" containerID="efec9b80d16091e3ba4473728d27aba3a23ca799a67ec448c19c49d6e7be1b22" Dec 04 22:13:35.463939 master-0 kubenswrapper[8606]: I1204 22:13:35.463881 8606 scope.go:117] "RemoveContainer" containerID="893b1713daa62af184411a3e5a2cfc2bd5735ca25c31751314839bd533678913" Dec 04 22:13:35.464207 master-0 kubenswrapper[8606]: E1204 22:13:35.464155 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"control-plane-machine-set-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=control-plane-machine-set-operator pod=control-plane-machine-set-operator-7df95c79b5-nznvn_openshift-machine-api(f1534e25-7add-46a1-8f4e-0065c232aa4e)\"" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" podUID="f1534e25-7add-46a1-8f4e-0065c232aa4e" Dec 04 22:13:36.474819 master-0 kubenswrapper[8606]: I1204 22:13:36.474723 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/1.log" Dec 04 22:13:37.184726 master-0 kubenswrapper[8606]: I1204 22:13:37.184620 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:13:37.185102 master-0 kubenswrapper[8606]: I1204 22:13:37.184723 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:13:37.375845 master-0 kubenswrapper[8606]: E1204 22:13:37.375728 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:13:38.320454 master-0 kubenswrapper[8606]: E1204 22:13:38.320205 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e22b54bb64339 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:BackOff,Message:Back-off restarting failed container kube-scheduler in pod bootstrap-kube-scheduler-master-0_kube-system(5e09e2af7200e6f9be469dbfd9bb1127),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:11:06.030474041 +0000 UTC m=+630.840776286,LastTimestamp:2025-12-04 22:11:06.030474041 +0000 UTC m=+630.840776286,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:13:38.391868 master-0 
kubenswrapper[8606]: I1204 22:13:38.391740 8606 scope.go:117] "RemoveContainer" containerID="669bc75f8a8081f19feaad95999690146addc9aad7304006d9c07b88b80a73be" Dec 04 22:13:39.503033 master-0 kubenswrapper[8606]: I1204 22:13:39.502935 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/1.log" Dec 04 22:13:39.504146 master-0 kubenswrapper[8606]: I1204 22:13:39.504054 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"45f93308614301d84cb0176bf1dabc3de4bbaffff580b91a3cb5db6707e27be7"} Dec 04 22:13:42.392650 master-0 kubenswrapper[8606]: I1204 22:13:42.392478 8606 scope.go:117] "RemoveContainer" containerID="1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8" Dec 04 22:13:42.393787 master-0 kubenswrapper[8606]: I1204 22:13:42.393079 8606 scope.go:117] "RemoveContainer" containerID="7d1cd57ab4d6b616f66a493d64b829158d3913e17906249bebc8057db4a21035" Dec 04 22:13:43.542709 master-0 kubenswrapper[8606]: I1204 22:13:43.542629 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerStarted","Data":"4c72e9186536692b281fe4e714a6c8b9d1a2250b3c26e8d8330699b4c2ec401d"} Dec 04 22:13:43.543531 master-0 kubenswrapper[8606]: I1204 22:13:43.543294 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:13:43.545598 master-0 kubenswrapper[8606]: I1204 22:13:43.545540 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerStarted","Data":"4d9524d2f6c8db6ea1b8d01f9923d2a3d6c267b7a2c9858906c3336929be1a8b"} Dec 04 22:13:43.549405 master-0 kubenswrapper[8606]: I1204 22:13:43.549348 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:13:45.781264 master-0 kubenswrapper[8606]: I1204 22:13:45.781186 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:33768->127.0.0.1:10357: read: connection reset by peer" start-of-body= Dec 04 22:13:45.782395 master-0 kubenswrapper[8606]: I1204 22:13:45.781313 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:33768->127.0.0.1:10357: read: connection reset by peer" Dec 04 22:13:45.782395 master-0 kubenswrapper[8606]: I1204 22:13:45.781404 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:13:45.783661 master-0 kubenswrapper[8606]: I1204 22:13:45.782691 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" 
containerStatusID={"Type":"cri-o","ID":"ce4dc99537b0548b822be5daff148472917250cef8953dfac3f2e8319903724f"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Dec 04 22:13:45.783661 master-0 kubenswrapper[8606]: I1204 22:13:45.782913 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" containerID="cri-o://ce4dc99537b0548b822be5daff148472917250cef8953dfac3f2e8319903724f" gracePeriod=30 Dec 04 22:13:46.570494 master-0 kubenswrapper[8606]: I1204 22:13:46.570397 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/1.log" Dec 04 22:13:46.573350 master-0 kubenswrapper[8606]: I1204 22:13:46.573273 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:13:46.573465 master-0 kubenswrapper[8606]: I1204 22:13:46.573398 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="ce4dc99537b0548b822be5daff148472917250cef8953dfac3f2e8319903724f" exitCode=255 Dec 04 22:13:46.573572 master-0 kubenswrapper[8606]: I1204 22:13:46.573462 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerDied","Data":"ce4dc99537b0548b822be5daff148472917250cef8953dfac3f2e8319903724f"} Dec 04 22:13:46.573656 master-0 kubenswrapper[8606]: I1204 22:13:46.573570 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"f87bf61657cd2e63ffdf4cd18cece9834c806ee679f521114bc6c30454ba3f55"} Dec 04 22:13:46.573656 master-0 kubenswrapper[8606]: I1204 22:13:46.573615 8606 scope.go:117] "RemoveContainer" containerID="90abdb91a6f0572c29e8db6a5253191316a65911b9cf466a77844b0b5c6a021d" Dec 04 22:13:47.392051 master-0 kubenswrapper[8606]: I1204 22:13:47.392001 8606 scope.go:117] "RemoveContainer" containerID="893b1713daa62af184411a3e5a2cfc2bd5735ca25c31751314839bd533678913" Dec 04 22:13:47.588089 master-0 kubenswrapper[8606]: I1204 22:13:47.588025 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/1.log" Dec 04 22:13:47.589925 master-0 kubenswrapper[8606]: I1204 22:13:47.589872 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:13:48.234170 master-0 kubenswrapper[8606]: E1204 22:13:48.234051 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 04 22:13:48.604850 master-0 kubenswrapper[8606]: I1204 22:13:48.604760 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"ff3db8bd41312f8815f56e2a1d1c76c27943763c50e5bb1dafc09d4915bc599d"} Dec 04 22:13:48.607971 master-0 kubenswrapper[8606]: I1204 22:13:48.607928 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/1.log" Dec 04 22:13:48.608047 master-0 kubenswrapper[8606]: I1204 22:13:48.608001 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerStarted","Data":"a2b5c7a83b5284024cd6037f9bc3ab61da8f64f7c1155c2976623ada6236de54"} Dec 04 22:13:49.624436 master-0 kubenswrapper[8606]: I1204 22:13:49.624353 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"efa3762feffc3da59b6f9bcafd79da1dfd2e009c93cfb906986a1a37a50f7d8d"} Dec 04 22:13:49.624436 master-0 kubenswrapper[8606]: I1204 22:13:49.624423 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"dcca3d65f2eb9a58cbe582258c5a8066f1e6748f3a54afa3247bc56a9f4f23d0"} Dec 04 22:13:49.624436 master-0 kubenswrapper[8606]: I1204 22:13:49.624443 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"8ee64a9d18e9766817acb72d7fa9c4b992b2a148db0509af5fadc5499a6f837a"} Dec 04 22:13:50.641315 master-0 kubenswrapper[8606]: I1204 22:13:50.641210 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"fb4afb592b5c30cfd21e213860a9cae209891a86353f6f65689e3455958a2f39"} Dec 04 22:13:50.642391 master-0 kubenswrapper[8606]: I1204 22:13:50.641754 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:13:50.642391 master-0 kubenswrapper[8606]: I1204 22:13:50.641808 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:13:50.645391 master-0 kubenswrapper[8606]: I1204 22:13:50.645300 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/3.log" Dec 04 22:13:50.646574 master-0 kubenswrapper[8606]: I1204 22:13:50.646472 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/2.log" Dec 04 22:13:50.646731 master-0 kubenswrapper[8606]: I1204 22:13:50.646640 8606 generic.go:334] "Generic (PLEG): container finished" podID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" containerID="3f6074b8213057dcfc74cc30ffdd750130f6000fde9639fd232853d1c9ba798d" exitCode=1 Dec 04 22:13:50.646731 master-0 kubenswrapper[8606]: I1204 22:13:50.646709 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" 
event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerDied","Data":"3f6074b8213057dcfc74cc30ffdd750130f6000fde9639fd232853d1c9ba798d"} Dec 04 22:13:50.646877 master-0 kubenswrapper[8606]: I1204 22:13:50.646793 8606 scope.go:117] "RemoveContainer" containerID="db1b72940eea381ffedc858f1def5527cda55e491f0234248167dff13f171166" Dec 04 22:13:50.647751 master-0 kubenswrapper[8606]: I1204 22:13:50.647693 8606 scope.go:117] "RemoveContainer" containerID="3f6074b8213057dcfc74cc30ffdd750130f6000fde9639fd232853d1c9ba798d" Dec 04 22:13:50.648205 master-0 kubenswrapper[8606]: E1204 22:13:50.648145 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:13:51.658336 master-0 kubenswrapper[8606]: I1204 22:13:51.658252 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/3.log" Dec 04 22:13:53.426149 master-0 kubenswrapper[8606]: I1204 22:13:53.425995 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 04 22:13:53.426149 master-0 kubenswrapper[8606]: I1204 22:13:53.426103 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 04 22:13:54.183848 master-0 kubenswrapper[8606]: I1204 22:13:54.183739 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:13:54.184284 master-0 kubenswrapper[8606]: I1204 22:13:54.183902 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:13:54.376977 master-0 kubenswrapper[8606]: E1204 22:13:54.376889 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:13:57.184672 master-0 kubenswrapper[8606]: I1204 22:13:57.184550 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:13:57.185540 master-0 kubenswrapper[8606]: I1204 22:13:57.184681 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:01.744907 master-0 kubenswrapper[8606]: I1204 22:14:01.744685 8606 generic.go:334] "Generic (PLEG): container finished" 
podID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerID="57c531c30874bf4998e2715db0baeccbcbac79537b43a4f58e8644a7f87789e1" exitCode=0 Dec 04 22:14:01.744907 master-0 kubenswrapper[8606]: I1204 22:14:01.744776 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerDied","Data":"57c531c30874bf4998e2715db0baeccbcbac79537b43a4f58e8644a7f87789e1"} Dec 04 22:14:01.744907 master-0 kubenswrapper[8606]: I1204 22:14:01.744855 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3"} Dec 04 22:14:01.744907 master-0 kubenswrapper[8606]: I1204 22:14:01.744875 8606 scope.go:117] "RemoveContainer" containerID="ff93336d691c6b7a2b8fa61f8706675e304c74a0c7baba380e59deb94bba182a" Dec 04 22:14:01.941098 master-0 kubenswrapper[8606]: I1204 22:14:01.941021 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:14:01.945460 master-0 kubenswrapper[8606]: I1204 22:14:01.944858 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:01.945460 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:01.945460 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:01.945460 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:01.945460 master-0 kubenswrapper[8606]: I1204 22:14:01.944943 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:01.993935 master-0 kubenswrapper[8606]: I1204 22:14:01.993827 8606 status_manager.go:851] "Failed to get status for pod" podUID="634c1df6-de4d-4e26-8c71-d39311cae0ce" pod="openshift-network-node-identity/network-node-identity-nk92d" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods network-node-identity-nk92d)" Dec 04 22:14:02.944880 master-0 kubenswrapper[8606]: I1204 22:14:02.944801 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:02.944880 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:02.944880 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:02.944880 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:02.946023 master-0 kubenswrapper[8606]: I1204 22:14:02.944908 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:03.469989 master-0 kubenswrapper[8606]: I1204 22:14:03.469908 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-etcd/etcd-master-0" Dec 04 22:14:03.944045 master-0 kubenswrapper[8606]: I1204 22:14:03.943937 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:03.944045 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:03.944045 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:03.944045 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:03.944562 master-0 kubenswrapper[8606]: I1204 22:14:03.944045 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:04.944785 master-0 kubenswrapper[8606]: I1204 22:14:04.944727 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:04.944785 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:04.944785 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:04.944785 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:04.945652 master-0 kubenswrapper[8606]: I1204 22:14:04.945603 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:05.234840 master-0 kubenswrapper[8606]: E1204 22:14:05.234640 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:13:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:13:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:13:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:13:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:05.392444 master-0 kubenswrapper[8606]: I1204 22:14:05.392325 8606 scope.go:117] "RemoveContainer" containerID="3f6074b8213057dcfc74cc30ffdd750130f6000fde9639fd232853d1c9ba798d" Dec 04 22:14:05.392929 master-0 kubenswrapper[8606]: E1204 22:14:05.392850 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting 
failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:14:05.943711 master-0 kubenswrapper[8606]: I1204 22:14:05.943647 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:05.943711 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:05.943711 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:05.943711 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:05.944026 master-0 kubenswrapper[8606]: I1204 22:14:05.943723 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:06.943685 master-0 kubenswrapper[8606]: I1204 22:14:06.943572 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:06.943685 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:06.943685 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:06.943685 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:06.943685 master-0 kubenswrapper[8606]: I1204 22:14:06.943681 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:07.184420 master-0 kubenswrapper[8606]: I1204 22:14:07.184308 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:14:07.184420 master-0 kubenswrapper[8606]: I1204 22:14:07.184395 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:07.944047 master-0 kubenswrapper[8606]: I1204 22:14:07.943959 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:07.944047 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:07.944047 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:07.944047 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:07.944047 
master-0 kubenswrapper[8606]: I1204 22:14:07.944042 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:08.470005 master-0 kubenswrapper[8606]: I1204 22:14:08.469936 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 04 22:14:08.941426 master-0 kubenswrapper[8606]: I1204 22:14:08.941280 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:14:08.944184 master-0 kubenswrapper[8606]: I1204 22:14:08.944101 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:08.944184 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:08.944184 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:08.944184 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:08.945010 master-0 kubenswrapper[8606]: I1204 22:14:08.944219 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:09.944175 master-0 kubenswrapper[8606]: I1204 22:14:09.944064 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:09.944175 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:09.944175 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:09.944175 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:09.944175 master-0 kubenswrapper[8606]: I1204 22:14:09.944163 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:10.944331 master-0 kubenswrapper[8606]: I1204 22:14:10.944216 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:10.944331 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:10.944331 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:10.944331 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:10.944331 master-0 kubenswrapper[8606]: I1204 22:14:10.944313 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:11.378218 master-0 kubenswrapper[8606]: E1204 22:14:11.378075 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:14:11.944834 master-0 kubenswrapper[8606]: I1204 22:14:11.944752 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:11.944834 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:11.944834 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:11.944834 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:11.946095 master-0 kubenswrapper[8606]: I1204 22:14:11.944850 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:12.323941 master-0 kubenswrapper[8606]: E1204 22:14:12.323737 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{router-default-5465c8b4db-8vm66.187e2290858068fd openshift-ingress 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-ingress,Name:router-default-5465c8b4db-8vm66,UID:c178afcf-b713-4c74-b22b-6169ba3123f5,APIVersion:v1,ResourceVersion:10249,FieldPath:spec.containers{router},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2b3d313c599852b3543ee5c3a62691bd2d1bbad12c2e1c610cd71a1dec6eea32\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:08:28.086233341 +0000 UTC m=+472.896535586,LastTimestamp:2025-12-04 22:11:14.06808349 +0000 UTC m=+638.878385715,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:14:12.944142 master-0 kubenswrapper[8606]: I1204 22:14:12.944065 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:12.944142 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:12.944142 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:12.944142 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:12.944753 master-0 kubenswrapper[8606]: I1204 22:14:12.944165 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:13.851153 master-0 kubenswrapper[8606]: I1204 22:14:13.851048 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/4.log" Dec 04 22:14:13.852423 master-0 kubenswrapper[8606]: I1204 22:14:13.852358 8606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/3.log" Dec 04 22:14:13.853140 master-0 kubenswrapper[8606]: I1204 22:14:13.853031 8606 generic.go:334] "Generic (PLEG): container finished" podID="addddaac-a31a-4dbf-b78f-87225b11b463" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" exitCode=1 Dec 04 22:14:13.853140 master-0 kubenswrapper[8606]: I1204 22:14:13.853099 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerDied","Data":"dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6"} Dec 04 22:14:13.853380 master-0 kubenswrapper[8606]: I1204 22:14:13.853194 8606 scope.go:117] "RemoveContainer" containerID="8aef2ad5ac5490ab1a81df3b63461f15674a4e79c00fcefd7f3c846aef27f271" Dec 04 22:14:13.854067 master-0 kubenswrapper[8606]: I1204 22:14:13.853981 8606 scope.go:117] "RemoveContainer" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" Dec 04 22:14:13.854537 master-0 kubenswrapper[8606]: E1204 22:14:13.854441 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:14:13.945349 master-0 kubenswrapper[8606]: I1204 22:14:13.945274 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:13.945349 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:13.945349 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:13.945349 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:13.945836 master-0 kubenswrapper[8606]: I1204 22:14:13.945364 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:14.864244 master-0 kubenswrapper[8606]: I1204 22:14:14.864143 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/4.log" Dec 04 22:14:14.944864 master-0 kubenswrapper[8606]: I1204 22:14:14.944802 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:14.944864 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:14.944864 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:14.944864 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:14.945267 master-0 kubenswrapper[8606]: I1204 22:14:14.944872 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" 
podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:15.235602 master-0 kubenswrapper[8606]: E1204 22:14:15.235435 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:15.944179 master-0 kubenswrapper[8606]: I1204 22:14:15.944097 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:15.944179 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:15.944179 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:15.944179 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:15.944838 master-0 kubenswrapper[8606]: I1204 22:14:15.944203 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:16.392059 master-0 kubenswrapper[8606]: I1204 22:14:16.391967 8606 scope.go:117] "RemoveContainer" containerID="3f6074b8213057dcfc74cc30ffdd750130f6000fde9639fd232853d1c9ba798d" Dec 04 22:14:16.392452 master-0 kubenswrapper[8606]: E1204 22:14:16.392287 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:14:16.600735 master-0 kubenswrapper[8606]: I1204 22:14:16.600646 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:34520->127.0.0.1:10357: read: connection reset by peer" start-of-body= Dec 04 22:14:16.600735 master-0 kubenswrapper[8606]: I1204 22:14:16.600725 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:34520->127.0.0.1:10357: read: connection reset by peer" Dec 04 22:14:16.601137 master-0 kubenswrapper[8606]: I1204 22:14:16.600799 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:14:16.603840 master-0 kubenswrapper[8606]: I1204 22:14:16.601674 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f87bf61657cd2e63ffdf4cd18cece9834c806ee679f521114bc6c30454ba3f55"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container 
cluster-policy-controller failed startup probe, will be restarted" Dec 04 22:14:16.603840 master-0 kubenswrapper[8606]: I1204 22:14:16.601776 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" containerID="cri-o://f87bf61657cd2e63ffdf4cd18cece9834c806ee679f521114bc6c30454ba3f55" gracePeriod=30 Dec 04 22:14:16.881723 master-0 kubenswrapper[8606]: I1204 22:14:16.881632 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/2.log" Dec 04 22:14:16.882291 master-0 kubenswrapper[8606]: I1204 22:14:16.882235 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/1.log" Dec 04 22:14:16.884653 master-0 kubenswrapper[8606]: I1204 22:14:16.884571 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:14:16.884836 master-0 kubenswrapper[8606]: I1204 22:14:16.884691 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="f87bf61657cd2e63ffdf4cd18cece9834c806ee679f521114bc6c30454ba3f55" exitCode=255 Dec 04 22:14:16.884836 master-0 kubenswrapper[8606]: I1204 22:14:16.884749 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerDied","Data":"f87bf61657cd2e63ffdf4cd18cece9834c806ee679f521114bc6c30454ba3f55"} Dec 04 22:14:16.884836 master-0 kubenswrapper[8606]: I1204 22:14:16.884801 8606 scope.go:117] "RemoveContainer" containerID="ce4dc99537b0548b822be5daff148472917250cef8953dfac3f2e8319903724f" Dec 04 22:14:16.944903 master-0 kubenswrapper[8606]: I1204 22:14:16.944836 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:16.944903 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:16.944903 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:16.944903 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:16.948332 master-0 kubenswrapper[8606]: I1204 22:14:16.944929 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:17.895185 master-0 kubenswrapper[8606]: I1204 22:14:17.895116 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/2.log" Dec 04 22:14:17.897476 master-0 kubenswrapper[8606]: I1204 22:14:17.897421 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:14:17.897572 
master-0 kubenswrapper[8606]: I1204 22:14:17.897533 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75"} Dec 04 22:14:17.944279 master-0 kubenswrapper[8606]: I1204 22:14:17.944174 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:17.944279 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:17.944279 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:17.944279 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:17.944633 master-0 kubenswrapper[8606]: I1204 22:14:17.944319 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:18.944423 master-0 kubenswrapper[8606]: I1204 22:14:18.944350 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:18.944423 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:18.944423 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:18.944423 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:18.945012 master-0 kubenswrapper[8606]: I1204 22:14:18.944425 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:19.947088 master-0 kubenswrapper[8606]: I1204 22:14:19.946987 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:19.947088 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:19.947088 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:19.947088 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:19.947088 master-0 kubenswrapper[8606]: I1204 22:14:19.947084 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:20.944821 master-0 kubenswrapper[8606]: I1204 22:14:20.944712 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:20.944821 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:20.944821 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:20.944821 master-0 kubenswrapper[8606]: healthz 
check failed Dec 04 22:14:20.945246 master-0 kubenswrapper[8606]: I1204 22:14:20.944818 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:21.945274 master-0 kubenswrapper[8606]: I1204 22:14:21.945163 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:21.945274 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:21.945274 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:21.945274 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:21.945274 master-0 kubenswrapper[8606]: I1204 22:14:21.945274 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:22.944948 master-0 kubenswrapper[8606]: I1204 22:14:22.944821 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:22.944948 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:22.944948 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:22.944948 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:22.945370 master-0 kubenswrapper[8606]: I1204 22:14:22.944973 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:23.945002 master-0 kubenswrapper[8606]: I1204 22:14:23.944851 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:23.945002 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:23.945002 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:23.945002 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:23.945002 master-0 kubenswrapper[8606]: I1204 22:14:23.944967 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:24.183947 master-0 kubenswrapper[8606]: I1204 22:14:24.183842 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:14:24.184232 master-0 kubenswrapper[8606]: I1204 22:14:24.184097 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:14:24.646679 master-0 kubenswrapper[8606]: E1204 22:14:24.645403 8606 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 04 22:14:24.943989 master-0 kubenswrapper[8606]: I1204 22:14:24.943773 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:24.943989 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:24.943989 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:24.943989 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:24.943989 master-0 kubenswrapper[8606]: I1204 22:14:24.943884 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:24.954894 master-0 kubenswrapper[8606]: I1204 22:14:24.954820 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:14:24.954894 master-0 kubenswrapper[8606]: I1204 22:14:24.954857 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:14:25.236728 master-0 kubenswrapper[8606]: E1204 22:14:25.236470 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:25.943866 master-0 kubenswrapper[8606]: I1204 22:14:25.943759 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:25.943866 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:25.943866 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:25.943866 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:25.943866 master-0 kubenswrapper[8606]: I1204 22:14:25.943822 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:26.944173 master-0 kubenswrapper[8606]: I1204 22:14:26.944094 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:26.944173 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:26.944173 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:26.944173 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:26.945276 master-0 kubenswrapper[8606]: I1204 22:14:26.944201 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Dec 04 22:14:27.184722 master-0 kubenswrapper[8606]: I1204 22:14:27.184605 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:14:27.184722 master-0 kubenswrapper[8606]: I1204 22:14:27.184707 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:27.944577 master-0 kubenswrapper[8606]: I1204 22:14:27.944444 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:27.944577 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:27.944577 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:27.944577 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:27.944577 master-0 kubenswrapper[8606]: I1204 22:14:27.944539 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:28.379587 master-0 kubenswrapper[8606]: E1204 22:14:28.379432 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:14:28.391185 master-0 kubenswrapper[8606]: I1204 22:14:28.391126 8606 scope.go:117] "RemoveContainer" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" Dec 04 22:14:28.391388 master-0 kubenswrapper[8606]: E1204 22:14:28.391349 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:14:28.943900 master-0 kubenswrapper[8606]: I1204 22:14:28.943796 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:28.943900 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:28.943900 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:28.943900 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:28.943900 master-0 kubenswrapper[8606]: I1204 22:14:28.943877 8606 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:29.943994 master-0 kubenswrapper[8606]: I1204 22:14:29.943935 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:29.943994 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:29.943994 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:29.943994 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:29.944756 master-0 kubenswrapper[8606]: I1204 22:14:29.944017 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:30.392123 master-0 kubenswrapper[8606]: I1204 22:14:30.392031 8606 scope.go:117] "RemoveContainer" containerID="3f6074b8213057dcfc74cc30ffdd750130f6000fde9639fd232853d1c9ba798d" Dec 04 22:14:30.944818 master-0 kubenswrapper[8606]: I1204 22:14:30.944725 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:30.944818 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:30.944818 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:30.944818 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:30.945533 master-0 kubenswrapper[8606]: I1204 22:14:30.944829 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:31.008947 master-0 kubenswrapper[8606]: I1204 22:14:31.008842 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/3.log" Dec 04 22:14:31.008947 master-0 kubenswrapper[8606]: I1204 22:14:31.008945 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780"} Dec 04 22:14:31.944063 master-0 kubenswrapper[8606]: I1204 22:14:31.944004 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:31.944063 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:31.944063 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:31.944063 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:31.944398 master-0 kubenswrapper[8606]: I1204 22:14:31.944072 8606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:32.944138 master-0 kubenswrapper[8606]: I1204 22:14:32.944026 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:32.944138 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:32.944138 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:32.944138 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:32.944138 master-0 kubenswrapper[8606]: I1204 22:14:32.944104 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:33.944565 master-0 kubenswrapper[8606]: I1204 22:14:33.944420 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:33.944565 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:33.944565 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:33.944565 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:33.945822 master-0 kubenswrapper[8606]: I1204 22:14:33.944561 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:34.944956 master-0 kubenswrapper[8606]: I1204 22:14:34.944870 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:34.944956 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:34.944956 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:34.944956 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:34.945853 master-0 kubenswrapper[8606]: I1204 22:14:34.944969 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:35.237327 master-0 kubenswrapper[8606]: E1204 22:14:35.237110 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:35.943493 master-0 kubenswrapper[8606]: I1204 22:14:35.943396 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 
22:14:35.943493 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:35.943493 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:35.943493 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:35.943493 master-0 kubenswrapper[8606]: I1204 22:14:35.943455 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:36.944681 master-0 kubenswrapper[8606]: I1204 22:14:36.944576 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:36.944681 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:36.944681 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:36.944681 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:36.944681 master-0 kubenswrapper[8606]: I1204 22:14:36.944672 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:37.185183 master-0 kubenswrapper[8606]: I1204 22:14:37.185047 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:14:37.185183 master-0 kubenswrapper[8606]: I1204 22:14:37.185139 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:37.943157 master-0 kubenswrapper[8606]: I1204 22:14:37.943099 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:37.943157 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:37.943157 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:37.943157 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:37.943727 master-0 kubenswrapper[8606]: I1204 22:14:37.943684 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:38.943824 master-0 kubenswrapper[8606]: I1204 22:14:38.943657 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Dec 04 22:14:38.943824 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:38.943824 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:38.943824 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:38.943824 master-0 kubenswrapper[8606]: I1204 22:14:38.943739 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:39.065366 master-0 kubenswrapper[8606]: I1204 22:14:39.065275 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/2.log" Dec 04 22:14:39.066342 master-0 kubenswrapper[8606]: I1204 22:14:39.066279 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/1.log" Dec 04 22:14:39.066899 master-0 kubenswrapper[8606]: I1204 22:14:39.066843 8606 generic.go:334] "Generic (PLEG): container finished" podID="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" containerID="45f93308614301d84cb0176bf1dabc3de4bbaffff580b91a3cb5db6707e27be7" exitCode=1 Dec 04 22:14:39.067062 master-0 kubenswrapper[8606]: I1204 22:14:39.066906 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerDied","Data":"45f93308614301d84cb0176bf1dabc3de4bbaffff580b91a3cb5db6707e27be7"} Dec 04 22:14:39.067062 master-0 kubenswrapper[8606]: I1204 22:14:39.066963 8606 scope.go:117] "RemoveContainer" containerID="669bc75f8a8081f19feaad95999690146addc9aad7304006d9c07b88b80a73be" Dec 04 22:14:39.069131 master-0 kubenswrapper[8606]: I1204 22:14:39.068313 8606 scope.go:117] "RemoveContainer" containerID="45f93308614301d84cb0176bf1dabc3de4bbaffff580b91a3cb5db6707e27be7" Dec 04 22:14:39.069131 master-0 kubenswrapper[8606]: E1204 22:14:39.068798 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-78f758c7b9-44srj_openshift-machine-api(a3899a38-39b8-4b48-81e5-4d8854ecc8ab)\"" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" podUID="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" Dec 04 22:14:39.392332 master-0 kubenswrapper[8606]: I1204 22:14:39.392250 8606 scope.go:117] "RemoveContainer" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" Dec 04 22:14:39.392785 master-0 kubenswrapper[8606]: E1204 22:14:39.392706 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:14:39.943954 master-0 kubenswrapper[8606]: I1204 22:14:39.943896 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:39.943954 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:39.943954 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:39.943954 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:39.943954 master-0 kubenswrapper[8606]: I1204 22:14:39.943957 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:40.079618 master-0 kubenswrapper[8606]: I1204 22:14:40.079534 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/2.log" Dec 04 22:14:40.945396 master-0 kubenswrapper[8606]: I1204 22:14:40.945342 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:40.945396 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:40.945396 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:40.945396 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:40.946035 master-0 kubenswrapper[8606]: I1204 22:14:40.945441 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:41.944297 master-0 kubenswrapper[8606]: I1204 22:14:41.944224 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:41.944297 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:41.944297 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:41.944297 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:41.945057 master-0 kubenswrapper[8606]: I1204 22:14:41.944989 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:42.944359 master-0 kubenswrapper[8606]: I1204 22:14:42.944295 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:42.944359 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:42.944359 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:42.944359 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:42.945270 master-0 kubenswrapper[8606]: I1204 22:14:42.944368 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:43.944449 master-0 kubenswrapper[8606]: I1204 22:14:43.944361 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:43.944449 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:43.944449 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:43.944449 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:43.945359 master-0 kubenswrapper[8606]: I1204 22:14:43.944457 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:44.944759 master-0 kubenswrapper[8606]: I1204 22:14:44.944683 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:44.944759 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:44.944759 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:44.944759 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:44.945912 master-0 kubenswrapper[8606]: I1204 22:14:44.944770 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:45.238096 master-0 kubenswrapper[8606]: E1204 22:14:45.237935 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:45.238096 master-0 kubenswrapper[8606]: E1204 22:14:45.237988 8606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 22:14:45.381285 master-0 kubenswrapper[8606]: E1204 22:14:45.381210 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:14:45.944122 master-0 kubenswrapper[8606]: I1204 22:14:45.944030 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:45.944122 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:45.944122 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:45.944122 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:45.944122 master-0 kubenswrapper[8606]: I1204 22:14:45.944093 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" 
podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:46.329393 master-0 kubenswrapper[8606]: E1204 22:14:46.329149 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e22357a9df8fe kube-system 9472 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:01:57 +0000 UTC,LastTimestamp:2025-12-04 22:11:20.394642636 +0000 UTC m=+645.204944881,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:14:46.943883 master-0 kubenswrapper[8606]: I1204 22:14:46.943782 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:46.943883 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:46.943883 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:46.943883 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:46.944756 master-0 kubenswrapper[8606]: I1204 22:14:46.943905 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:47.185157 master-0 kubenswrapper[8606]: I1204 22:14:47.185023 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:14:47.185157 master-0 kubenswrapper[8606]: I1204 22:14:47.185142 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:14:47.185538 master-0 kubenswrapper[8606]: I1204 22:14:47.185221 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:14:47.186334 master-0 kubenswrapper[8606]: I1204 22:14:47.186275 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75"} 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Dec 04 22:14:47.186482 master-0 kubenswrapper[8606]: I1204 22:14:47.186437 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" containerID="cri-o://7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" gracePeriod=30 Dec 04 22:14:47.308244 master-0 kubenswrapper[8606]: E1204 22:14:47.307879 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(fad55397ac8e23f218f25cb714ea5b2b)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" Dec 04 22:14:47.944937 master-0 kubenswrapper[8606]: I1204 22:14:47.944833 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:47.944937 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:47.944937 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:47.944937 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:47.945938 master-0 kubenswrapper[8606]: I1204 22:14:47.944972 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:48.140361 master-0 kubenswrapper[8606]: I1204 22:14:48.140269 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/3.log" Dec 04 22:14:48.141095 master-0 kubenswrapper[8606]: I1204 22:14:48.141017 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/2.log" Dec 04 22:14:48.143000 master-0 kubenswrapper[8606]: I1204 22:14:48.142941 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:14:48.143152 master-0 kubenswrapper[8606]: I1204 22:14:48.143037 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" exitCode=255 Dec 04 22:14:48.143152 master-0 kubenswrapper[8606]: I1204 22:14:48.143096 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerDied","Data":"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75"} Dec 04 22:14:48.143300 master-0 kubenswrapper[8606]: I1204 22:14:48.143171 8606 scope.go:117] "RemoveContainer" 
containerID="f87bf61657cd2e63ffdf4cd18cece9834c806ee679f521114bc6c30454ba3f55" Dec 04 22:14:48.143989 master-0 kubenswrapper[8606]: I1204 22:14:48.143928 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:14:48.144441 master-0 kubenswrapper[8606]: E1204 22:14:48.144369 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(fad55397ac8e23f218f25cb714ea5b2b)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" Dec 04 22:14:48.944175 master-0 kubenswrapper[8606]: I1204 22:14:48.944090 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:48.944175 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:48.944175 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:48.944175 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:48.944662 master-0 kubenswrapper[8606]: I1204 22:14:48.944194 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:49.153687 master-0 kubenswrapper[8606]: I1204 22:14:49.153643 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/3.log" Dec 04 22:14:49.156471 master-0 kubenswrapper[8606]: I1204 22:14:49.156417 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:14:49.944779 master-0 kubenswrapper[8606]: I1204 22:14:49.944703 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:49.944779 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:49.944779 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:49.944779 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:49.945396 master-0 kubenswrapper[8606]: I1204 22:14:49.944793 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:50.944736 master-0 kubenswrapper[8606]: I1204 22:14:50.944614 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:50.944736 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:50.944736 master-0 
kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:50.944736 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:50.944736 master-0 kubenswrapper[8606]: I1204 22:14:50.944730 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:51.392028 master-0 kubenswrapper[8606]: I1204 22:14:51.391896 8606 scope.go:117] "RemoveContainer" containerID="45f93308614301d84cb0176bf1dabc3de4bbaffff580b91a3cb5db6707e27be7" Dec 04 22:14:51.392590 master-0 kubenswrapper[8606]: E1204 22:14:51.392345 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-78f758c7b9-44srj_openshift-machine-api(a3899a38-39b8-4b48-81e5-4d8854ecc8ab)\"" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" podUID="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" Dec 04 22:14:51.944464 master-0 kubenswrapper[8606]: I1204 22:14:51.944362 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:51.944464 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:51.944464 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:51.944464 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:51.945472 master-0 kubenswrapper[8606]: I1204 22:14:51.944466 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:52.944009 master-0 kubenswrapper[8606]: I1204 22:14:52.943917 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:52.944009 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:52.944009 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:52.944009 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:52.944461 master-0 kubenswrapper[8606]: I1204 22:14:52.944027 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:53.944555 master-0 kubenswrapper[8606]: I1204 22:14:53.944484 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:53.944555 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:53.944555 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:53.944555 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:53.945464 master-0 kubenswrapper[8606]: 
I1204 22:14:53.944578 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:54.183851 master-0 kubenswrapper[8606]: I1204 22:14:54.183776 8606 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:14:54.185030 master-0 kubenswrapper[8606]: I1204 22:14:54.184992 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:14:54.185350 master-0 kubenswrapper[8606]: E1204 22:14:54.185303 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(fad55397ac8e23f218f25cb714ea5b2b)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" Dec 04 22:14:54.391751 master-0 kubenswrapper[8606]: I1204 22:14:54.391702 8606 scope.go:117] "RemoveContainer" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" Dec 04 22:14:54.391984 master-0 kubenswrapper[8606]: E1204 22:14:54.391952 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:14:54.944833 master-0 kubenswrapper[8606]: I1204 22:14:54.944729 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:54.944833 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:54.944833 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:54.944833 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:54.945885 master-0 kubenswrapper[8606]: I1204 22:14:54.944868 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:55.943838 master-0 kubenswrapper[8606]: I1204 22:14:55.943784 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:55.943838 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:55.943838 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:55.943838 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:55.944269 master-0 kubenswrapper[8606]: I1204 22:14:55.944226 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" 
podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:56.944036 master-0 kubenswrapper[8606]: I1204 22:14:56.943911 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:56.944036 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:56.944036 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:56.944036 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:56.944036 master-0 kubenswrapper[8606]: I1204 22:14:56.944030 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:57.946908 master-0 kubenswrapper[8606]: I1204 22:14:57.946791 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:57.946908 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:57.946908 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:57.946908 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:57.948121 master-0 kubenswrapper[8606]: I1204 22:14:57.946928 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:58.944640 master-0 kubenswrapper[8606]: I1204 22:14:58.944587 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:58.944640 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:58.944640 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:14:58.944640 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:58.945107 master-0 kubenswrapper[8606]: I1204 22:14:58.945072 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:14:58.957668 master-0 kubenswrapper[8606]: E1204 22:14:58.957574 8606 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 04 22:14:59.944877 master-0 kubenswrapper[8606]: I1204 22:14:59.944813 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:14:59.944877 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:14:59.944877 master-0 kubenswrapper[8606]: 
[+]process-running ok Dec 04 22:14:59.944877 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:14:59.944877 master-0 kubenswrapper[8606]: I1204 22:14:59.944874 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:00.945291 master-0 kubenswrapper[8606]: I1204 22:15:00.945048 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:00.945291 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:00.945291 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:00.945291 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:00.945291 master-0 kubenswrapper[8606]: I1204 22:15:00.945171 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:01.273336 master-0 kubenswrapper[8606]: I1204 22:15:01.273241 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/4.log" Dec 04 22:15:01.274176 master-0 kubenswrapper[8606]: I1204 22:15:01.274118 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/3.log" Dec 04 22:15:01.274332 master-0 kubenswrapper[8606]: I1204 22:15:01.274198 8606 generic.go:334] "Generic (PLEG): container finished" podID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780" exitCode=1 Dec 04 22:15:01.274332 master-0 kubenswrapper[8606]: I1204 22:15:01.274249 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerDied","Data":"27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780"} Dec 04 22:15:01.274332 master-0 kubenswrapper[8606]: I1204 22:15:01.274321 8606 scope.go:117] "RemoveContainer" containerID="3f6074b8213057dcfc74cc30ffdd750130f6000fde9639fd232853d1c9ba798d" Dec 04 22:15:01.275270 master-0 kubenswrapper[8606]: I1204 22:15:01.275201 8606 scope.go:117] "RemoveContainer" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780" Dec 04 22:15:01.275633 master-0 kubenswrapper[8606]: E1204 22:15:01.275537 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:15:01.944444 master-0 kubenswrapper[8606]: I1204 22:15:01.944326 8606 patch_prober.go:28] interesting 
pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:01.944444 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:01.944444 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:01.944444 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:01.944444 master-0 kubenswrapper[8606]: I1204 22:15:01.944432 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:01.995416 master-0 kubenswrapper[8606]: I1204 22:15:01.995291 8606 status_manager.go:851] "Failed to get status for pod" podUID="fad55397ac8e23f218f25cb714ea5b2b" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods kube-controller-manager-master-0)" Dec 04 22:15:02.284711 master-0 kubenswrapper[8606]: I1204 22:15:02.284578 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/4.log" Dec 04 22:15:02.382423 master-0 kubenswrapper[8606]: E1204 22:15:02.382318 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:15:02.944246 master-0 kubenswrapper[8606]: I1204 22:15:02.944148 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:02.944246 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:02.944246 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:02.944246 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:02.944826 master-0 kubenswrapper[8606]: I1204 22:15:02.944277 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:03.944221 master-0 kubenswrapper[8606]: I1204 22:15:03.944127 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:03.944221 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:03.944221 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:03.944221 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:03.945265 master-0 kubenswrapper[8606]: I1204 22:15:03.944236 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:04.944952 master-0 kubenswrapper[8606]: I1204 22:15:04.944844 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:04.944952 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:04.944952 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:04.944952 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:04.944952 master-0 kubenswrapper[8606]: I1204 22:15:04.944933 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:05.392427 master-0 kubenswrapper[8606]: I1204 22:15:05.391705 8606 scope.go:117] "RemoveContainer" containerID="45f93308614301d84cb0176bf1dabc3de4bbaffff580b91a3cb5db6707e27be7" Dec 04 22:15:05.392427 master-0 kubenswrapper[8606]: I1204 22:15:05.392194 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:15:05.396982 master-0 kubenswrapper[8606]: E1204 22:15:05.396757 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(fad55397ac8e23f218f25cb714ea5b2b)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" Dec 04 22:15:05.554660 master-0 kubenswrapper[8606]: E1204 22:15:05.553547 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:14:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:14:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:14:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T22:14:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:05.944697 master-0 kubenswrapper[8606]: I1204 22:15:05.944597 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:05.944697 master-0 kubenswrapper[8606]: [-]has-synced 
failed: reason withheld Dec 04 22:15:05.944697 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:05.944697 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:05.946059 master-0 kubenswrapper[8606]: I1204 22:15:05.944735 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:06.327939 master-0 kubenswrapper[8606]: I1204 22:15:06.323541 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/2.log" Dec 04 22:15:06.327939 master-0 kubenswrapper[8606]: I1204 22:15:06.324073 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"464365c5b542ffa135ae1b7b53dc0c8855618211c6ebb6a47be42bdf1f3e9e4e"} Dec 04 22:15:06.945454 master-0 kubenswrapper[8606]: I1204 22:15:06.945374 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:06.945454 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:06.945454 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:06.945454 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:06.947017 master-0 kubenswrapper[8606]: I1204 22:15:06.945485 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:07.392159 master-0 kubenswrapper[8606]: I1204 22:15:07.392061 8606 scope.go:117] "RemoveContainer" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" Dec 04 22:15:07.392483 master-0 kubenswrapper[8606]: E1204 22:15:07.392443 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:15:07.944540 master-0 kubenswrapper[8606]: I1204 22:15:07.944441 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:07.944540 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:07.944540 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:07.944540 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:07.944987 master-0 kubenswrapper[8606]: I1204 22:15:07.944558 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 04 22:15:08.944337 master-0 kubenswrapper[8606]: I1204 22:15:08.944256 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:08.944337 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:08.944337 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:08.944337 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:08.945214 master-0 kubenswrapper[8606]: I1204 22:15:08.944358 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:09.943938 master-0 kubenswrapper[8606]: I1204 22:15:09.943861 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:09.943938 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:09.943938 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:09.943938 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:09.944390 master-0 kubenswrapper[8606]: I1204 22:15:09.943971 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:10.944222 master-0 kubenswrapper[8606]: I1204 22:15:10.944118 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:10.944222 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:10.944222 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:10.944222 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:10.944222 master-0 kubenswrapper[8606]: I1204 22:15:10.944212 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:11.944557 master-0 kubenswrapper[8606]: I1204 22:15:11.944445 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:11.944557 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:11.944557 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:11.944557 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:11.944557 master-0 kubenswrapper[8606]: I1204 22:15:11.944554 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:12.944445 master-0 kubenswrapper[8606]: I1204 22:15:12.944360 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:12.944445 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:12.944445 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:12.944445 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:12.945089 master-0 kubenswrapper[8606]: I1204 22:15:12.944469 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:13.944969 master-0 kubenswrapper[8606]: I1204 22:15:13.944882 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:13.944969 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:13.944969 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:13.944969 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:13.946348 master-0 kubenswrapper[8606]: I1204 22:15:13.944980 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:14.393046 master-0 kubenswrapper[8606]: I1204 22:15:14.392944 8606 scope.go:117] "RemoveContainer" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780" Dec 04 22:15:14.393558 master-0 kubenswrapper[8606]: E1204 22:15:14.393435 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:15:14.944910 master-0 kubenswrapper[8606]: I1204 22:15:14.944825 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:14.944910 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:14.944910 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:14.944910 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:14.945989 master-0 kubenswrapper[8606]: I1204 22:15:14.944938 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:15.414872 master-0 kubenswrapper[8606]: I1204 22:15:15.414791 8606 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/3.log" Dec 04 22:15:15.416761 master-0 kubenswrapper[8606]: I1204 22:15:15.416705 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager-cert-syncer/0.log" Dec 04 22:15:15.417751 master-0 kubenswrapper[8606]: I1204 22:15:15.417711 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:15:15.417950 master-0 kubenswrapper[8606]: I1204 22:15:15.417805 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7" exitCode=1 Dec 04 22:15:15.417950 master-0 kubenswrapper[8606]: I1204 22:15:15.417869 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerDied","Data":"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7"} Dec 04 22:15:15.418751 master-0 kubenswrapper[8606]: I1204 22:15:15.418706 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:15:15.418751 master-0 kubenswrapper[8606]: I1204 22:15:15.418742 8606 scope.go:117] "RemoveContainer" containerID="f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7" Dec 04 22:15:15.554402 master-0 kubenswrapper[8606]: E1204 22:15:15.554240 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:15.672300 master-0 kubenswrapper[8606]: E1204 22:15:15.672184 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(fad55397ac8e23f218f25cb714ea5b2b)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" Dec 04 22:15:15.944058 master-0 kubenswrapper[8606]: I1204 22:15:15.943890 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:15.944058 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:15.944058 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:15.944058 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:15.944058 master-0 kubenswrapper[8606]: I1204 22:15:15.943979 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:16.429346 master-0 kubenswrapper[8606]: I1204 22:15:16.429261 
8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/3.log" Dec 04 22:15:16.430969 master-0 kubenswrapper[8606]: I1204 22:15:16.430920 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager-cert-syncer/0.log" Dec 04 22:15:16.431848 master-0 kubenswrapper[8606]: I1204 22:15:16.431800 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:15:16.431945 master-0 kubenswrapper[8606]: I1204 22:15:16.431870 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671"} Dec 04 22:15:16.432605 master-0 kubenswrapper[8606]: I1204 22:15:16.432538 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:15:16.432992 master-0 kubenswrapper[8606]: E1204 22:15:16.432923 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(fad55397ac8e23f218f25cb714ea5b2b)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" Dec 04 22:15:16.944023 master-0 kubenswrapper[8606]: I1204 22:15:16.943920 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:16.944023 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:16.944023 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:16.944023 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:16.944550 master-0 kubenswrapper[8606]: I1204 22:15:16.944026 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:17.944257 master-0 kubenswrapper[8606]: I1204 22:15:17.944106 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:17.944257 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:17.944257 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:17.944257 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:17.945393 master-0 kubenswrapper[8606]: I1204 22:15:17.944382 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Dec 04 22:15:18.946105 master-0 kubenswrapper[8606]: I1204 22:15:18.946021 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:18.946105 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:18.946105 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:18.946105 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:18.946906 master-0 kubenswrapper[8606]: I1204 22:15:18.946120 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:19.383134 master-0 kubenswrapper[8606]: E1204 22:15:19.382952 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:15:19.944462 master-0 kubenswrapper[8606]: I1204 22:15:19.944361 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:19.944462 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:19.944462 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:19.944462 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:19.945112 master-0 kubenswrapper[8606]: I1204 22:15:19.944537 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:20.333192 master-0 kubenswrapper[8606]: E1204 22:15:20.332998 8606 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e22358a407317 kube-system 9473 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:01:57 +0000 UTC,LastTimestamp:2025-12-04 22:11:20.696043895 +0000 UTC m=+645.506346150,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:15:20.391757 master-0 kubenswrapper[8606]: I1204 22:15:20.391678 8606 scope.go:117] "RemoveContainer" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" Dec 04 22:15:20.392080 master-0 kubenswrapper[8606]: E1204 22:15:20.392032 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: 
\"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:15:20.944389 master-0 kubenswrapper[8606]: I1204 22:15:20.944249 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:20.944389 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:20.944389 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:20.944389 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:20.944995 master-0 kubenswrapper[8606]: I1204 22:15:20.944394 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:21.945020 master-0 kubenswrapper[8606]: I1204 22:15:21.944922 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:21.945020 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:21.945020 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:21.945020 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:21.946100 master-0 kubenswrapper[8606]: I1204 22:15:21.945036 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:22.944301 master-0 kubenswrapper[8606]: I1204 22:15:22.944210 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:22.944301 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:22.944301 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:22.944301 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:22.944824 master-0 kubenswrapper[8606]: I1204 22:15:22.944316 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:23.944276 master-0 kubenswrapper[8606]: I1204 22:15:23.944174 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:23.944276 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:23.944276 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:23.944276 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:23.945106 master-0 
kubenswrapper[8606]: I1204 22:15:23.944320 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:24.944865 master-0 kubenswrapper[8606]: I1204 22:15:24.944753 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:24.944865 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:24.944865 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:24.944865 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:24.946195 master-0 kubenswrapper[8606]: I1204 22:15:24.944889 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:25.555768 master-0 kubenswrapper[8606]: E1204 22:15:25.555669 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:25.945851 master-0 kubenswrapper[8606]: I1204 22:15:25.945786 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:25.945851 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:25.945851 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:25.945851 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:25.947487 master-0 kubenswrapper[8606]: I1204 22:15:25.945881 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:26.944265 master-0 kubenswrapper[8606]: I1204 22:15:26.944196 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:26.944265 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:26.944265 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:26.944265 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:26.944690 master-0 kubenswrapper[8606]: I1204 22:15:26.944306 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:27.944845 master-0 kubenswrapper[8606]: I1204 22:15:27.944750 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:27.944845 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:27.944845 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:27.944845 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:27.946052 master-0 kubenswrapper[8606]: I1204 22:15:27.944877 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:28.944751 master-0 kubenswrapper[8606]: I1204 22:15:28.944643 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:28.944751 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:28.944751 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:28.944751 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:28.945809 master-0 kubenswrapper[8606]: I1204 22:15:28.944761 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:29.392872 master-0 kubenswrapper[8606]: I1204 22:15:29.392785 8606 scope.go:117] "RemoveContainer" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780" Dec 04 22:15:29.393284 master-0 kubenswrapper[8606]: E1204 22:15:29.393233 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:15:29.944328 master-0 kubenswrapper[8606]: I1204 22:15:29.944191 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:29.944328 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:29.944328 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:29.944328 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:29.944328 master-0 kubenswrapper[8606]: I1204 22:15:29.944307 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:30.393340 master-0 kubenswrapper[8606]: I1204 22:15:30.393213 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:15:30.857172 master-0 kubenswrapper[8606]: I1204 22:15:30.857089 8606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/3.log" Dec 04 22:15:30.859378 master-0 kubenswrapper[8606]: I1204 22:15:30.859305 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager-cert-syncer/0.log" Dec 04 22:15:30.859983 master-0 kubenswrapper[8606]: I1204 22:15:30.859941 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:15:30.860138 master-0 kubenswrapper[8606]: I1204 22:15:30.860058 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"fad55397ac8e23f218f25cb714ea5b2b","Type":"ContainerStarted","Data":"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628"} Dec 04 22:15:30.862569 master-0 kubenswrapper[8606]: I1204 22:15:30.862478 8606 generic.go:334] "Generic (PLEG): container finished" podID="ceb419e4-d804-4111-b8d8-8436cc2ee617" containerID="255af5caa519126130f0822a951a744700ae3fbbe4597a788a35633eb402cf2a" exitCode=0 Dec 04 22:15:30.862569 master-0 kubenswrapper[8606]: I1204 22:15:30.862533 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerDied","Data":"255af5caa519126130f0822a951a744700ae3fbbe4597a788a35633eb402cf2a"} Dec 04 22:15:30.862777 master-0 kubenswrapper[8606]: I1204 22:15:30.862579 8606 scope.go:117] "RemoveContainer" containerID="466a053aebc195d2f55d104f73cf9c35f09469c457c1576c051e6861f31f8a13" Dec 04 22:15:30.864762 master-0 kubenswrapper[8606]: I1204 22:15:30.864609 8606 scope.go:117] "RemoveContainer" containerID="255af5caa519126130f0822a951a744700ae3fbbe4597a788a35633eb402cf2a" Dec 04 22:15:30.945575 master-0 kubenswrapper[8606]: I1204 22:15:30.945442 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:30.945575 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:30.945575 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:30.945575 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:30.946034 master-0 kubenswrapper[8606]: I1204 22:15:30.945568 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:31.876702 master-0 kubenswrapper[8606]: I1204 22:15:31.876589 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerStarted","Data":"41a26220612508dc13a06873ec1bc278ee38a1762a54d12aed432fef2bd1f57f"} Dec 04 22:15:31.944330 master-0 kubenswrapper[8606]: I1204 22:15:31.944207 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:31.944330 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:31.944330 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:31.944330 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:31.944842 master-0 kubenswrapper[8606]: I1204 22:15:31.944347 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:32.944916 master-0 kubenswrapper[8606]: I1204 22:15:32.944836 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:32.944916 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:32.944916 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:32.944916 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:32.945922 master-0 kubenswrapper[8606]: I1204 22:15:32.944937 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:33.944495 master-0 kubenswrapper[8606]: I1204 22:15:33.944342 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:33.944495 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:33.944495 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:33.944495 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:33.944495 master-0 kubenswrapper[8606]: I1204 22:15:33.944455 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:34.184103 master-0 kubenswrapper[8606]: I1204 22:15:34.183978 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:15:34.184548 master-0 kubenswrapper[8606]: I1204 22:15:34.184441 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:15:34.392987 master-0 kubenswrapper[8606]: I1204 22:15:34.392897 8606 scope.go:117] "RemoveContainer" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" Dec 04 22:15:34.912194 master-0 kubenswrapper[8606]: I1204 22:15:34.912063 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/4.log" Dec 04 22:15:34.912911 master-0 kubenswrapper[8606]: I1204 22:15:34.912822 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" 
event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e"} Dec 04 22:15:34.944613 master-0 kubenswrapper[8606]: I1204 22:15:34.944462 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:34.944613 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:34.944613 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:34.944613 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:34.945950 master-0 kubenswrapper[8606]: I1204 22:15:34.944616 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:35.556755 master-0 kubenswrapper[8606]: E1204 22:15:35.556693 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:35.944427 master-0 kubenswrapper[8606]: I1204 22:15:35.944349 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:35.944427 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:35.944427 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:35.944427 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:35.944870 master-0 kubenswrapper[8606]: I1204 22:15:35.944451 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:36.356482 master-0 kubenswrapper[8606]: E1204 22:15:36.356414 8606 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0beb871c_3bf1_471c_a028_746a650267bf.slice/crio-3319d82050d7163f2a7f96d02081f2908fac76018c64e251ebfd0f4e73ccabfd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d68dcb1_efe4_425f_9b28_1e5575548a32.slice/crio-2f8f422694aa4bc57d4ecc64211f7f799287be11acc20da48bbd2da04f761575.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:15:36.385121 master-0 kubenswrapper[8606]: E1204 22:15:36.384983 8606 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 04 22:15:36.932857 master-0 kubenswrapper[8606]: I1204 22:15:36.932717 8606 generic.go:334] "Generic (PLEG): container finished" podID="465637a4-42be-4a65-a859-7af699960138" 
containerID="206992e5a976be25c0ca246941e52ef047963087cb7fb3d7fae48784f22c1968" exitCode=0 Dec 04 22:15:36.932857 master-0 kubenswrapper[8606]: I1204 22:15:36.932826 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerDied","Data":"206992e5a976be25c0ca246941e52ef047963087cb7fb3d7fae48784f22c1968"} Dec 04 22:15:36.933334 master-0 kubenswrapper[8606]: I1204 22:15:36.932948 8606 scope.go:117] "RemoveContainer" containerID="55ff2217087c08bbb5a594e4d764f860ce087d9b5069af52b9f8daf47ea1941f" Dec 04 22:15:36.933887 master-0 kubenswrapper[8606]: I1204 22:15:36.933792 8606 scope.go:117] "RemoveContainer" containerID="206992e5a976be25c0ca246941e52ef047963087cb7fb3d7fae48784f22c1968" Dec 04 22:15:36.935715 master-0 kubenswrapper[8606]: I1204 22:15:36.934971 8606 generic.go:334] "Generic (PLEG): container finished" podID="a8636bd7-fa9e-44b9-82df-9d37b398736d" containerID="74597696ddcec56d10e68ca1d29cef5d1bfa40762646f7e9dc8729a8c66636fd" exitCode=0 Dec 04 22:15:36.935715 master-0 kubenswrapper[8606]: I1204 22:15:36.935046 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" event={"ID":"a8636bd7-fa9e-44b9-82df-9d37b398736d","Type":"ContainerDied","Data":"74597696ddcec56d10e68ca1d29cef5d1bfa40762646f7e9dc8729a8c66636fd"} Dec 04 22:15:36.935715 master-0 kubenswrapper[8606]: I1204 22:15:36.935295 8606 scope.go:117] "RemoveContainer" containerID="74597696ddcec56d10e68ca1d29cef5d1bfa40762646f7e9dc8729a8c66636fd" Dec 04 22:15:36.942813 master-0 kubenswrapper[8606]: I1204 22:15:36.942733 8606 generic.go:334] "Generic (PLEG): container finished" podID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerID="577801a549fb8e8b80c730f6cb1e1c0076264ab71677f9e7afd6abe1e4f77036" exitCode=0 Dec 04 22:15:36.942955 master-0 kubenswrapper[8606]: I1204 22:15:36.942834 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerDied","Data":"577801a549fb8e8b80c730f6cb1e1c0076264ab71677f9e7afd6abe1e4f77036"} Dec 04 22:15:36.943707 master-0 kubenswrapper[8606]: I1204 22:15:36.943659 8606 scope.go:117] "RemoveContainer" containerID="577801a549fb8e8b80c730f6cb1e1c0076264ab71677f9e7afd6abe1e4f77036" Dec 04 22:15:36.946622 master-0 kubenswrapper[8606]: I1204 22:15:36.946567 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:36.946622 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:36.946622 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:36.946622 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:36.946976 master-0 kubenswrapper[8606]: I1204 22:15:36.946634 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:36.947569 master-0 kubenswrapper[8606]: I1204 22:15:36.947493 8606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-dcf7fc84b-qmhlw_a043ea49-97f9-4ae6-83b9-733f12754d94/cluster-storage-operator/0.log" Dec 04 22:15:36.947569 master-0 kubenswrapper[8606]: I1204 22:15:36.947564 8606 generic.go:334] "Generic (PLEG): container finished" podID="a043ea49-97f9-4ae6-83b9-733f12754d94" containerID="3e05aae4893276e8afe69a36c63278b1771095172b32eb2660a3d4bf0f266404" exitCode=0 Dec 04 22:15:36.947758 master-0 kubenswrapper[8606]: I1204 22:15:36.947658 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerDied","Data":"3e05aae4893276e8afe69a36c63278b1771095172b32eb2660a3d4bf0f266404"} Dec 04 22:15:36.948332 master-0 kubenswrapper[8606]: I1204 22:15:36.948277 8606 scope.go:117] "RemoveContainer" containerID="3e05aae4893276e8afe69a36c63278b1771095172b32eb2660a3d4bf0f266404" Dec 04 22:15:36.950941 master-0 kubenswrapper[8606]: I1204 22:15:36.950877 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-85cff47f46-4dv2b_0beb871c-3bf1-471c-a028-746a650267bf/cluster-node-tuning-operator/0.log" Dec 04 22:15:36.950941 master-0 kubenswrapper[8606]: I1204 22:15:36.950925 8606 generic.go:334] "Generic (PLEG): container finished" podID="0beb871c-3bf1-471c-a028-746a650267bf" containerID="3319d82050d7163f2a7f96d02081f2908fac76018c64e251ebfd0f4e73ccabfd" exitCode=1 Dec 04 22:15:36.951432 master-0 kubenswrapper[8606]: I1204 22:15:36.951353 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" event={"ID":"0beb871c-3bf1-471c-a028-746a650267bf","Type":"ContainerDied","Data":"3319d82050d7163f2a7f96d02081f2908fac76018c64e251ebfd0f4e73ccabfd"} Dec 04 22:15:36.952157 master-0 kubenswrapper[8606]: I1204 22:15:36.952094 8606 scope.go:117] "RemoveContainer" containerID="3319d82050d7163f2a7f96d02081f2908fac76018c64e251ebfd0f4e73ccabfd" Dec 04 22:15:36.956366 master-0 kubenswrapper[8606]: I1204 22:15:36.956285 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/3.log" Dec 04 22:15:36.957145 master-0 kubenswrapper[8606]: I1204 22:15:36.957068 8606 generic.go:334] "Generic (PLEG): container finished" podID="810c363b-a4c7-428d-a2fb-285adc29f477" containerID="cc16e00745629c42a3771375fb21884cb52774d57fe0324c10a8680dd9a3742b" exitCode=0 Dec 04 22:15:36.957294 master-0 kubenswrapper[8606]: I1204 22:15:36.957131 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerDied","Data":"cc16e00745629c42a3771375fb21884cb52774d57fe0324c10a8680dd9a3742b"} Dec 04 22:15:36.958369 master-0 kubenswrapper[8606]: I1204 22:15:36.958299 8606 scope.go:117] "RemoveContainer" containerID="cc16e00745629c42a3771375fb21884cb52774d57fe0324c10a8680dd9a3742b" Dec 04 22:15:36.980765 master-0 kubenswrapper[8606]: I1204 22:15:36.978599 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-67477646d4-bslb5_813f3ee7-35b5-4ee8-b453-00d16d910eae/package-server-manager/0.log" Dec 04 22:15:36.981025 master-0 kubenswrapper[8606]: I1204 
22:15:36.980865 8606 generic.go:334] "Generic (PLEG): container finished" podID="813f3ee7-35b5-4ee8-b453-00d16d910eae" containerID="f11072c38e40de60dafeffc2c5ef9e1780820ca0ce672700aba155fa414fe72c" exitCode=1 Dec 04 22:15:36.981129 master-0 kubenswrapper[8606]: I1204 22:15:36.980993 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerDied","Data":"f11072c38e40de60dafeffc2c5ef9e1780820ca0ce672700aba155fa414fe72c"} Dec 04 22:15:36.982304 master-0 kubenswrapper[8606]: I1204 22:15:36.982208 8606 scope.go:117] "RemoveContainer" containerID="f11072c38e40de60dafeffc2c5ef9e1780820ca0ce672700aba155fa414fe72c" Dec 04 22:15:36.985324 master-0 kubenswrapper[8606]: I1204 22:15:36.985278 8606 generic.go:334] "Generic (PLEG): container finished" podID="4d68dcb1-efe4-425f-9b28-1e5575548a32" containerID="2f8f422694aa4bc57d4ecc64211f7f799287be11acc20da48bbd2da04f761575" exitCode=0 Dec 04 22:15:36.985430 master-0 kubenswrapper[8606]: I1204 22:15:36.985345 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" event={"ID":"4d68dcb1-efe4-425f-9b28-1e5575548a32","Type":"ContainerDied","Data":"2f8f422694aa4bc57d4ecc64211f7f799287be11acc20da48bbd2da04f761575"} Dec 04 22:15:36.986443 master-0 kubenswrapper[8606]: I1204 22:15:36.986378 8606 scope.go:117] "RemoveContainer" containerID="2f8f422694aa4bc57d4ecc64211f7f799287be11acc20da48bbd2da04f761575" Dec 04 22:15:36.991258 master-0 kubenswrapper[8606]: I1204 22:15:36.990689 8606 generic.go:334] "Generic (PLEG): container finished" podID="35821f48-b000-4915-847f-a739b6efc5ee" containerID="50388f7f0e80879ab901f5d4bca3a6dc9ea1a39437e9578943365b4294d8b25d" exitCode=0 Dec 04 22:15:36.991258 master-0 kubenswrapper[8606]: I1204 22:15:36.990805 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" event={"ID":"35821f48-b000-4915-847f-a739b6efc5ee","Type":"ContainerDied","Data":"50388f7f0e80879ab901f5d4bca3a6dc9ea1a39437e9578943365b4294d8b25d"} Dec 04 22:15:36.992060 master-0 kubenswrapper[8606]: I1204 22:15:36.991693 8606 scope.go:117] "RemoveContainer" containerID="50388f7f0e80879ab901f5d4bca3a6dc9ea1a39437e9578943365b4294d8b25d" Dec 04 22:15:37.002207 master-0 kubenswrapper[8606]: I1204 22:15:37.001138 8606 scope.go:117] "RemoveContainer" containerID="b24a52101599e57bc25b6c160a06c23124bc447eb919bdd2267f0b91d0f6aaee" Dec 04 22:15:37.146123 master-0 kubenswrapper[8606]: I1204 22:15:37.146077 8606 scope.go:117] "RemoveContainer" containerID="89b06e692941a98417c89bb9068d2a41907167dc117005a2191cab580a9cd940" Dec 04 22:15:37.184840 master-0 kubenswrapper[8606]: I1204 22:15:37.184744 8606 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:15:37.184951 master-0 kubenswrapper[8606]: I1204 22:15:37.184822 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:37.944712 master-0 kubenswrapper[8606]: I1204 22:15:37.944604 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:37.944712 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:37.944712 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:37.944712 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:37.945820 master-0 kubenswrapper[8606]: I1204 22:15:37.944730 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:38.002600 master-0 kubenswrapper[8606]: I1204 22:15:38.002487 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"edea31e1fa600a4ae379a373cf8ee62b0384aee39c0285d544acdd5941a71cf8"} Dec 04 22:15:38.002935 master-0 kubenswrapper[8606]: I1204 22:15:38.002815 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:15:38.005710 master-0 kubenswrapper[8606]: I1204 22:15:38.005648 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" event={"ID":"35821f48-b000-4915-847f-a739b6efc5ee","Type":"ContainerStarted","Data":"158753a86c5c01314d89c2674122c45b776d4868ad7bb53382d3dcedd2977cf8"} Dec 04 22:15:38.009042 master-0 kubenswrapper[8606]: I1204 22:15:38.008958 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerStarted","Data":"f60afa3e3fc7300e00d2058f8d9e1e5f1ef0102a3ee02b211373a7e4b06ab2d2"} Dec 04 22:15:38.009448 master-0 kubenswrapper[8606]: I1204 22:15:38.009382 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:15:38.012432 master-0 kubenswrapper[8606]: I1204 22:15:38.012339 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerStarted","Data":"5f1510dd754c8015c3efa13b31a418b33c7dc7f1e77672856392caad09ab716a"} Dec 04 22:15:38.016332 master-0 kubenswrapper[8606]: I1204 22:15:38.016292 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerStarted","Data":"ade905f65dc817c49631ae16c039b2f7a28b57152bfbb968cd152562a26b9a76"} Dec 04 22:15:38.020945 master-0 kubenswrapper[8606]: I1204 22:15:38.019332 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-85cff47f46-4dv2b_0beb871c-3bf1-471c-a028-746a650267bf/cluster-node-tuning-operator/0.log" Dec 04 22:15:38.020945 
master-0 kubenswrapper[8606]: I1204 22:15:38.019498 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" event={"ID":"0beb871c-3bf1-471c-a028-746a650267bf","Type":"ContainerStarted","Data":"ca5799c309b09795ff95214f2ec9158f268801b85d2051e30751956963a75745"} Dec 04 22:15:38.021633 master-0 kubenswrapper[8606]: I1204 22:15:38.021593 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" event={"ID":"4d68dcb1-efe4-425f-9b28-1e5575548a32","Type":"ContainerStarted","Data":"db27458e6b27bc7ef79661747271bca2ab81c5f5d722426e70bfbf3ba534f396"} Dec 04 22:15:38.024209 master-0 kubenswrapper[8606]: I1204 22:15:38.024174 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-67477646d4-bslb5_813f3ee7-35b5-4ee8-b453-00d16d910eae/package-server-manager/0.log" Dec 04 22:15:38.024786 master-0 kubenswrapper[8606]: I1204 22:15:38.024730 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerStarted","Data":"34e81e1de548a0f7a7581ef26a98220c98e277cec852a516b3aef35984f983d0"} Dec 04 22:15:38.025179 master-0 kubenswrapper[8606]: I1204 22:15:38.025125 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:15:38.027108 master-0 kubenswrapper[8606]: I1204 22:15:38.027071 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" event={"ID":"a8636bd7-fa9e-44b9-82df-9d37b398736d","Type":"ContainerStarted","Data":"ecb995ffd687db0f9c53116cb470abf630b360b0cf85e7d21f6f2cc7513d1f11"} Dec 04 22:15:38.945024 master-0 kubenswrapper[8606]: I1204 22:15:38.944919 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:38.945024 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:38.945024 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:38.945024 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:38.946084 master-0 kubenswrapper[8606]: I1204 22:15:38.945037 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:39.010460 master-0 kubenswrapper[8606]: I1204 22:15:39.010276 8606 patch_prober.go:28] interesting pod/route-controller-manager-9db9db957-zdrjg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:15:39.010460 master-0 kubenswrapper[8606]: I1204 22:15:39.010435 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" 
probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:39.943923 master-0 kubenswrapper[8606]: I1204 22:15:39.943823 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:39.943923 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:39.943923 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:39.943923 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:39.944550 master-0 kubenswrapper[8606]: I1204 22:15:39.943952 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:40.035858 master-0 kubenswrapper[8606]: I1204 22:15:40.035720 8606 patch_prober.go:28] interesting pod/route-controller-manager-9db9db957-zdrjg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:15:40.036568 master-0 kubenswrapper[8606]: I1204 22:15:40.035886 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:40.055343 master-0 kubenswrapper[8606]: I1204 22:15:40.055225 8606 generic.go:334] "Generic (PLEG): container finished" podID="46229484-5fa1-4595-94a0-44477abae90e" containerID="43708b3c1f0fc23d49f5b68e72b2abf6c84e81e5b8fe673a0fccaff92e14b81d" exitCode=0 Dec 04 22:15:40.055343 master-0 kubenswrapper[8606]: I1204 22:15:40.055335 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerDied","Data":"43708b3c1f0fc23d49f5b68e72b2abf6c84e81e5b8fe673a0fccaff92e14b81d"} Dec 04 22:15:40.055843 master-0 kubenswrapper[8606]: I1204 22:15:40.055769 8606 scope.go:117] "RemoveContainer" containerID="c77537fc4f2900520f8e93c8fc7a9508c178081936170d16a0dcd4122f2c7777" Dec 04 22:15:40.056544 master-0 kubenswrapper[8606]: I1204 22:15:40.056470 8606 scope.go:117] "RemoveContainer" containerID="43708b3c1f0fc23d49f5b68e72b2abf6c84e81e5b8fe673a0fccaff92e14b81d" Dec 04 22:15:40.944742 master-0 kubenswrapper[8606]: I1204 22:15:40.944657 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:40.944742 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:40.944742 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:40.944742 master-0 kubenswrapper[8606]: healthz check failed 
Dec 04 22:15:40.944742 master-0 kubenswrapper[8606]: I1204 22:15:40.944744 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:41.069416 master-0 kubenswrapper[8606]: I1204 22:15:41.069314 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerStarted","Data":"c83311c437c02b54931377c7c49f736c6deca7ea65c74397bdf6ab810158ea6e"} Dec 04 22:15:41.944745 master-0 kubenswrapper[8606]: I1204 22:15:41.944661 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:41.944745 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:41.944745 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:41.944745 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:41.945181 master-0 kubenswrapper[8606]: I1204 22:15:41.944752 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:42.596775 master-0 kubenswrapper[8606]: I1204 22:15:42.596633 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:15:42.597409 master-0 kubenswrapper[8606]: I1204 22:15:42.596815 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:42.597409 master-0 kubenswrapper[8606]: I1204 22:15:42.596710 8606 patch_prober.go:28] interesting pod/openshift-config-operator-68758cbcdb-fg6vx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:15:42.597409 master-0 kubenswrapper[8606]: I1204 22:15:42.597293 8606 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" podUID="810c363b-a4c7-428d-a2fb-285adc29f477" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:42.945575 master-0 kubenswrapper[8606]: I1204 22:15:42.945453 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:42.945575 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:42.945575 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:42.945575 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:42.946081 master-0 kubenswrapper[8606]: I1204 22:15:42.945598 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:43.093472 master-0 kubenswrapper[8606]: I1204 22:15:43.093362 8606 generic.go:334] "Generic (PLEG): container finished" podID="690b447a-19c0-4925-bc9d-d0c86a83a377" containerID="a2b4230e9757af974dea8f15ae2efeb4c125ff55b0c9cdc6e359f1aae71c8941" exitCode=0 Dec 04 22:15:43.093815 master-0 kubenswrapper[8606]: I1204 22:15:43.093470 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerDied","Data":"a2b4230e9757af974dea8f15ae2efeb4c125ff55b0c9cdc6e359f1aae71c8941"} Dec 04 22:15:43.093815 master-0 kubenswrapper[8606]: I1204 22:15:43.093642 8606 scope.go:117] "RemoveContainer" containerID="f701b6e27b366f9b3e2d799e563c87e892e7b625684a50d11abda6232179d479" Dec 04 22:15:43.094326 master-0 kubenswrapper[8606]: I1204 22:15:43.094270 8606 scope.go:117] "RemoveContainer" containerID="a2b4230e9757af974dea8f15ae2efeb4c125ff55b0c9cdc6e359f1aae71c8941" Dec 04 22:15:43.392233 master-0 kubenswrapper[8606]: I1204 22:15:43.392164 8606 scope.go:117] "RemoveContainer" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780" Dec 04 22:15:43.392606 master-0 kubenswrapper[8606]: E1204 22:15:43.392564 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:15:43.944625 master-0 kubenswrapper[8606]: I1204 22:15:43.944465 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:43.944625 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:43.944625 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:43.944625 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:43.945645 master-0 kubenswrapper[8606]: I1204 22:15:43.944661 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:44.106352 master-0 kubenswrapper[8606]: I1204 22:15:44.106253 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerStarted","Data":"d5f941a3f84766c224cb550f03c9798dd97a48eb4d1b0dc82d2ca740885ed464"} Dec 04 22:15:44.706809 master-0 kubenswrapper[8606]: I1204 22:15:44.706682 8606 patch_prober.go:28] interesting pod/route-controller-manager-9db9db957-zdrjg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 22:15:44.707176 master-0 kubenswrapper[8606]: I1204 22:15:44.706806 8606 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:44.944242 master-0 kubenswrapper[8606]: I1204 22:15:44.944187 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:44.944242 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:44.944242 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:44.944242 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:44.944659 master-0 kubenswrapper[8606]: I1204 22:15:44.944268 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:45.114267 master-0 kubenswrapper[8606]: I1204 22:15:45.111907 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:15:45.128547 master-0 kubenswrapper[8606]: I1204 22:15:45.127730 8606 generic.go:334] "Generic (PLEG): container finished" podID="55c4f1e1-1b78-45ec-915d-8055ab3e2786" containerID="1326348e3e2d8bc1da0a2a933c011acb7d00a92e48a2890745437b6f15960271" exitCode=0 Dec 04 22:15:45.128547 master-0 kubenswrapper[8606]: I1204 22:15:45.127819 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerDied","Data":"1326348e3e2d8bc1da0a2a933c011acb7d00a92e48a2890745437b6f15960271"} Dec 04 22:15:45.129727 master-0 kubenswrapper[8606]: I1204 22:15:45.129168 8606 scope.go:117] "RemoveContainer" containerID="1326348e3e2d8bc1da0a2a933c011acb7d00a92e48a2890745437b6f15960271" Dec 04 22:15:45.129990 master-0 kubenswrapper[8606]: I1204 22:15:45.129955 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:15:45.130124 master-0 kubenswrapper[8606]: I1204 22:15:45.130096 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:15:45.132320 master-0 kubenswrapper[8606]: I1204 22:15:45.132194 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" event={"ID":"e37d318a-5bf8-46ed-b6de-494102738da7","Type":"ContainerDied","Data":"79e1b635bb095edbd66388094922c6134f767f1a2efc7b3eca6e45abd8f571c6"} Dec 04 22:15:45.132716 master-0 kubenswrapper[8606]: I1204 22:15:45.132594 8606 generic.go:334] "Generic (PLEG): container finished" podID="e37d318a-5bf8-46ed-b6de-494102738da7" containerID="79e1b635bb095edbd66388094922c6134f767f1a2efc7b3eca6e45abd8f571c6" exitCode=0 Dec 04 22:15:45.132899 master-0 kubenswrapper[8606]: I1204 22:15:45.132844 8606 scope.go:117] "RemoveContainer" containerID="79e1b635bb095edbd66388094922c6134f767f1a2efc7b3eca6e45abd8f571c6" Dec 04 22:15:45.143038 master-0 kubenswrapper[8606]: I1204 22:15:45.142979 8606 generic.go:334] "Generic (PLEG): container finished" podID="ebfbb13d-c3f2-476d-bd89-cb8a13d2acee" containerID="989d170e7a52318dc012c9f9d9615ad932d427b1e6a54fccb0b6d83f4ebb7d45" exitCode=0 Dec 04 22:15:45.143150 master-0 kubenswrapper[8606]: I1204 22:15:45.143061 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerDied","Data":"989d170e7a52318dc012c9f9d9615ad932d427b1e6a54fccb0b6d83f4ebb7d45"} Dec 04 22:15:45.143852 master-0 kubenswrapper[8606]: I1204 22:15:45.143825 8606 scope.go:117] "RemoveContainer" containerID="989d170e7a52318dc012c9f9d9615ad932d427b1e6a54fccb0b6d83f4ebb7d45" Dec 04 22:15:45.561133 master-0 kubenswrapper[8606]: E1204 22:15:45.561074 8606 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:15:45.561133 master-0 kubenswrapper[8606]: E1204 22:15:45.561110 8606 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 22:15:45.946694 master-0 kubenswrapper[8606]: I1204 22:15:45.946592 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:45.946694 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:45.946694 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:45.946694 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:45.947831 master-0 kubenswrapper[8606]: I1204 22:15:45.946699 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:46.154032 master-0 kubenswrapper[8606]: I1204 22:15:46.153938 8606 generic.go:334] "Generic (PLEG): container finished" podID="a544105a-5bec-456a-aef6-c160943c1f67" containerID="419aa165bec1fc028d5393e03a6724568d6a2d80fb3a00accfe0a6d847f186e7" exitCode=0 Dec 04 22:15:46.154382 master-0 kubenswrapper[8606]: I1204 22:15:46.154020 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerDied","Data":"419aa165bec1fc028d5393e03a6724568d6a2d80fb3a00accfe0a6d847f186e7"} Dec 04 22:15:46.154382 master-0 kubenswrapper[8606]: I1204 22:15:46.154143 8606 scope.go:117] "RemoveContainer" containerID="9820de7c24faf6bdc5aac51f81548f854bf3fa05b1f8fd46fe8346195ddc8ca4" Dec 04 22:15:46.154897 master-0 kubenswrapper[8606]: I1204 22:15:46.154795 8606 scope.go:117] "RemoveContainer" containerID="419aa165bec1fc028d5393e03a6724568d6a2d80fb3a00accfe0a6d847f186e7" Dec 04 22:15:46.157385 master-0 kubenswrapper[8606]: I1204 22:15:46.157128 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerStarted","Data":"63e7406a61d2d8793cce7442c337061b250815115e1952c3a631583a81662033"} Dec 04 22:15:46.160422 master-0 kubenswrapper[8606]: I1204 22:15:46.160359 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-jb4xf_7f091088-2166-4026-9fa6-62bd83407edb/openshift-controller-manager-operator/1.log" Dec 04 22:15:46.160422 master-0 kubenswrapper[8606]: I1204 22:15:46.160409 8606 generic.go:334] "Generic (PLEG): container finished" podID="7f091088-2166-4026-9fa6-62bd83407edb" containerID="437ce0db468372672eda2ac00bd5f2a8af4827f3a8e23b48967061bc95032bfb" exitCode=0 Dec 04 22:15:46.160912 master-0 kubenswrapper[8606]: I1204 22:15:46.160526 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerDied","Data":"437ce0db468372672eda2ac00bd5f2a8af4827f3a8e23b48967061bc95032bfb"} Dec 04 22:15:46.161337 master-0 kubenswrapper[8606]: I1204 22:15:46.161205 8606 scope.go:117] "RemoveContainer" containerID="437ce0db468372672eda2ac00bd5f2a8af4827f3a8e23b48967061bc95032bfb" Dec 04 22:15:46.165806 master-0 kubenswrapper[8606]: I1204 22:15:46.165682 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" event={"ID":"e37d318a-5bf8-46ed-b6de-494102738da7","Type":"ContainerStarted","Data":"b3dfadb55c93406611410f9ae78bf9ce21b1ad64df79b0b3e8022aaceefccc9f"} Dec 04 22:15:46.174202 master-0 kubenswrapper[8606]: I1204 22:15:46.174147 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-bm2pk_f893663c-7c1e-4eda-9839-99c1c0440304/authentication-operator/1.log" Dec 04 22:15:46.174397 master-0 kubenswrapper[8606]: I1204 22:15:46.174229 8606 generic.go:334] "Generic (PLEG): container finished" podID="f893663c-7c1e-4eda-9839-99c1c0440304" containerID="4325f17544835c85c6a52e1b1f681caef960ed59819852b48ea3b6353d61e1b5" exitCode=0 Dec 04 22:15:46.174397 master-0 kubenswrapper[8606]: I1204 22:15:46.174330 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerDied","Data":"4325f17544835c85c6a52e1b1f681caef960ed59819852b48ea3b6353d61e1b5"} Dec 04 22:15:46.175021 master-0 kubenswrapper[8606]: I1204 22:15:46.174986 8606 scope.go:117] "RemoveContainer" 
containerID="4325f17544835c85c6a52e1b1f681caef960ed59819852b48ea3b6353d61e1b5" Dec 04 22:15:46.178326 master-0 kubenswrapper[8606]: I1204 22:15:46.178285 8606 generic.go:334] "Generic (PLEG): container finished" podID="800f436c-145d-4281-8d4d-644ba2cb0ebb" containerID="66fa513342b7f47d4f807e0f29b5398451337b70d0aea1cac07ea70f754f3d14" exitCode=0 Dec 04 22:15:46.180375 master-0 kubenswrapper[8606]: I1204 22:15:46.178355 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerDied","Data":"66fa513342b7f47d4f807e0f29b5398451337b70d0aea1cac07ea70f754f3d14"} Dec 04 22:15:46.181412 master-0 kubenswrapper[8606]: I1204 22:15:46.181372 8606 scope.go:117] "RemoveContainer" containerID="66fa513342b7f47d4f807e0f29b5398451337b70d0aea1cac07ea70f754f3d14" Dec 04 22:15:46.183072 master-0 kubenswrapper[8606]: I1204 22:15:46.182920 8606 generic.go:334] "Generic (PLEG): container finished" podID="24648a41-875f-4e98-8b21-3bdd38dffa32" containerID="0e28b0cb43f14ec5571ccac1eddb3ba4fa32d2d2a461323c61e5fc655cd04d29" exitCode=0 Dec 04 22:15:46.183204 master-0 kubenswrapper[8606]: I1204 22:15:46.183099 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerDied","Data":"0e28b0cb43f14ec5571ccac1eddb3ba4fa32d2d2a461323c61e5fc655cd04d29"} Dec 04 22:15:46.184048 master-0 kubenswrapper[8606]: I1204 22:15:46.183992 8606 scope.go:117] "RemoveContainer" containerID="0e28b0cb43f14ec5571ccac1eddb3ba4fa32d2d2a461323c61e5fc655cd04d29" Dec 04 22:15:46.190345 master-0 kubenswrapper[8606]: I1204 22:15:46.190219 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerStarted","Data":"8e8fdeb690fdc94cc15bed644ba01ada6bb64532ff187c42fd17e48621eea529"} Dec 04 22:15:46.262882 master-0 kubenswrapper[8606]: I1204 22:15:46.262835 8606 scope.go:117] "RemoveContainer" containerID="3c8faa0cec9898a47039ead85f90eab240ebf83ecd040f53acd3c80c7bec151c" Dec 04 22:15:46.376226 master-0 kubenswrapper[8606]: I1204 22:15:46.376182 8606 scope.go:117] "RemoveContainer" containerID="980d371480e7b1b6a921dafc58f3636bd451634500e5c1c642030a39e001a8a8" Dec 04 22:15:46.486924 master-0 kubenswrapper[8606]: I1204 22:15:46.486862 8606 scope.go:117] "RemoveContainer" containerID="cb9981e4dfed9821dbae6b8b7a8e8e8f099f873bacacc6149961ccf58995e524" Dec 04 22:15:46.944767 master-0 kubenswrapper[8606]: I1204 22:15:46.944692 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:46.944767 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:46.944767 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:46.944767 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:46.945158 master-0 kubenswrapper[8606]: I1204 22:15:46.944793 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 04 22:15:47.199954 master-0 kubenswrapper[8606]: I1204 22:15:47.199794 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerStarted","Data":"b15cd21f3f4e9fee4d49615520e3cb875b8d92374b9511d0ad4dc25bdd542ba5"} Dec 04 22:15:47.203264 master-0 kubenswrapper[8606]: I1204 22:15:47.203195 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerStarted","Data":"0fa9349d6f2854385acfa0a5510d95ae0764adbe71a4b80f503e71769e643a05"} Dec 04 22:15:47.206425 master-0 kubenswrapper[8606]: I1204 22:15:47.206386 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerStarted","Data":"a0dd18e9d53c04bd711b1f1f59edffb36005d2ce1832890d22002f1e0075036b"} Dec 04 22:15:47.209861 master-0 kubenswrapper[8606]: I1204 22:15:47.209805 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerStarted","Data":"d3af1e9bd4b92fa430f21c986760b6d7883d675546cc4919c1dfaa7e778ab068"} Dec 04 22:15:47.212966 master-0 kubenswrapper[8606]: I1204 22:15:47.212895 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerStarted","Data":"3879333ec3106dc5e5897a72b9f25043008a8c4c0e423bb23d06f11ee99e9552"} Dec 04 22:15:47.944074 master-0 kubenswrapper[8606]: I1204 22:15:47.943995 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:47.944074 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:47.944074 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:47.944074 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:47.944589 master-0 kubenswrapper[8606]: I1204 22:15:47.944088 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:48.943406 master-0 kubenswrapper[8606]: I1204 22:15:48.943323 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:48.943406 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:48.943406 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:48.943406 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:48.944410 master-0 kubenswrapper[8606]: I1204 22:15:48.943419 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:49.944179 master-0 kubenswrapper[8606]: I1204 22:15:49.944091 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:49.944179 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:49.944179 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:49.944179 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:49.944994 master-0 kubenswrapper[8606]: I1204 22:15:49.944217 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:50.944816 master-0 kubenswrapper[8606]: I1204 22:15:50.944711 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:50.944816 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:50.944816 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:50.944816 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:50.945798 master-0 kubenswrapper[8606]: I1204 22:15:50.944824 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:51.944226 master-0 kubenswrapper[8606]: I1204 22:15:51.944120 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:51.944226 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:51.944226 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:51.944226 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:51.944745 master-0 kubenswrapper[8606]: I1204 22:15:51.944217 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:52.945123 master-0 kubenswrapper[8606]: I1204 22:15:52.945039 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:52.945123 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:52.945123 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:52.945123 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:52.945753 master-0 kubenswrapper[8606]: I1204 22:15:52.945129 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" 
podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:53.714394 master-0 kubenswrapper[8606]: I1204 22:15:53.714309 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:15:53.944288 master-0 kubenswrapper[8606]: I1204 22:15:53.944198 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:53.944288 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:53.944288 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:53.944288 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:53.944288 master-0 kubenswrapper[8606]: I1204 22:15:53.944274 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:54.391863 master-0 kubenswrapper[8606]: I1204 22:15:54.391760 8606 scope.go:117] "RemoveContainer" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780" Dec 04 22:15:54.392861 master-0 kubenswrapper[8606]: E1204 22:15:54.392179 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" Dec 04 22:15:54.944680 master-0 kubenswrapper[8606]: I1204 22:15:54.944524 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:54.944680 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:54.944680 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:54.944680 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:54.945175 master-0 kubenswrapper[8606]: I1204 22:15:54.944684 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:55.944395 master-0 kubenswrapper[8606]: I1204 22:15:55.944323 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:55.944395 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:55.944395 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:55.944395 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:55.944841 master-0 kubenswrapper[8606]: I1204 22:15:55.944428 8606 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:56.945425 master-0 kubenswrapper[8606]: I1204 22:15:56.945348 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:56.945425 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:56.945425 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:56.945425 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:56.946057 master-0 kubenswrapper[8606]: I1204 22:15:56.945443 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:57.944708 master-0 kubenswrapper[8606]: I1204 22:15:57.944631 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:57.944708 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:57.944708 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:57.944708 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:57.945051 master-0 kubenswrapper[8606]: I1204 22:15:57.944728 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:58.944690 master-0 kubenswrapper[8606]: I1204 22:15:58.944578 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:58.944690 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:58.944690 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:58.944690 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:58.944690 master-0 kubenswrapper[8606]: I1204 22:15:58.944684 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:15:59.947449 master-0 kubenswrapper[8606]: I1204 22:15:59.947371 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:15:59.947449 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:15:59.947449 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:15:59.947449 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:15:59.948159 master-0 kubenswrapper[8606]: I1204 22:15:59.947482 8606 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:16:00.945002 master-0 kubenswrapper[8606]: I1204 22:16:00.944865 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:16:00.945002 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:16:00.945002 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:16:00.945002 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:16:00.945433 master-0 kubenswrapper[8606]: I1204 22:16:00.945026 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:16:00.945433 master-0 kubenswrapper[8606]: I1204 22:16:00.945129 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:16:00.946620 master-0 kubenswrapper[8606]: I1204 22:16:00.946549 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3"} pod="openshift-ingress/router-default-5465c8b4db-8vm66" containerMessage="Container router failed startup probe, will be restarted" Dec 04 22:16:00.946816 master-0 kubenswrapper[8606]: I1204 22:16:00.946641 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" containerID="cri-o://1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3" gracePeriod=3600 Dec 04 22:16:02.971532 master-0 kubenswrapper[8606]: I1204 22:16:02.970995 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 04 22:16:02.971532 master-0 kubenswrapper[8606]: E1204 22:16:02.971335 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e011b0a-89e2-47e3-9112-d46a828416b1" containerName="installer" Dec 04 22:16:02.971532 master-0 kubenswrapper[8606]: I1204 22:16:02.971373 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e011b0a-89e2-47e3-9112-d46a828416b1" containerName="installer" Dec 04 22:16:02.973587 master-0 kubenswrapper[8606]: I1204 22:16:02.973264 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e011b0a-89e2-47e3-9112-d46a828416b1" containerName="installer" Dec 04 22:16:02.975217 master-0 kubenswrapper[8606]: I1204 22:16:02.974055 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:02.977012 master-0 kubenswrapper[8606]: I1204 22:16:02.976429 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-t57bp" Dec 04 22:16:02.977012 master-0 kubenswrapper[8606]: I1204 22:16:02.976615 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 04 22:16:02.980219 master-0 kubenswrapper[8606]: I1204 22:16:02.980174 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 04 22:16:03.162511 master-0 kubenswrapper[8606]: I1204 22:16:03.162414 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.162800 master-0 kubenswrapper[8606]: I1204 22:16:03.162613 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.162800 master-0 kubenswrapper[8606]: I1204 22:16:03.162667 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.263785 master-0 kubenswrapper[8606]: I1204 22:16:03.263625 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.264039 master-0 kubenswrapper[8606]: I1204 22:16:03.263838 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.264039 master-0 kubenswrapper[8606]: I1204 22:16:03.263939 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.264436 master-0 kubenswrapper[8606]: I1204 22:16:03.264389 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " 
pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.264538 master-0 kubenswrapper[8606]: I1204 22:16:03.264472 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.284784 master-0 kubenswrapper[8606]: I1204 22:16:03.284727 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.296565 master-0 kubenswrapper[8606]: I1204 22:16:03.296490 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:03.751989 master-0 kubenswrapper[8606]: I1204 22:16:03.751927 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 04 22:16:04.120199 master-0 kubenswrapper[8606]: I1204 22:16:04.120129 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7wjzf"] Dec 04 22:16:04.122934 master-0 kubenswrapper[8606]: I1204 22:16:04.122886 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:04.140596 master-0 kubenswrapper[8606]: I1204 22:16:04.138878 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-md4z6"] Dec 04 22:16:04.153844 master-0 kubenswrapper[8606]: I1204 22:16:04.143587 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:04.153844 master-0 kubenswrapper[8606]: I1204 22:16:04.152539 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xdxp5"] Dec 04 22:16:04.159711 master-0 kubenswrapper[8606]: I1204 22:16:04.159655 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdxp5" Dec 04 22:16:04.163323 master-0 kubenswrapper[8606]: I1204 22:16:04.163116 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"] Dec 04 22:16:04.169424 master-0 kubenswrapper[8606]: I1204 22:16:04.169368 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" Dec 04 22:16:04.171259 master-0 kubenswrapper[8606]: I1204 22:16:04.171182 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 22:16:04.172131 master-0 kubenswrapper[8606]: I1204 22:16:04.171623 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-njflt" Dec 04 22:16:04.177303 master-0 kubenswrapper[8606]: I1204 22:16:04.174747 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s7vv6"] Dec 04 22:16:04.177303 master-0 kubenswrapper[8606]: I1204 22:16:04.176967 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:04.180330 master-0 kubenswrapper[8606]: I1204 22:16:04.180243 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-md4z6"] Dec 04 22:16:04.184477 master-0 kubenswrapper[8606]: I1204 22:16:04.184086 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wjzf"] Dec 04 22:16:04.190400 master-0 kubenswrapper[8606]: I1204 22:16:04.190310 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"] Dec 04 22:16:04.203207 master-0 kubenswrapper[8606]: I1204 22:16:04.202258 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s7vv6"] Dec 04 22:16:04.203207 master-0 kubenswrapper[8606]: I1204 22:16:04.203137 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdxp5"] Dec 04 22:16:04.291303 master-0 kubenswrapper[8606]: I1204 22:16:04.291231 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-catalog-content\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:04.291717 master-0 kubenswrapper[8606]: I1204 22:16:04.291680 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-catalog-content\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:04.291855 master-0 kubenswrapper[8606]: I1204 22:16:04.291836 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-config-volume\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" Dec 04 22:16:04.291933 master-0 kubenswrapper[8606]: I1204 22:16:04.291920 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-catalog-content\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " 
pod="openshift-marketplace/redhat-marketplace-xdxp5" Dec 04 22:16:04.292042 master-0 kubenswrapper[8606]: I1204 22:16:04.292027 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxr7p\" (UniqueName: \"kubernetes.io/projected/f25df7a7-c30a-4580-a572-122277c2ff7c-kube-api-access-zxr7p\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:04.292150 master-0 kubenswrapper[8606]: I1204 22:16:04.292134 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-utilities\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:04.292231 master-0 kubenswrapper[8606]: I1204 22:16:04.292218 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbgvr\" (UniqueName: \"kubernetes.io/projected/8173c976-7b52-4172-8b80-e174d957d705-kube-api-access-zbgvr\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:04.292343 master-0 kubenswrapper[8606]: I1204 22:16:04.292322 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-utilities\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:04.292490 master-0 kubenswrapper[8606]: I1204 22:16:04.292473 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-catalog-content\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:04.292610 master-0 kubenswrapper[8606]: I1204 22:16:04.292595 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvcgz\" (UniqueName: \"kubernetes.io/projected/eecc4b0d-529b-4c1f-a4ba-80057c765b80-kube-api-access-wvcgz\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:04.292819 master-0 kubenswrapper[8606]: I1204 22:16:04.292771 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w5fd\" (UniqueName: \"kubernetes.io/projected/15d1a112-f449-4ad5-9c27-e9428491fec1-kube-api-access-4w5fd\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " pod="openshift-marketplace/redhat-marketplace-xdxp5" Dec 04 22:16:04.292895 master-0 kubenswrapper[8606]: I1204 22:16:04.292867 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmbx2\" (UniqueName: \"kubernetes.io/projected/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-kube-api-access-dmbx2\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" Dec 04 22:16:04.292931 
master-0 kubenswrapper[8606]: I1204 22:16:04.292910 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-utilities\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:04.292972 master-0 kubenswrapper[8606]: I1204 22:16:04.292957 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-secret-volume\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"
Dec 04 22:16:04.293182 master-0 kubenswrapper[8606]: I1204 22:16:04.293155 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-utilities\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6"
Dec 04 22:16:04.364992 master-0 kubenswrapper[8606]: I1204 22:16:04.364917 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"b4a17a75-3abb-4d82-911a-b5923ba1cef1","Type":"ContainerStarted","Data":"3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85"}
Dec 04 22:16:04.364992 master-0 kubenswrapper[8606]: I1204 22:16:04.364981 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"b4a17a75-3abb-4d82-911a-b5923ba1cef1","Type":"ContainerStarted","Data":"5200ff32baf614f84811e473842fee6f8db2a3ed9b519e8bc8ac1e3cf04b6b65"}
Dec 04 22:16:04.393166 master-0 kubenswrapper[8606]: I1204 22:16:04.392065 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" podStartSLOduration=2.392036766 podStartE2EDuration="2.392036766s" podCreationTimestamp="2025-12-04 22:16:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:16:04.391322125 +0000 UTC m=+929.201624370" watchObservedRunningTime="2025-12-04 22:16:04.392036766 +0000 UTC m=+929.202338981"
Dec 04 22:16:04.395255 master-0 kubenswrapper[8606]: I1204 22:16:04.395199 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxr7p\" (UniqueName: \"kubernetes.io/projected/f25df7a7-c30a-4580-a572-122277c2ff7c-kube-api-access-zxr7p\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6"
Dec 04 22:16:04.395368 master-0 kubenswrapper[8606]: I1204 22:16:04.395259 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-utilities\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6"
Dec 04 22:16:04.395368 master-0 kubenswrapper[8606]: I1204 22:16:04.395290 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbgvr\" (UniqueName: \"kubernetes.io/projected/8173c976-7b52-4172-8b80-e174d957d705-kube-api-access-zbgvr\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf"
Dec 04 22:16:04.395368 master-0 kubenswrapper[8606]: I1204 22:16:04.395333 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-utilities\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395377 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-catalog-content\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395398 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvcgz\" (UniqueName: \"kubernetes.io/projected/eecc4b0d-529b-4c1f-a4ba-80057c765b80-kube-api-access-wvcgz\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395427 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w5fd\" (UniqueName: \"kubernetes.io/projected/15d1a112-f449-4ad5-9c27-e9428491fec1-kube-api-access-4w5fd\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395450 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbx2\" (UniqueName: \"kubernetes.io/projected/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-kube-api-access-dmbx2\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395469 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-utilities\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395489 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-secret-volume\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395603 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-utilities\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395648 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-catalog-content\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395683 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-catalog-content\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6"
Dec 04 22:16:04.395721 master-0 kubenswrapper[8606]: I1204 22:16:04.395716 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-config-volume\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"
Dec 04 22:16:04.397380 master-0 kubenswrapper[8606]: I1204 22:16:04.395742 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-catalog-content\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:04.397380 master-0 kubenswrapper[8606]: I1204 22:16:04.396286 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-catalog-content\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:04.397380 master-0 kubenswrapper[8606]: I1204 22:16:04.396640 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-utilities\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6"
Dec 04 22:16:04.397380 master-0 kubenswrapper[8606]: I1204 22:16:04.396752 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-utilities\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:04.397380 master-0 kubenswrapper[8606]: I1204 22:16:04.396868 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-catalog-content\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6"
Dec 04 22:16:04.397380 master-0 kubenswrapper[8606]: I1204 22:16:04.397050 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-utilities\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf"
Dec 04 22:16:04.397638 master-0 kubenswrapper[8606]: I1204 22:16:04.397615 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-config-volume\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"
Dec 04 22:16:04.397686 master-0 kubenswrapper[8606]: I1204 22:16:04.397614 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-catalog-content\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf"
Dec 04 22:16:04.397686 master-0 kubenswrapper[8606]: I1204 22:16:04.397644 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-catalog-content\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6"
Dec 04 22:16:04.400210 master-0 kubenswrapper[8606]: I1204 22:16:04.399977 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-utilities\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6"
Dec 04 22:16:04.400646 master-0 kubenswrapper[8606]: I1204 22:16:04.400609 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-secret-volume\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"
Dec 04 22:16:04.415914 master-0 kubenswrapper[8606]: I1204 22:16:04.415853 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxr7p\" (UniqueName: \"kubernetes.io/projected/f25df7a7-c30a-4580-a572-122277c2ff7c-kube-api-access-zxr7p\") pod \"community-operators-md4z6\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " pod="openshift-marketplace/community-operators-md4z6"
Dec 04 22:16:04.416642 master-0 kubenswrapper[8606]: I1204 22:16:04.416606 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w5fd\" (UniqueName: \"kubernetes.io/projected/15d1a112-f449-4ad5-9c27-e9428491fec1-kube-api-access-4w5fd\") pod \"redhat-marketplace-xdxp5\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:04.422686 master-0 kubenswrapper[8606]: I1204 22:16:04.419856 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvcgz\" (UniqueName: \"kubernetes.io/projected/eecc4b0d-529b-4c1f-a4ba-80057c765b80-kube-api-access-wvcgz\") pod \"redhat-operators-s7vv6\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " pod="openshift-marketplace/redhat-operators-s7vv6"
Dec 04 22:16:04.423002 master-0 kubenswrapper[8606]: I1204 22:16:04.422970 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmbx2\" (UniqueName:
\"kubernetes.io/projected/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-kube-api-access-dmbx2\") pod \"collect-profiles-29414775-47tzr\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" Dec 04 22:16:04.425366 master-0 kubenswrapper[8606]: I1204 22:16:04.425315 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbgvr\" (UniqueName: \"kubernetes.io/projected/8173c976-7b52-4172-8b80-e174d957d705-kube-api-access-zbgvr\") pod \"certified-operators-7wjzf\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:04.473207 master-0 kubenswrapper[8606]: I1204 22:16:04.473115 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:04.494423 master-0 kubenswrapper[8606]: I1204 22:16:04.494347 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:04.507319 master-0 kubenswrapper[8606]: I1204 22:16:04.507243 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdxp5" Dec 04 22:16:04.526771 master-0 kubenswrapper[8606]: I1204 22:16:04.526717 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" Dec 04 22:16:04.540065 master-0 kubenswrapper[8606]: I1204 22:16:04.539992 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:04.964251 master-0 kubenswrapper[8606]: W1204 22:16:04.964089 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8173c976_7b52_4172_8b80_e174d957d705.slice/crio-7a4328c266fbb2fea6968364fb7c7564fb04321ac1d55240bd90df6d43f10362 WatchSource:0}: Error finding container 7a4328c266fbb2fea6968364fb7c7564fb04321ac1d55240bd90df6d43f10362: Status 404 returned error can't find the container with id 7a4328c266fbb2fea6968364fb7c7564fb04321ac1d55240bd90df6d43f10362 Dec 04 22:16:05.003643 master-0 kubenswrapper[8606]: I1204 22:16:04.992184 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wjzf"] Dec 04 22:16:05.039671 master-0 kubenswrapper[8606]: I1204 22:16:05.039599 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-md4z6"] Dec 04 22:16:05.049237 master-0 kubenswrapper[8606]: W1204 22:16:05.049122 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf25df7a7_c30a_4580_a572_122277c2ff7c.slice/crio-d59c94fe0debd0dd34f0289fe87a5367ddc4cb590f2ab430c4c05648c3e9787c WatchSource:0}: Error finding container d59c94fe0debd0dd34f0289fe87a5367ddc4cb590f2ab430c4c05648c3e9787c: Status 404 returned error can't find the container with id d59c94fe0debd0dd34f0289fe87a5367ddc4cb590f2ab430c4c05648c3e9787c Dec 04 22:16:05.117665 master-0 kubenswrapper[8606]: I1204 22:16:05.117531 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdxp5"] Dec 04 22:16:05.125840 master-0 kubenswrapper[8606]: W1204 22:16:05.125453 8606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d1a112_f449_4ad5_9c27_e9428491fec1.slice/crio-3c5f5f9b120b28d25c13a5d9606587f36c8f6d3abfee3c6299c5e26d8e70fbfd WatchSource:0}: Error finding container 3c5f5f9b120b28d25c13a5d9606587f36c8f6d3abfee3c6299c5e26d8e70fbfd: Status 404 returned error can't find the container with id 3c5f5f9b120b28d25c13a5d9606587f36c8f6d3abfee3c6299c5e26d8e70fbfd Dec 04 22:16:05.157932 master-0 kubenswrapper[8606]: I1204 22:16:05.157657 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"] Dec 04 22:16:05.159951 master-0 kubenswrapper[8606]: I1204 22:16:05.159890 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s7vv6"] Dec 04 22:16:05.237448 master-0 kubenswrapper[8606]: W1204 22:16:05.237186 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeecc4b0d_529b_4c1f_a4ba_80057c765b80.slice/crio-0b0c9fd3c02f89fd9af3a5e305e018905e6be1f956cfb0045233a7cfc1ed2fca WatchSource:0}: Error finding container 0b0c9fd3c02f89fd9af3a5e305e018905e6be1f956cfb0045233a7cfc1ed2fca: Status 404 returned error can't find the container with id 0b0c9fd3c02f89fd9af3a5e305e018905e6be1f956cfb0045233a7cfc1ed2fca Dec 04 22:16:05.374155 master-0 kubenswrapper[8606]: I1204 22:16:05.374083 8606 generic.go:334] "Generic (PLEG): container finished" podID="8173c976-7b52-4172-8b80-e174d957d705" containerID="8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f" exitCode=0 Dec 04 22:16:05.374568 master-0 kubenswrapper[8606]: I1204 22:16:05.374174 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wjzf" event={"ID":"8173c976-7b52-4172-8b80-e174d957d705","Type":"ContainerDied","Data":"8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f"} Dec 04 22:16:05.374568 master-0 kubenswrapper[8606]: I1204 22:16:05.374213 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wjzf" event={"ID":"8173c976-7b52-4172-8b80-e174d957d705","Type":"ContainerStarted","Data":"7a4328c266fbb2fea6968364fb7c7564fb04321ac1d55240bd90df6d43f10362"} Dec 04 22:16:05.376529 master-0 kubenswrapper[8606]: I1204 22:16:05.376479 8606 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 22:16:05.377184 master-0 kubenswrapper[8606]: I1204 22:16:05.377131 8606 generic.go:334] "Generic (PLEG): container finished" podID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerID="5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998" exitCode=0 Dec 04 22:16:05.377366 master-0 kubenswrapper[8606]: I1204 22:16:05.377258 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-md4z6" event={"ID":"f25df7a7-c30a-4580-a572-122277c2ff7c","Type":"ContainerDied","Data":"5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998"} Dec 04 22:16:05.377366 master-0 kubenswrapper[8606]: I1204 22:16:05.377299 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-md4z6" event={"ID":"f25df7a7-c30a-4580-a572-122277c2ff7c","Type":"ContainerStarted","Data":"d59c94fe0debd0dd34f0289fe87a5367ddc4cb590f2ab430c4c05648c3e9787c"} Dec 04 22:16:05.379429 master-0 kubenswrapper[8606]: I1204 22:16:05.379378 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" event={"ID":"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f","Type":"ContainerStarted","Data":"69107f4cb3ef5b3b0251fc55638d465f044f2dd2f76a36beea2e418eac9fab2d"} Dec 04 22:16:05.381866 master-0 kubenswrapper[8606]: I1204 22:16:05.381816 8606 generic.go:334] "Generic (PLEG): container finished" podID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerID="e1d8730a080e7d5774f84fbb41c8785ca93cb335811d01d830a7f679e41bd66e" exitCode=0 Dec 04 22:16:05.381987 master-0 kubenswrapper[8606]: I1204 22:16:05.381902 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdxp5" event={"ID":"15d1a112-f449-4ad5-9c27-e9428491fec1","Type":"ContainerDied","Data":"e1d8730a080e7d5774f84fbb41c8785ca93cb335811d01d830a7f679e41bd66e"} Dec 04 22:16:05.382034 master-0 kubenswrapper[8606]: I1204 22:16:05.382015 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdxp5" event={"ID":"15d1a112-f449-4ad5-9c27-e9428491fec1","Type":"ContainerStarted","Data":"3c5f5f9b120b28d25c13a5d9606587f36c8f6d3abfee3c6299c5e26d8e70fbfd"} Dec 04 22:16:05.383712 master-0 kubenswrapper[8606]: I1204 22:16:05.383672 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7vv6" event={"ID":"eecc4b0d-529b-4c1f-a4ba-80057c765b80","Type":"ContainerStarted","Data":"0b0c9fd3c02f89fd9af3a5e305e018905e6be1f956cfb0045233a7cfc1ed2fca"} Dec 04 22:16:05.710302 master-0 kubenswrapper[8606]: I1204 22:16:05.710184 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 04 22:16:06.397335 master-0 kubenswrapper[8606]: I1204 22:16:06.396130 8606 generic.go:334] "Generic (PLEG): container finished" podID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerID="442a245ed4230223a6837f3a30f95dde82b173406f0654bf47dd554da457b254" exitCode=0 Dec 04 22:16:06.397335 master-0 kubenswrapper[8606]: I1204 22:16:06.397338 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdxp5" event={"ID":"15d1a112-f449-4ad5-9c27-e9428491fec1","Type":"ContainerDied","Data":"442a245ed4230223a6837f3a30f95dde82b173406f0654bf47dd554da457b254"} Dec 04 22:16:06.401762 master-0 kubenswrapper[8606]: I1204 22:16:06.401704 8606 generic.go:334] "Generic (PLEG): container finished" podID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerID="c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47" exitCode=0 Dec 04 22:16:06.401845 master-0 kubenswrapper[8606]: I1204 22:16:06.401803 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7vv6" event={"ID":"eecc4b0d-529b-4c1f-a4ba-80057c765b80","Type":"ContainerDied","Data":"c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47"} Dec 04 22:16:06.413298 master-0 kubenswrapper[8606]: I1204 22:16:06.413232 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wjzf" event={"ID":"8173c976-7b52-4172-8b80-e174d957d705","Type":"ContainerStarted","Data":"f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df"} Dec 04 22:16:06.419824 master-0 kubenswrapper[8606]: I1204 22:16:06.419783 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-md4z6" 
event={"ID":"f25df7a7-c30a-4580-a572-122277c2ff7c","Type":"ContainerStarted","Data":"53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810"} Dec 04 22:16:06.422265 master-0 kubenswrapper[8606]: I1204 22:16:06.422199 8606 generic.go:334] "Generic (PLEG): container finished" podID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" containerID="00713a1c06d69e4187d092bf84b0d17670a9eda7c3ce1307b7efa35d4e53871c" exitCode=0 Dec 04 22:16:06.422456 master-0 kubenswrapper[8606]: I1204 22:16:06.422407 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" event={"ID":"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f","Type":"ContainerDied","Data":"00713a1c06d69e4187d092bf84b0d17670a9eda7c3ce1307b7efa35d4e53871c"} Dec 04 22:16:06.422601 master-0 kubenswrapper[8606]: I1204 22:16:06.422565 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" podUID="b4a17a75-3abb-4d82-911a-b5923ba1cef1" containerName="installer" containerID="cri-o://3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85" gracePeriod=30 Dec 04 22:16:07.441236 master-0 kubenswrapper[8606]: I1204 22:16:07.435262 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdxp5" event={"ID":"15d1a112-f449-4ad5-9c27-e9428491fec1","Type":"ContainerStarted","Data":"f4721e02aa4946a4882889e7b565089e4bf32778f77ac4c0a12e05dd06675241"} Dec 04 22:16:07.441997 master-0 kubenswrapper[8606]: I1204 22:16:07.441532 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7vv6" event={"ID":"eecc4b0d-529b-4c1f-a4ba-80057c765b80","Type":"ContainerStarted","Data":"5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320"} Dec 04 22:16:07.454064 master-0 kubenswrapper[8606]: I1204 22:16:07.454010 8606 generic.go:334] "Generic (PLEG): container finished" podID="8173c976-7b52-4172-8b80-e174d957d705" containerID="f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df" exitCode=0 Dec 04 22:16:07.454236 master-0 kubenswrapper[8606]: I1204 22:16:07.454091 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wjzf" event={"ID":"8173c976-7b52-4172-8b80-e174d957d705","Type":"ContainerDied","Data":"f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df"} Dec 04 22:16:07.465552 master-0 kubenswrapper[8606]: I1204 22:16:07.465051 8606 generic.go:334] "Generic (PLEG): container finished" podID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerID="53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810" exitCode=0 Dec 04 22:16:07.465552 master-0 kubenswrapper[8606]: I1204 22:16:07.465122 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-md4z6" event={"ID":"f25df7a7-c30a-4580-a572-122277c2ff7c","Type":"ContainerDied","Data":"53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810"} Dec 04 22:16:07.476056 master-0 kubenswrapper[8606]: I1204 22:16:07.475950 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xdxp5" podStartSLOduration=58.988106623 podStartE2EDuration="1m0.475921746s" podCreationTimestamp="2025-12-04 22:15:07 +0000 UTC" firstStartedPulling="2025-12-04 22:16:05.383245948 +0000 UTC m=+930.193548163" lastFinishedPulling="2025-12-04 22:16:06.871061041 +0000 UTC m=+931.681363286" observedRunningTime="2025-12-04 
22:16:07.472406978 +0000 UTC m=+932.282709243" watchObservedRunningTime="2025-12-04 22:16:07.475921746 +0000 UTC m=+932.286223961"
Dec 04 22:16:07.924210 master-0 kubenswrapper[8606]: I1204 22:16:07.924120 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Dec 04 22:16:07.926963 master-0 kubenswrapper[8606]: I1204 22:16:07.926907 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:07.931713 master-0 kubenswrapper[8606]: I1204 22:16:07.931538 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Dec 04 22:16:07.952002 master-0 kubenswrapper[8606]: I1204 22:16:07.951927 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"
Dec 04 22:16:07.982985 master-0 kubenswrapper[8606]: I1204 22:16:07.982818 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmbx2\" (UniqueName: \"kubernetes.io/projected/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-kube-api-access-dmbx2\") pod \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") "
Dec 04 22:16:07.982985 master-0 kubenswrapper[8606]: I1204 22:16:07.982982 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-secret-volume\") pod \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") "
Dec 04 22:16:07.983137 master-0 kubenswrapper[8606]: I1204 22:16:07.983039 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-config-volume\") pod \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\" (UID: \"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f\") "
Dec 04 22:16:07.983468 master-0 kubenswrapper[8606]: I1204 22:16:07.983340 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-var-lock\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:07.983468 master-0 kubenswrapper[8606]: I1204 22:16:07.983404 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:07.983468 master-0 kubenswrapper[8606]: I1204 22:16:07.983442 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3099f7b5-f904-4d15-aedb-f4e558b813e4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:07.985649 master-0 kubenswrapper[8606]: I1204 22:16:07.983978 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-config-volume" (OuterVolumeSpecName: "config-volume") pod "85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" (UID: "85a7edee-7a4c-4f4f-b537-d1ce3a9f812f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 22:16:07.986704 master-0 kubenswrapper[8606]: I1204 22:16:07.986660 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" (UID: "85a7edee-7a4c-4f4f-b537-d1ce3a9f812f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 22:16:07.988155 master-0 kubenswrapper[8606]: I1204 22:16:07.988118 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-kube-api-access-dmbx2" (OuterVolumeSpecName: "kube-api-access-dmbx2") pod "85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" (UID: "85a7edee-7a4c-4f4f-b537-d1ce3a9f812f"). InnerVolumeSpecName "kube-api-access-dmbx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 22:16:08.085954 master-0 kubenswrapper[8606]: I1204 22:16:08.085855 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-var-lock\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:08.086192 master-0 kubenswrapper[8606]: I1204 22:16:08.085983 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:08.086192 master-0 kubenswrapper[8606]: I1204 22:16:08.086031 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3099f7b5-f904-4d15-aedb-f4e558b813e4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:08.086192 master-0 kubenswrapper[8606]: I1204 22:16:08.086153 8606 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-secret-volume\") on node \"master-0\" DevicePath \"\""
Dec 04 22:16:08.086192 master-0 kubenswrapper[8606]: I1204 22:16:08.086169 8606 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-config-volume\") on node \"master-0\" DevicePath \"\""
Dec 04 22:16:08.086192 master-0 kubenswrapper[8606]: I1204 22:16:08.086189 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmbx2\" (UniqueName: \"kubernetes.io/projected/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f-kube-api-access-dmbx2\") on node \"master-0\" DevicePath \"\""
Dec 04 22:16:08.086895 master-0 kubenswrapper[8606]: I1204 22:16:08.086824 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:08.087031 master-0 kubenswrapper[8606]: I1204 22:16:08.086947 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-var-lock\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:08.119568 master-0 kubenswrapper[8606]: I1204 22:16:08.119465 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3099f7b5-f904-4d15-aedb-f4e558b813e4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:08.343712 master-0 kubenswrapper[8606]: I1204 22:16:08.343539 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Dec 04 22:16:08.393827 master-0 kubenswrapper[8606]: I1204 22:16:08.391478 8606 scope.go:117] "RemoveContainer" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780"
Dec 04 22:16:08.393827 master-0 kubenswrapper[8606]: E1204 22:16:08.391715 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-w7hnc_openshift-cluster-storage-operator(4f22eee4-a42d-4d2b-bffa-6c3f29f1f026)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" podUID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026"
Dec 04 22:16:08.474801 master-0 kubenswrapper[8606]: I1204 22:16:08.474409 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5"
Dec 04 22:16:08.505136 master-0 kubenswrapper[8606]: I1204 22:16:08.505006 8606 generic.go:334] "Generic (PLEG): container finished" podID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerID="5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320" exitCode=0
Dec 04 22:16:08.505136 master-0 kubenswrapper[8606]: I1204 22:16:08.505121 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7vv6" event={"ID":"eecc4b0d-529b-4c1f-a4ba-80057c765b80","Type":"ContainerDied","Data":"5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320"}
Dec 04 22:16:08.511460 master-0 kubenswrapper[8606]: I1204 22:16:08.509881 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wjzf" event={"ID":"8173c976-7b52-4172-8b80-e174d957d705","Type":"ContainerStarted","Data":"87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844"}
Dec 04 22:16:08.519591 master-0 kubenswrapper[8606]: I1204 22:16:08.519525 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-md4z6" event={"ID":"f25df7a7-c30a-4580-a572-122277c2ff7c","Type":"ContainerStarted","Data":"b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4"}
Dec 04 22:16:08.530985 master-0 kubenswrapper[8606]: I1204 22:16:08.530922 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" event={"ID":"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f","Type":"ContainerDied","Data":"69107f4cb3ef5b3b0251fc55638d465f044f2dd2f76a36beea2e418eac9fab2d"}
Dec 04 22:16:08.531239 master-0 kubenswrapper[8606]: I1204 22:16:08.531224 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69107f4cb3ef5b3b0251fc55638d465f044f2dd2f76a36beea2e418eac9fab2d"
Dec 04 22:16:08.531310 master-0 kubenswrapper[8606]: I1204 22:16:08.530950 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"
Dec 04 22:16:08.559447 master-0 kubenswrapper[8606]: I1204 22:16:08.559380 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7wjzf" podStartSLOduration=47.940034761 podStartE2EDuration="50.559361637s" podCreationTimestamp="2025-12-04 22:15:18 +0000 UTC" firstStartedPulling="2025-12-04 22:16:05.376412418 +0000 UTC m=+930.186714653" lastFinishedPulling="2025-12-04 22:16:07.995739304 +0000 UTC m=+932.806041529" observedRunningTime="2025-12-04 22:16:08.557224717 +0000 UTC m=+933.367526942" watchObservedRunningTime="2025-12-04 22:16:08.559361637 +0000 UTC m=+933.369663852"
Dec 04 22:16:08.583281 master-0 kubenswrapper[8606]: I1204 22:16:08.583159 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-md4z6" podStartSLOduration=59.049224872 podStartE2EDuration="1m1.583108821s" podCreationTimestamp="2025-12-04 22:15:07 +0000 UTC" firstStartedPulling="2025-12-04 22:16:05.37935835 +0000 UTC m=+930.189660585" lastFinishedPulling="2025-12-04 22:16:07.913242309 +0000 UTC m=+932.723544534" observedRunningTime="2025-12-04 22:16:08.577172345 +0000 UTC m=+933.387474570" watchObservedRunningTime="2025-12-04 22:16:08.583108821 +0000 UTC m=+933.393411046"
Dec 04 22:16:08.814426 master-0 kubenswrapper[8606]: I1204 22:16:08.814348 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Dec 04 22:16:09.543283 master-0 kubenswrapper[8606]: I1204 22:16:09.543213 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7vv6" event={"ID":"eecc4b0d-529b-4c1f-a4ba-80057c765b80","Type":"ContainerStarted","Data":"822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb"}
Dec 04 22:16:09.545848 master-0 kubenswrapper[8606]: I1204 22:16:09.545770 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"3099f7b5-f904-4d15-aedb-f4e558b813e4","Type":"ContainerStarted","Data":"f9af7ae05881c66c990776ea5e9ecae6917372ad2e83deed7c505b583fa9da46"}
Dec 04 22:16:09.545920 master-0 kubenswrapper[8606]: I1204 22:16:09.545854 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"3099f7b5-f904-4d15-aedb-f4e558b813e4","Type":"ContainerStarted","Data":"bc39063be03324c773b296bb527536f85c03a71f7444ce95b585b37a77beb76b"}
Dec 04 22:16:09.568492 master-0 kubenswrapper[8606]: I1204 22:16:09.568410 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s7vv6" podStartSLOduration=48.824772168 podStartE2EDuration="51.568385547s" podCreationTimestamp="2025-12-04 22:15:18 +0000 UTC" firstStartedPulling="2025-12-04 22:16:06.411682113 +0000 UTC m=+931.221984338" lastFinishedPulling="2025-12-04 22:16:09.155295512 +0000 UTC m=+933.965597717" observedRunningTime="2025-12-04 22:16:09.56417288 +0000 UTC m=+934.374475115" watchObservedRunningTime="2025-12-04 22:16:09.568385547 +0000 UTC m=+934.378687772"
Dec 04 22:16:09.586578 master-0 kubenswrapper[8606]: I1204 22:16:09.586470 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=2.586448153 podStartE2EDuration="2.586448153s" podCreationTimestamp="2025-12-04 22:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:16:09.580570048 +0000 UTC m=+934.390872273" watchObservedRunningTime="2025-12-04 22:16:09.586448153 +0000 UTC m=+934.396750358" Dec 04 22:16:14.392720 master-0 kubenswrapper[8606]: I1204 22:16:14.392555 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:16:14.392720 master-0 kubenswrapper[8606]: I1204 22:16:14.392626 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498" Dec 04 22:16:14.420588 master-0 kubenswrapper[8606]: I1204 22:16:14.420487 8606 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Dec 04 22:16:14.424005 master-0 kubenswrapper[8606]: I1204 22:16:14.423932 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 04 22:16:14.432386 master-0 kubenswrapper[8606]: I1204 22:16:14.432321 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 04 22:16:14.453986 master-0 kubenswrapper[8606]: I1204 22:16:14.453917 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 04 22:16:14.474154 master-0 kubenswrapper[8606]: I1204 22:16:14.473705 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:14.474154 master-0 kubenswrapper[8606]: I1204 22:16:14.473791 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:14.495398 master-0 kubenswrapper[8606]: I1204 22:16:14.495331 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:14.495398 master-0 kubenswrapper[8606]: I1204 22:16:14.495401 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:14.510754 master-0 kubenswrapper[8606]: I1204 22:16:14.510688 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xdxp5" Dec 04 22:16:14.510892 master-0 kubenswrapper[8606]: I1204 22:16:14.510781 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xdxp5" Dec 04 22:16:14.541791 master-0 kubenswrapper[8606]: I1204 22:16:14.541719 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:14.542007 master-0 kubenswrapper[8606]: I1204 22:16:14.541809 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:14.543676 master-0 kubenswrapper[8606]: I1204 22:16:14.543632 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:14.545270 master-0 kubenswrapper[8606]: I1204 22:16:14.545226 8606 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7wjzf"
Dec 04 22:16:14.593070 master-0 kubenswrapper[8606]: I1204 22:16:14.592954 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.592914487 podStartE2EDuration="592.914487ms" podCreationTimestamp="2025-12-04 22:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:16:14.580413458 +0000 UTC m=+939.390715703" watchObservedRunningTime="2025-12-04 22:16:14.592914487 +0000 UTC m=+939.403216722"
Dec 04 22:16:14.606670 master-0 kubenswrapper[8606]: I1204 22:16:14.606375 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498"
Dec 04 22:16:14.606670 master-0 kubenswrapper[8606]: I1204 22:16:14.606420 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b55fa1d7-c345-4373-856e-1a13acb56498"
Dec 04 22:16:14.636248 master-0 kubenswrapper[8606]: I1204 22:16:14.634856 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:14.697374 master-0 kubenswrapper[8606]: I1204 22:16:14.697221 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xdxp5"
Dec 04 22:16:14.706259 master-0 kubenswrapper[8606]: I1204 22:16:14.706212 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-md4z6"
Dec 04 22:16:14.708040 master-0 kubenswrapper[8606]: I1204 22:16:14.708013 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7wjzf"
Dec 04 22:16:15.588868 master-0 kubenswrapper[8606]: I1204 22:16:15.588608 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s7vv6" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="registry-server" probeResult="failure" output=<
Dec 04 22:16:15.588868 master-0 kubenswrapper[8606]: timeout: failed to connect service ":50051" within 1s
Dec 04 22:16:15.588868 master-0 kubenswrapper[8606]: >
Dec 04 22:16:15.935607 master-0 kubenswrapper[8606]: I1204 22:16:15.935367 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdxp5"]
Dec 04 22:16:16.630544 master-0 kubenswrapper[8606]: I1204 22:16:16.629231 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xdxp5" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerName="registry-server" containerID="cri-o://f4721e02aa4946a4882889e7b565089e4bf32778f77ac4c0a12e05dd06675241" gracePeriod=2
Dec 04 22:16:17.641961 master-0 kubenswrapper[8606]: I1204 22:16:17.641807 8606 generic.go:334] "Generic (PLEG): container finished" podID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerID="f4721e02aa4946a4882889e7b565089e4bf32778f77ac4c0a12e05dd06675241" exitCode=0
Dec 04 22:16:17.641961 master-0 kubenswrapper[8606]: I1204 22:16:17.641883 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdxp5" event={"ID":"15d1a112-f449-4ad5-9c27-e9428491fec1","Type":"ContainerDied","Data":"f4721e02aa4946a4882889e7b565089e4bf32778f77ac4c0a12e05dd06675241"}
Dec 04 22:16:17.803934
master-0 kubenswrapper[8606]: I1204 22:16:17.803864 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wjzf"] Dec 04 22:16:17.804236 master-0 kubenswrapper[8606]: I1204 22:16:17.804200 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7wjzf" podUID="8173c976-7b52-4172-8b80-e174d957d705" containerName="registry-server" containerID="cri-o://87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844" gracePeriod=2 Dec 04 22:16:18.203632 master-0 kubenswrapper[8606]: I1204 22:16:18.203398 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdxp5" Dec 04 22:16:18.273171 master-0 kubenswrapper[8606]: I1204 22:16:18.273018 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w5fd\" (UniqueName: \"kubernetes.io/projected/15d1a112-f449-4ad5-9c27-e9428491fec1-kube-api-access-4w5fd\") pod \"15d1a112-f449-4ad5-9c27-e9428491fec1\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " Dec 04 22:16:18.273171 master-0 kubenswrapper[8606]: I1204 22:16:18.273071 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-utilities\") pod \"15d1a112-f449-4ad5-9c27-e9428491fec1\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " Dec 04 22:16:18.273171 master-0 kubenswrapper[8606]: I1204 22:16:18.273097 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-catalog-content\") pod \"15d1a112-f449-4ad5-9c27-e9428491fec1\" (UID: \"15d1a112-f449-4ad5-9c27-e9428491fec1\") " Dec 04 22:16:18.275623 master-0 kubenswrapper[8606]: I1204 22:16:18.274369 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-utilities" (OuterVolumeSpecName: "utilities") pod "15d1a112-f449-4ad5-9c27-e9428491fec1" (UID: "15d1a112-f449-4ad5-9c27-e9428491fec1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:16:18.276673 master-0 kubenswrapper[8606]: I1204 22:16:18.276622 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d1a112-f449-4ad5-9c27-e9428491fec1-kube-api-access-4w5fd" (OuterVolumeSpecName: "kube-api-access-4w5fd") pod "15d1a112-f449-4ad5-9c27-e9428491fec1" (UID: "15d1a112-f449-4ad5-9c27-e9428491fec1"). InnerVolumeSpecName "kube-api-access-4w5fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:16:18.290181 master-0 kubenswrapper[8606]: I1204 22:16:18.290106 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15d1a112-f449-4ad5-9c27-e9428491fec1" (UID: "15d1a112-f449-4ad5-9c27-e9428491fec1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:16:18.329606 master-0 kubenswrapper[8606]: I1204 22:16:18.329568 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:18.375111 master-0 kubenswrapper[8606]: I1204 22:16:18.374998 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-catalog-content\") pod \"8173c976-7b52-4172-8b80-e174d957d705\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " Dec 04 22:16:18.375111 master-0 kubenswrapper[8606]: I1204 22:16:18.375089 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-utilities\") pod \"8173c976-7b52-4172-8b80-e174d957d705\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " Dec 04 22:16:18.375399 master-0 kubenswrapper[8606]: I1204 22:16:18.375337 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbgvr\" (UniqueName: \"kubernetes.io/projected/8173c976-7b52-4172-8b80-e174d957d705-kube-api-access-zbgvr\") pod \"8173c976-7b52-4172-8b80-e174d957d705\" (UID: \"8173c976-7b52-4172-8b80-e174d957d705\") " Dec 04 22:16:18.375776 master-0 kubenswrapper[8606]: I1204 22:16:18.375742 8606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:18.375776 master-0 kubenswrapper[8606]: I1204 22:16:18.375765 8606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15d1a112-f449-4ad5-9c27-e9428491fec1-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:18.375776 master-0 kubenswrapper[8606]: I1204 22:16:18.375776 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w5fd\" (UniqueName: \"kubernetes.io/projected/15d1a112-f449-4ad5-9c27-e9428491fec1-kube-api-access-4w5fd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:18.376094 master-0 kubenswrapper[8606]: I1204 22:16:18.376033 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-utilities" (OuterVolumeSpecName: "utilities") pod "8173c976-7b52-4172-8b80-e174d957d705" (UID: "8173c976-7b52-4172-8b80-e174d957d705"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:16:18.378306 master-0 kubenswrapper[8606]: I1204 22:16:18.378244 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8173c976-7b52-4172-8b80-e174d957d705-kube-api-access-zbgvr" (OuterVolumeSpecName: "kube-api-access-zbgvr") pod "8173c976-7b52-4172-8b80-e174d957d705" (UID: "8173c976-7b52-4172-8b80-e174d957d705"). InnerVolumeSpecName "kube-api-access-zbgvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:16:18.417927 master-0 kubenswrapper[8606]: I1204 22:16:18.417810 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8173c976-7b52-4172-8b80-e174d957d705" (UID: "8173c976-7b52-4172-8b80-e174d957d705"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:16:18.478318 master-0 kubenswrapper[8606]: I1204 22:16:18.478167 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbgvr\" (UniqueName: \"kubernetes.io/projected/8173c976-7b52-4172-8b80-e174d957d705-kube-api-access-zbgvr\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:18.478318 master-0 kubenswrapper[8606]: I1204 22:16:18.478223 8606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:18.478318 master-0 kubenswrapper[8606]: I1204 22:16:18.478239 8606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8173c976-7b52-4172-8b80-e174d957d705-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:18.657074 master-0 kubenswrapper[8606]: I1204 22:16:18.656970 8606 generic.go:334] "Generic (PLEG): container finished" podID="8173c976-7b52-4172-8b80-e174d957d705" containerID="87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844" exitCode=0 Dec 04 22:16:18.657074 master-0 kubenswrapper[8606]: I1204 22:16:18.657048 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wjzf" Dec 04 22:16:18.658433 master-0 kubenswrapper[8606]: I1204 22:16:18.658348 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wjzf" event={"ID":"8173c976-7b52-4172-8b80-e174d957d705","Type":"ContainerDied","Data":"87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844"} Dec 04 22:16:18.658967 master-0 kubenswrapper[8606]: I1204 22:16:18.658719 8606 scope.go:117] "RemoveContainer" containerID="87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844" Dec 04 22:16:18.659189 master-0 kubenswrapper[8606]: I1204 22:16:18.659150 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wjzf" event={"ID":"8173c976-7b52-4172-8b80-e174d957d705","Type":"ContainerDied","Data":"7a4328c266fbb2fea6968364fb7c7564fb04321ac1d55240bd90df6d43f10362"} Dec 04 22:16:18.662664 master-0 kubenswrapper[8606]: I1204 22:16:18.662298 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdxp5" event={"ID":"15d1a112-f449-4ad5-9c27-e9428491fec1","Type":"ContainerDied","Data":"3c5f5f9b120b28d25c13a5d9606587f36c8f6d3abfee3c6299c5e26d8e70fbfd"} Dec 04 22:16:18.662664 master-0 kubenswrapper[8606]: I1204 22:16:18.662413 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdxp5" Dec 04 22:16:18.694111 master-0 kubenswrapper[8606]: I1204 22:16:18.694065 8606 scope.go:117] "RemoveContainer" containerID="f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df" Dec 04 22:16:18.724520 master-0 kubenswrapper[8606]: I1204 22:16:18.724462 8606 scope.go:117] "RemoveContainer" containerID="8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f" Dec 04 22:16:18.751167 master-0 kubenswrapper[8606]: I1204 22:16:18.751121 8606 scope.go:117] "RemoveContainer" containerID="87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844" Dec 04 22:16:18.751894 master-0 kubenswrapper[8606]: E1204 22:16:18.751838 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844\": container with ID starting with 87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844 not found: ID does not exist" containerID="87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844" Dec 04 22:16:18.751981 master-0 kubenswrapper[8606]: I1204 22:16:18.751898 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844"} err="failed to get container status \"87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844\": rpc error: code = NotFound desc = could not find container \"87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844\": container with ID starting with 87285a9b28e25516ec22412a570755d1a762e3190386b88ad39a0f613b549844 not found: ID does not exist" Dec 04 22:16:18.751981 master-0 kubenswrapper[8606]: I1204 22:16:18.751939 8606 scope.go:117] "RemoveContainer" containerID="f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df" Dec 04 22:16:18.752385 master-0 kubenswrapper[8606]: E1204 22:16:18.752342 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df\": container with ID starting with f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df not found: ID does not exist" containerID="f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df" Dec 04 22:16:18.752458 master-0 kubenswrapper[8606]: I1204 22:16:18.752393 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df"} err="failed to get container status \"f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df\": rpc error: code = NotFound desc = could not find container \"f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df\": container with ID starting with f3b9a93d457323f1902bd1f5686977d0be3a5771495f829055f8af88dc6fd3df not found: ID does not exist" Dec 04 22:16:18.752458 master-0 kubenswrapper[8606]: I1204 22:16:18.752421 8606 scope.go:117] "RemoveContainer" containerID="8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f" Dec 04 22:16:18.752847 master-0 kubenswrapper[8606]: E1204 22:16:18.752818 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f\": container with ID starting with 
8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f not found: ID does not exist" containerID="8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f" Dec 04 22:16:18.752914 master-0 kubenswrapper[8606]: I1204 22:16:18.752849 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f"} err="failed to get container status \"8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f\": rpc error: code = NotFound desc = could not find container \"8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f\": container with ID starting with 8a3a56e77ba4155e297385a2bdb6d5227fa4ca043c8c7193fbb85cae0c33e91f not found: ID does not exist" Dec 04 22:16:18.752914 master-0 kubenswrapper[8606]: I1204 22:16:18.752870 8606 scope.go:117] "RemoveContainer" containerID="f4721e02aa4946a4882889e7b565089e4bf32778f77ac4c0a12e05dd06675241" Dec 04 22:16:18.777742 master-0 kubenswrapper[8606]: I1204 22:16:18.777677 8606 scope.go:117] "RemoveContainer" containerID="442a245ed4230223a6837f3a30f95dde82b173406f0654bf47dd554da457b254" Dec 04 22:16:18.802686 master-0 kubenswrapper[8606]: I1204 22:16:18.801990 8606 scope.go:117] "RemoveContainer" containerID="e1d8730a080e7d5774f84fbb41c8785ca93cb335811d01d830a7f679e41bd66e" Dec 04 22:16:19.761841 master-0 kubenswrapper[8606]: I1204 22:16:19.761770 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-md4z6"] Dec 04 22:16:19.765129 master-0 kubenswrapper[8606]: I1204 22:16:19.762171 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-md4z6" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerName="registry-server" containerID="cri-o://b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4" gracePeriod=2 Dec 04 22:16:19.778708 master-0 kubenswrapper[8606]: I1204 22:16:19.778657 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdxp5"] Dec 04 22:16:19.786519 master-0 kubenswrapper[8606]: I1204 22:16:19.786151 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdxp5"] Dec 04 22:16:19.819978 master-0 kubenswrapper[8606]: I1204 22:16:19.819925 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wjzf"] Dec 04 22:16:19.826205 master-0 kubenswrapper[8606]: I1204 22:16:19.826155 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7wjzf"] Dec 04 22:16:20.260083 master-0 kubenswrapper[8606]: I1204 22:16:20.260034 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:20.305589 master-0 kubenswrapper[8606]: I1204 22:16:20.305467 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-utilities\") pod \"f25df7a7-c30a-4580-a572-122277c2ff7c\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " Dec 04 22:16:20.305855 master-0 kubenswrapper[8606]: I1204 22:16:20.305606 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxr7p\" (UniqueName: \"kubernetes.io/projected/f25df7a7-c30a-4580-a572-122277c2ff7c-kube-api-access-zxr7p\") pod \"f25df7a7-c30a-4580-a572-122277c2ff7c\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " Dec 04 22:16:20.305855 master-0 kubenswrapper[8606]: I1204 22:16:20.305721 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-catalog-content\") pod \"f25df7a7-c30a-4580-a572-122277c2ff7c\" (UID: \"f25df7a7-c30a-4580-a572-122277c2ff7c\") " Dec 04 22:16:20.331226 master-0 kubenswrapper[8606]: I1204 22:16:20.331160 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-utilities" (OuterVolumeSpecName: "utilities") pod "f25df7a7-c30a-4580-a572-122277c2ff7c" (UID: "f25df7a7-c30a-4580-a572-122277c2ff7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:16:20.333806 master-0 kubenswrapper[8606]: I1204 22:16:20.333745 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25df7a7-c30a-4580-a572-122277c2ff7c-kube-api-access-zxr7p" (OuterVolumeSpecName: "kube-api-access-zxr7p") pod "f25df7a7-c30a-4580-a572-122277c2ff7c" (UID: "f25df7a7-c30a-4580-a572-122277c2ff7c"). InnerVolumeSpecName "kube-api-access-zxr7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:16:20.390295 master-0 kubenswrapper[8606]: I1204 22:16:20.390133 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f25df7a7-c30a-4580-a572-122277c2ff7c" (UID: "f25df7a7-c30a-4580-a572-122277c2ff7c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:16:20.407821 master-0 kubenswrapper[8606]: I1204 22:16:20.407746 8606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:20.407821 master-0 kubenswrapper[8606]: I1204 22:16:20.407790 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxr7p\" (UniqueName: \"kubernetes.io/projected/f25df7a7-c30a-4580-a572-122277c2ff7c-kube-api-access-zxr7p\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:20.407821 master-0 kubenswrapper[8606]: I1204 22:16:20.407803 8606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f25df7a7-c30a-4580-a572-122277c2ff7c-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:20.689689 master-0 kubenswrapper[8606]: I1204 22:16:20.689606 8606 generic.go:334] "Generic (PLEG): container finished" podID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerID="b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4" exitCode=0 Dec 04 22:16:20.689689 master-0 kubenswrapper[8606]: I1204 22:16:20.689673 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-md4z6" event={"ID":"f25df7a7-c30a-4580-a572-122277c2ff7c","Type":"ContainerDied","Data":"b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4"} Dec 04 22:16:20.690220 master-0 kubenswrapper[8606]: I1204 22:16:20.689724 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-md4z6" Dec 04 22:16:20.690220 master-0 kubenswrapper[8606]: I1204 22:16:20.689763 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-md4z6" event={"ID":"f25df7a7-c30a-4580-a572-122277c2ff7c","Type":"ContainerDied","Data":"d59c94fe0debd0dd34f0289fe87a5367ddc4cb590f2ab430c4c05648c3e9787c"} Dec 04 22:16:20.690220 master-0 kubenswrapper[8606]: I1204 22:16:20.689804 8606 scope.go:117] "RemoveContainer" containerID="b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4" Dec 04 22:16:20.713485 master-0 kubenswrapper[8606]: I1204 22:16:20.713390 8606 scope.go:117] "RemoveContainer" containerID="53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810" Dec 04 22:16:20.766198 master-0 kubenswrapper[8606]: I1204 22:16:20.766110 8606 scope.go:117] "RemoveContainer" containerID="5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998" Dec 04 22:16:20.793762 master-0 kubenswrapper[8606]: I1204 22:16:20.793629 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-md4z6"] Dec 04 22:16:20.799827 master-0 kubenswrapper[8606]: I1204 22:16:20.799713 8606 scope.go:117] "RemoveContainer" containerID="b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4" Dec 04 22:16:20.800582 master-0 kubenswrapper[8606]: E1204 22:16:20.800467 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4\": container with ID starting with b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4 not found: ID does not exist" containerID="b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4" Dec 04 22:16:20.800722 master-0 kubenswrapper[8606]: I1204 
22:16:20.800589 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4"} err="failed to get container status \"b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4\": rpc error: code = NotFound desc = could not find container \"b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4\": container with ID starting with b1a10134de1a2af855c3bc30d928cb3f68891ce12a1153768b27601f7733a7e4 not found: ID does not exist" Dec 04 22:16:20.800722 master-0 kubenswrapper[8606]: I1204 22:16:20.800639 8606 scope.go:117] "RemoveContainer" containerID="53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810" Dec 04 22:16:20.801342 master-0 kubenswrapper[8606]: I1204 22:16:20.801284 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-md4z6"] Dec 04 22:16:20.801533 master-0 kubenswrapper[8606]: E1204 22:16:20.801444 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810\": container with ID starting with 53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810 not found: ID does not exist" containerID="53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810" Dec 04 22:16:20.801533 master-0 kubenswrapper[8606]: I1204 22:16:20.801497 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810"} err="failed to get container status \"53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810\": rpc error: code = NotFound desc = could not find container \"53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810\": container with ID starting with 53406092c1343b8cec39a6890662b6ecbe0c0763c976a98c294321bfc7fbe810 not found: ID does not exist" Dec 04 22:16:20.801853 master-0 kubenswrapper[8606]: I1204 22:16:20.801546 8606 scope.go:117] "RemoveContainer" containerID="5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998" Dec 04 22:16:20.802125 master-0 kubenswrapper[8606]: E1204 22:16:20.802044 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998\": container with ID starting with 5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998 not found: ID does not exist" containerID="5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998" Dec 04 22:16:20.802246 master-0 kubenswrapper[8606]: I1204 22:16:20.802121 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998"} err="failed to get container status \"5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998\": rpc error: code = NotFound desc = could not find container \"5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998\": container with ID starting with 5611ae737ab54df4c2e848ffe10c7676199335b2bcd0dcca8f0db3d6bfa1a998 not found: ID does not exist" Dec 04 22:16:21.392408 master-0 kubenswrapper[8606]: I1204 22:16:21.392297 8606 scope.go:117] "RemoveContainer" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780" Dec 04 22:16:21.423363 master-0 kubenswrapper[8606]: I1204 22:16:21.418620 8606 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" path="/var/lib/kubelet/pods/15d1a112-f449-4ad5-9c27-e9428491fec1/volumes" Dec 04 22:16:21.423363 master-0 kubenswrapper[8606]: I1204 22:16:21.420371 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8173c976-7b52-4172-8b80-e174d957d705" path="/var/lib/kubelet/pods/8173c976-7b52-4172-8b80-e174d957d705/volumes" Dec 04 22:16:21.423363 master-0 kubenswrapper[8606]: I1204 22:16:21.422092 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" path="/var/lib/kubelet/pods/f25df7a7-c30a-4580-a572-122277c2ff7c/volumes" Dec 04 22:16:21.703795 master-0 kubenswrapper[8606]: I1204 22:16:21.703230 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/4.log" Dec 04 22:16:21.703795 master-0 kubenswrapper[8606]: I1204 22:16:21.703323 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"c21b1f28f2ecd3a6f853caa962c1e919058a4a8d42a7386884dd5b88c192ad87"} Dec 04 22:16:24.618441 master-0 kubenswrapper[8606]: I1204 22:16:24.618357 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:24.692558 master-0 kubenswrapper[8606]: I1204 22:16:24.690344 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:26.932178 master-0 kubenswrapper[8606]: I1204 22:16:26.932075 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s7vv6"] Dec 04 22:16:26.933174 master-0 kubenswrapper[8606]: I1204 22:16:26.932456 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s7vv6" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="registry-server" containerID="cri-o://822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb" gracePeriod=2 Dec 04 22:16:27.453423 master-0 kubenswrapper[8606]: I1204 22:16:27.453366 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:27.537818 master-0 kubenswrapper[8606]: I1204 22:16:27.537734 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-catalog-content\") pod \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " Dec 04 22:16:27.538098 master-0 kubenswrapper[8606]: I1204 22:16:27.537903 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvcgz\" (UniqueName: \"kubernetes.io/projected/eecc4b0d-529b-4c1f-a4ba-80057c765b80-kube-api-access-wvcgz\") pod \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " Dec 04 22:16:27.538098 master-0 kubenswrapper[8606]: I1204 22:16:27.537961 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-utilities\") pod \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\" (UID: \"eecc4b0d-529b-4c1f-a4ba-80057c765b80\") " Dec 04 22:16:27.539060 master-0 kubenswrapper[8606]: I1204 22:16:27.538988 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-utilities" (OuterVolumeSpecName: "utilities") pod "eecc4b0d-529b-4c1f-a4ba-80057c765b80" (UID: "eecc4b0d-529b-4c1f-a4ba-80057c765b80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:16:27.541812 master-0 kubenswrapper[8606]: I1204 22:16:27.541743 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eecc4b0d-529b-4c1f-a4ba-80057c765b80-kube-api-access-wvcgz" (OuterVolumeSpecName: "kube-api-access-wvcgz") pod "eecc4b0d-529b-4c1f-a4ba-80057c765b80" (UID: "eecc4b0d-529b-4c1f-a4ba-80057c765b80"). InnerVolumeSpecName "kube-api-access-wvcgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:16:27.639801 master-0 kubenswrapper[8606]: I1204 22:16:27.639645 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvcgz\" (UniqueName: \"kubernetes.io/projected/eecc4b0d-529b-4c1f-a4ba-80057c765b80-kube-api-access-wvcgz\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:27.639801 master-0 kubenswrapper[8606]: I1204 22:16:27.639697 8606 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:27.713628 master-0 kubenswrapper[8606]: I1204 22:16:27.713562 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eecc4b0d-529b-4c1f-a4ba-80057c765b80" (UID: "eecc4b0d-529b-4c1f-a4ba-80057c765b80"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:16:27.741209 master-0 kubenswrapper[8606]: I1204 22:16:27.741163 8606 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eecc4b0d-529b-4c1f-a4ba-80057c765b80-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:27.761794 master-0 kubenswrapper[8606]: I1204 22:16:27.761749 8606 generic.go:334] "Generic (PLEG): container finished" podID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerID="822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb" exitCode=0 Dec 04 22:16:27.762034 master-0 kubenswrapper[8606]: I1204 22:16:27.761822 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s7vv6" Dec 04 22:16:27.762216 master-0 kubenswrapper[8606]: I1204 22:16:27.761843 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7vv6" event={"ID":"eecc4b0d-529b-4c1f-a4ba-80057c765b80","Type":"ContainerDied","Data":"822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb"} Dec 04 22:16:27.762272 master-0 kubenswrapper[8606]: I1204 22:16:27.762240 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s7vv6" event={"ID":"eecc4b0d-529b-4c1f-a4ba-80057c765b80","Type":"ContainerDied","Data":"0b0c9fd3c02f89fd9af3a5e305e018905e6be1f956cfb0045233a7cfc1ed2fca"} Dec 04 22:16:27.762309 master-0 kubenswrapper[8606]: I1204 22:16:27.762270 8606 scope.go:117] "RemoveContainer" containerID="822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb" Dec 04 22:16:27.789748 master-0 kubenswrapper[8606]: I1204 22:16:27.789683 8606 scope.go:117] "RemoveContainer" containerID="5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320" Dec 04 22:16:27.811896 master-0 kubenswrapper[8606]: I1204 22:16:27.811815 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s7vv6"] Dec 04 22:16:27.816605 master-0 kubenswrapper[8606]: I1204 22:16:27.816517 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s7vv6"] Dec 04 22:16:27.833948 master-0 kubenswrapper[8606]: I1204 22:16:27.833887 8606 scope.go:117] "RemoveContainer" containerID="c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47" Dec 04 22:16:27.858117 master-0 kubenswrapper[8606]: I1204 22:16:27.858037 8606 scope.go:117] "RemoveContainer" containerID="822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb" Dec 04 22:16:27.858924 master-0 kubenswrapper[8606]: E1204 22:16:27.858828 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb\": container with ID starting with 822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb not found: ID does not exist" containerID="822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb" Dec 04 22:16:27.859183 master-0 kubenswrapper[8606]: I1204 22:16:27.858952 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb"} err="failed to get container status \"822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb\": rpc error: code = NotFound desc = could not find container 
\"822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb\": container with ID starting with 822f3ce6155024d41ca8107896a2c3fe8c4e00eefba7580b2c00ab55c79aedeb not found: ID does not exist" Dec 04 22:16:27.859183 master-0 kubenswrapper[8606]: I1204 22:16:27.859007 8606 scope.go:117] "RemoveContainer" containerID="5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320" Dec 04 22:16:27.859832 master-0 kubenswrapper[8606]: E1204 22:16:27.859754 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320\": container with ID starting with 5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320 not found: ID does not exist" containerID="5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320" Dec 04 22:16:27.859985 master-0 kubenswrapper[8606]: I1204 22:16:27.859813 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320"} err="failed to get container status \"5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320\": rpc error: code = NotFound desc = could not find container \"5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320\": container with ID starting with 5df75376390f3cd51ca7fb8ab69ff1a9e404d83ba25c8f446908d5c7e28f2320 not found: ID does not exist" Dec 04 22:16:27.859985 master-0 kubenswrapper[8606]: I1204 22:16:27.859859 8606 scope.go:117] "RemoveContainer" containerID="c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47" Dec 04 22:16:27.860372 master-0 kubenswrapper[8606]: E1204 22:16:27.860276 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47\": container with ID starting with c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47 not found: ID does not exist" containerID="c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47" Dec 04 22:16:27.860676 master-0 kubenswrapper[8606]: I1204 22:16:27.860347 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47"} err="failed to get container status \"c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47\": rpc error: code = NotFound desc = could not find container \"c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47\": container with ID starting with c518d6300b164e5f8c580999cbe8b3ca828460753fbb31f16c184eef9bd65e47 not found: ID does not exist" Dec 04 22:16:29.406865 master-0 kubenswrapper[8606]: I1204 22:16:29.406769 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" path="/var/lib/kubelet/pods/eecc4b0d-529b-4c1f-a4ba-80057c765b80/volumes" Dec 04 22:16:35.259800 master-0 kubenswrapper[8606]: I1204 22:16:35.259754 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_b4a17a75-3abb-4d82-911a-b5923ba1cef1/installer/0.log" Dec 04 22:16:35.260310 master-0 kubenswrapper[8606]: I1204 22:16:35.259841 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:35.274602 master-0 kubenswrapper[8606]: I1204 22:16:35.274545 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kubelet-dir\") pod \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " Dec 04 22:16:35.274768 master-0 kubenswrapper[8606]: I1204 22:16:35.274618 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b4a17a75-3abb-4d82-911a-b5923ba1cef1" (UID: "b4a17a75-3abb-4d82-911a-b5923ba1cef1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:16:35.274768 master-0 kubenswrapper[8606]: I1204 22:16:35.274683 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-var-lock\") pod \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " Dec 04 22:16:35.274840 master-0 kubenswrapper[8606]: I1204 22:16:35.274779 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-var-lock" (OuterVolumeSpecName: "var-lock") pod "b4a17a75-3abb-4d82-911a-b5923ba1cef1" (UID: "b4a17a75-3abb-4d82-911a-b5923ba1cef1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:16:35.274875 master-0 kubenswrapper[8606]: I1204 22:16:35.274836 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kube-api-access\") pod \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\" (UID: \"b4a17a75-3abb-4d82-911a-b5923ba1cef1\") " Dec 04 22:16:35.275393 master-0 kubenswrapper[8606]: I1204 22:16:35.275271 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:35.275393 master-0 kubenswrapper[8606]: I1204 22:16:35.275303 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b4a17a75-3abb-4d82-911a-b5923ba1cef1-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:35.282013 master-0 kubenswrapper[8606]: I1204 22:16:35.281935 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b4a17a75-3abb-4d82-911a-b5923ba1cef1" (UID: "b4a17a75-3abb-4d82-911a-b5923ba1cef1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:16:35.376685 master-0 kubenswrapper[8606]: I1204 22:16:35.376616 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4a17a75-3abb-4d82-911a-b5923ba1cef1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:16:35.841221 master-0 kubenswrapper[8606]: I1204 22:16:35.841141 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_b4a17a75-3abb-4d82-911a-b5923ba1cef1/installer/0.log" Dec 04 22:16:35.841221 master-0 kubenswrapper[8606]: I1204 22:16:35.841223 8606 generic.go:334] "Generic (PLEG): container finished" podID="b4a17a75-3abb-4d82-911a-b5923ba1cef1" containerID="3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85" exitCode=1 Dec 04 22:16:35.841583 master-0 kubenswrapper[8606]: I1204 22:16:35.841258 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"b4a17a75-3abb-4d82-911a-b5923ba1cef1","Type":"ContainerDied","Data":"3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85"} Dec 04 22:16:35.841583 master-0 kubenswrapper[8606]: I1204 22:16:35.841296 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"b4a17a75-3abb-4d82-911a-b5923ba1cef1","Type":"ContainerDied","Data":"5200ff32baf614f84811e473842fee6f8db2a3ed9b519e8bc8ac1e3cf04b6b65"} Dec 04 22:16:35.841583 master-0 kubenswrapper[8606]: I1204 22:16:35.841319 8606 scope.go:117] "RemoveContainer" containerID="3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85" Dec 04 22:16:35.841583 master-0 kubenswrapper[8606]: I1204 22:16:35.841389 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 04 22:16:35.871413 master-0 kubenswrapper[8606]: I1204 22:16:35.871326 8606 scope.go:117] "RemoveContainer" containerID="3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85" Dec 04 22:16:35.873470 master-0 kubenswrapper[8606]: E1204 22:16:35.872143 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85\": container with ID starting with 3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85 not found: ID does not exist" containerID="3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85" Dec 04 22:16:35.873470 master-0 kubenswrapper[8606]: I1204 22:16:35.872200 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85"} err="failed to get container status \"3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85\": rpc error: code = NotFound desc = could not find container \"3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85\": container with ID starting with 3fb55953cb58b6b439f337fff13c34c612c7cd55e6a3dc5a8bbcace4273d9e85 not found: ID does not exist" Dec 04 22:16:35.875802 master-0 kubenswrapper[8606]: I1204 22:16:35.875762 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 04 22:16:35.882024 master-0 kubenswrapper[8606]: I1204 22:16:35.881949 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 04 22:16:37.407268 master-0 kubenswrapper[8606]: I1204 22:16:37.407194 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a17a75-3abb-4d82-911a-b5923ba1cef1" path="/var/lib/kubelet/pods/b4a17a75-3abb-4d82-911a-b5923ba1cef1/volumes" Dec 04 22:16:47.952823 master-0 kubenswrapper[8606]: I1204 22:16:47.952734 8606 generic.go:334] "Generic (PLEG): container finished" podID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerID="1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3" exitCode=0 Dec 04 22:16:47.953983 master-0 kubenswrapper[8606]: I1204 22:16:47.953918 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerDied","Data":"1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3"} Dec 04 22:16:47.954189 master-0 kubenswrapper[8606]: I1204 22:16:47.954159 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0"} Dec 04 22:16:47.954341 master-0 kubenswrapper[8606]: I1204 22:16:47.954223 8606 scope.go:117] "RemoveContainer" containerID="57c531c30874bf4998e2715db0baeccbcbac79537b43a4f58e8644a7f87789e1" Dec 04 22:16:48.941033 master-0 kubenswrapper[8606]: I1204 22:16:48.940923 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:16:48.941636 master-0 kubenswrapper[8606]: I1204 22:16:48.941178 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 
22:16:48.945221 master-0 kubenswrapper[8606]: I1204 22:16:48.945153 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:16:48.945221 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:16:48.945221 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:16:48.945221 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:16:48.945619 master-0 kubenswrapper[8606]: I1204 22:16:48.945240 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:16:49.949208 master-0 kubenswrapper[8606]: I1204 22:16:49.949067 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:16:49.949208 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:16:49.949208 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:16:49.949208 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:16:49.949208 master-0 kubenswrapper[8606]: I1204 22:16:49.949144 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:16:50.945364 master-0 kubenswrapper[8606]: I1204 22:16:50.945257 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:16:50.945364 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:16:50.945364 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:16:50.945364 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:16:50.945874 master-0 kubenswrapper[8606]: I1204 22:16:50.945361 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:16:51.945412 master-0 kubenswrapper[8606]: I1204 22:16:51.945321 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:16:51.945412 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:16:51.945412 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:16:51.945412 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:16:51.946480 master-0 kubenswrapper[8606]: I1204 22:16:51.945439 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:16:52.944802 master-0 kubenswrapper[8606]: I1204 22:16:52.944716 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:16:52.944802 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:16:52.944802 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:16:52.944802 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:16:52.945146 master-0 kubenswrapper[8606]: I1204 22:16:52.944820 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:16:53.944954 master-0 kubenswrapper[8606]: I1204 22:16:53.944805 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:16:53.944954 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:16:53.944954 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:16:53.944954 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:16:53.944954 master-0 kubenswrapper[8606]: I1204 22:16:53.944905 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:16:54.945632 master-0 kubenswrapper[8606]: I1204 22:16:54.945549 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:16:54.945632 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:16:54.945632 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:16:54.945632 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:16:54.945632 master-0 kubenswrapper[8606]: I1204 22:16:54.945631 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 22:16:55.944431 master-0 kubenswrapper[8606]: I1204 22:16:55.944286 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 22:16:55.944431 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld
Dec 04 22:16:55.944431 master-0 kubenswrapper[8606]: [+]process-running ok
Dec 04 22:16:55.944431 master-0 kubenswrapper[8606]: healthz check failed
Dec 04 22:16:55.944431 master-0 kubenswrapper[8606]: I1204 22:16:55.944376 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 04 22:16:56.944845 master-0 kubenswrapper[8606]: I1204 22:16:56.944743 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:16:56.944845 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:16:56.944845 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:16:56.944845 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:16:56.944845 master-0 kubenswrapper[8606]: I1204 22:16:56.944837 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:16:57.944319 master-0 kubenswrapper[8606]: I1204 22:16:57.944220 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:16:57.944319 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:16:57.944319 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:16:57.944319 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:16:57.944319 master-0 kubenswrapper[8606]: I1204 22:16:57.944313 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:16:58.944287 master-0 kubenswrapper[8606]: I1204 22:16:58.944172 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:16:58.944287 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:16:58.944287 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:16:58.944287 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:16:58.945298 master-0 kubenswrapper[8606]: I1204 22:16:58.944295 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:16:59.944041 master-0 kubenswrapper[8606]: I1204 22:16:59.943910 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:16:59.944041 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:16:59.944041 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:16:59.944041 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:16:59.945161 master-0 kubenswrapper[8606]: I1204 22:16:59.944044 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Dec 04 22:17:00.943694 master-0 kubenswrapper[8606]: I1204 22:17:00.943579 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:00.943694 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:00.943694 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:00.943694 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:00.944407 master-0 kubenswrapper[8606]: I1204 22:17:00.943723 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:01.944780 master-0 kubenswrapper[8606]: I1204 22:17:01.944667 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:01.944780 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:01.944780 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:01.944780 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:01.946096 master-0 kubenswrapper[8606]: I1204 22:17:01.944817 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:02.944562 master-0 kubenswrapper[8606]: I1204 22:17:02.944457 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:02.944562 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:02.944562 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:02.944562 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:02.945747 master-0 kubenswrapper[8606]: I1204 22:17:02.944590 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:03.944646 master-0 kubenswrapper[8606]: I1204 22:17:03.944558 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:03.944646 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:03.944646 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:03.944646 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:03.944646 master-0 kubenswrapper[8606]: I1204 22:17:03.944651 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Dec 04 22:17:04.944464 master-0 kubenswrapper[8606]: I1204 22:17:04.944356 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:04.944464 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:04.944464 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:04.944464 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:04.945678 master-0 kubenswrapper[8606]: I1204 22:17:04.944486 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:05.943394 master-0 kubenswrapper[8606]: I1204 22:17:05.943294 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:05.943394 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:05.943394 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:05.943394 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:05.943776 master-0 kubenswrapper[8606]: I1204 22:17:05.943398 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:06.085108 master-0 kubenswrapper[8606]: I1204 22:17:06.084765 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085171 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085195 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085227 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8173c976-7b52-4172-8b80-e174d957d705" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085241 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8173c976-7b52-4172-8b80-e174d957d705" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085259 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerName="extract-content" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085273 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerName="extract-content" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085294 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a17a75-3abb-4d82-911a-b5923ba1cef1" containerName="installer" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085307 8606 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a17a75-3abb-4d82-911a-b5923ba1cef1" containerName="installer" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085325 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085338 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085358 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerName="extract-utilities" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085371 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerName="extract-utilities" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085386 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="extract-utilities" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085399 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="extract-utilities" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085415 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerName="extract-utilities" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085432 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerName="extract-utilities" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085456 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerName="extract-content" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085468 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerName="extract-content" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085526 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" containerName="collect-profiles" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085541 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" containerName="collect-profiles" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085571 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8173c976-7b52-4172-8b80-e174d957d705" containerName="extract-utilities" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085584 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8173c976-7b52-4172-8b80-e174d957d705" containerName="extract-utilities" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085600 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8173c976-7b52-4172-8b80-e174d957d705" containerName="extract-content" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085613 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="8173c976-7b52-4172-8b80-e174d957d705" containerName="extract-content" Dec 04 22:17:06.086070 master-0 
kubenswrapper[8606]: E1204 22:17:06.085642 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085655 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: E1204 22:17:06.085671 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="extract-content" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085684 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="extract-content" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085931 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a17a75-3abb-4d82-911a-b5923ba1cef1" containerName="installer" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.085982 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="8173c976-7b52-4172-8b80-e174d957d705" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.086005 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="eecc4b0d-529b-4c1f-a4ba-80057c765b80" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.086039 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d1a112-f449-4ad5-9c27-e9428491fec1" containerName="registry-server" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.086059 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" containerName="collect-profiles" Dec 04 22:17:06.086070 master-0 kubenswrapper[8606]: I1204 22:17:06.086107 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25df7a7-c30a-4580-a572-122277c2ff7c" containerName="registry-server" Dec 04 22:17:06.088754 master-0 kubenswrapper[8606]: I1204 22:17:06.086934 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.092897 master-0 kubenswrapper[8606]: I1204 22:17:06.092713 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wkcjf" Dec 04 22:17:06.093704 master-0 kubenswrapper[8606]: I1204 22:17:06.093173 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 22:17:06.134077 master-0 kubenswrapper[8606]: I1204 22:17:06.102049 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 04 22:17:06.239949 master-0 kubenswrapper[8606]: I1204 22:17:06.239751 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-var-lock\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.239949 master-0 kubenswrapper[8606]: I1204 22:17:06.239905 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.240303 master-0 kubenswrapper[8606]: I1204 22:17:06.239963 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.341773 master-0 kubenswrapper[8606]: I1204 22:17:06.341676 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.341773 master-0 kubenswrapper[8606]: I1204 22:17:06.341746 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.342337 master-0 kubenswrapper[8606]: I1204 22:17:06.341826 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-var-lock\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.342337 master-0 kubenswrapper[8606]: I1204 22:17:06.341843 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " 
pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.342337 master-0 kubenswrapper[8606]: I1204 22:17:06.342087 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-var-lock\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.370584 master-0 kubenswrapper[8606]: I1204 22:17:06.370363 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kube-api-access\") pod \"installer-3-master-0\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.464904 master-0 kubenswrapper[8606]: I1204 22:17:06.464796 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:06.917373 master-0 kubenswrapper[8606]: I1204 22:17:06.917134 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 04 22:17:06.947859 master-0 kubenswrapper[8606]: I1204 22:17:06.947798 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:06.947859 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:06.947859 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:06.947859 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:06.948211 master-0 kubenswrapper[8606]: I1204 22:17:06.947893 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:07.163871 master-0 kubenswrapper[8606]: I1204 22:17:07.163778 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"986a4de7-3a54-48dc-9599-49cf19ba0ad5","Type":"ContainerStarted","Data":"1fee06d19aff5ef21eb8427a31bd857aa51bbbcb2fe5924a93729689e0a74832"} Dec 04 22:17:07.945326 master-0 kubenswrapper[8606]: I1204 22:17:07.945186 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:07.945326 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:07.945326 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:07.945326 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:07.945855 master-0 kubenswrapper[8606]: I1204 22:17:07.945337 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:08.174942 master-0 kubenswrapper[8606]: I1204 22:17:08.174826 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"986a4de7-3a54-48dc-9599-49cf19ba0ad5","Type":"ContainerStarted","Data":"42c592fcd97dd09de62f2c07d511eb1f7fedb875ba01c50a76d8a639e15849ae"} Dec 04 22:17:08.207407 master-0 kubenswrapper[8606]: I1204 22:17:08.207116 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.207082386 podStartE2EDuration="2.207082386s" podCreationTimestamp="2025-12-04 22:17:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:17:08.199841865 +0000 UTC m=+993.010144120" watchObservedRunningTime="2025-12-04 22:17:08.207082386 +0000 UTC m=+993.017384631" Dec 04 22:17:08.944792 master-0 kubenswrapper[8606]: I1204 22:17:08.944688 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:08.944792 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:08.944792 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:08.944792 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:08.945129 master-0 kubenswrapper[8606]: I1204 22:17:08.944823 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:09.944655 master-0 kubenswrapper[8606]: I1204 22:17:09.944541 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:09.944655 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:09.944655 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:09.944655 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:09.944655 master-0 kubenswrapper[8606]: I1204 22:17:09.944640 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:10.330342 master-0 kubenswrapper[8606]: I1204 22:17:10.330228 8606 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 04 22:17:10.330749 master-0 kubenswrapper[8606]: I1204 22:17:10.330690 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" containerID="cri-o://b2de34afcf16d55af0ab629ff305e02bc4e8c470038e92112248dabc18c8bf30" gracePeriod=30 Dec 04 22:17:10.332290 master-0 kubenswrapper[8606]: I1204 22:17:10.332231 8606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 04 22:17:10.332709 master-0 kubenswrapper[8606]: E1204 22:17:10.332663 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09e2af7200e6f9be469dbfd9bb1127" 
containerName="kube-scheduler" Dec 04 22:17:10.332709 master-0 kubenswrapper[8606]: I1204 22:17:10.332694 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 04 22:17:10.332709 master-0 kubenswrapper[8606]: E1204 22:17:10.332710 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 04 22:17:10.332956 master-0 kubenswrapper[8606]: I1204 22:17:10.332720 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 04 22:17:10.332956 master-0 kubenswrapper[8606]: E1204 22:17:10.332738 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 04 22:17:10.332956 master-0 kubenswrapper[8606]: I1204 22:17:10.332750 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 04 22:17:10.332956 master-0 kubenswrapper[8606]: I1204 22:17:10.332910 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 04 22:17:10.332956 master-0 kubenswrapper[8606]: I1204 22:17:10.332931 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 04 22:17:10.332956 master-0 kubenswrapper[8606]: I1204 22:17:10.332949 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 04 22:17:10.334537 master-0 kubenswrapper[8606]: I1204 22:17:10.334454 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:10.384879 master-0 kubenswrapper[8606]: I1204 22:17:10.384801 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 04 22:17:10.520806 master-0 kubenswrapper[8606]: I1204 22:17:10.520721 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:10.522114 master-0 kubenswrapper[8606]: I1204 22:17:10.522055 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:10.525955 master-0 kubenswrapper[8606]: I1204 22:17:10.525908 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:17:10.560090 master-0 kubenswrapper[8606]: I1204 22:17:10.559998 8606 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="6962b66d-b878-429c-bf53-6e343e1255a0" Dec 04 22:17:10.623736 master-0 kubenswrapper[8606]: I1204 22:17:10.623494 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"5e09e2af7200e6f9be469dbfd9bb1127\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " Dec 04 22:17:10.623736 master-0 kubenswrapper[8606]: I1204 22:17:10.623622 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"5e09e2af7200e6f9be469dbfd9bb1127\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " Dec 04 22:17:10.624103 master-0 kubenswrapper[8606]: I1204 22:17:10.624015 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets" (OuterVolumeSpecName: "secrets") pod "5e09e2af7200e6f9be469dbfd9bb1127" (UID: "5e09e2af7200e6f9be469dbfd9bb1127"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:10.624181 master-0 kubenswrapper[8606]: I1204 22:17:10.624133 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs" (OuterVolumeSpecName: "logs") pod "5e09e2af7200e6f9be469dbfd9bb1127" (UID: "5e09e2af7200e6f9be469dbfd9bb1127"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:10.624321 master-0 kubenswrapper[8606]: I1204 22:17:10.624252 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:10.624483 master-0 kubenswrapper[8606]: I1204 22:17:10.624428 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:10.624602 master-0 kubenswrapper[8606]: I1204 22:17:10.624494 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:10.624602 master-0 kubenswrapper[8606]: I1204 22:17:10.624565 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:10.624725 master-0 kubenswrapper[8606]: I1204 22:17:10.624644 8606 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:10.624725 master-0 kubenswrapper[8606]: I1204 22:17:10.624670 8606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:10.679701 master-0 kubenswrapper[8606]: I1204 22:17:10.679583 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:10.715347 master-0 kubenswrapper[8606]: W1204 22:17:10.715256 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb8c983acca0c27a191b3f720d4b1e0.slice/crio-e976d29655bd6a4804e44aefca912f97ea12da492eb732dd0d56f5c8ee61e225 WatchSource:0}: Error finding container e976d29655bd6a4804e44aefca912f97ea12da492eb732dd0d56f5c8ee61e225: Status 404 returned error can't find the container with id e976d29655bd6a4804e44aefca912f97ea12da492eb732dd0d56f5c8ee61e225 Dec 04 22:17:10.944365 master-0 kubenswrapper[8606]: I1204 22:17:10.944217 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:10.944365 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:10.944365 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:10.944365 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:10.944365 master-0 kubenswrapper[8606]: I1204 22:17:10.944317 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:11.206720 master-0 kubenswrapper[8606]: I1204 22:17:11.205672 8606 generic.go:334] "Generic (PLEG): container finished" podID="2cb8c983acca0c27a191b3f720d4b1e0" containerID="509e3ce53fc945130075276f6099e96d73baf21a6fcaddff5d395b3b94de9c58" exitCode=0 Dec 04 22:17:11.206720 master-0 kubenswrapper[8606]: I1204 22:17:11.205780 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerDied","Data":"509e3ce53fc945130075276f6099e96d73baf21a6fcaddff5d395b3b94de9c58"} Dec 04 22:17:11.206720 master-0 kubenswrapper[8606]: I1204 22:17:11.205816 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"e976d29655bd6a4804e44aefca912f97ea12da492eb732dd0d56f5c8ee61e225"} Dec 04 22:17:11.213105 master-0 kubenswrapper[8606]: I1204 22:17:11.213049 8606 generic.go:334] "Generic (PLEG): container finished" podID="5e09e2af7200e6f9be469dbfd9bb1127" containerID="b2de34afcf16d55af0ab629ff305e02bc4e8c470038e92112248dabc18c8bf30" exitCode=0 Dec 04 22:17:11.213255 master-0 kubenswrapper[8606]: I1204 22:17:11.213137 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 04 22:17:11.213343 master-0 kubenswrapper[8606]: I1204 22:17:11.213264 8606 scope.go:117] "RemoveContainer" containerID="229f04f01e031c096a8d66a3a3b9f5322d73a495869829416a90812a311e2aee" Dec 04 22:17:11.213421 master-0 kubenswrapper[8606]: I1204 22:17:11.213228 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c2dca88da5957207daf9ea6b5fc6fb1e136ded6ac37a7d6c7df7c70ee7176a1" Dec 04 22:17:11.218022 master-0 kubenswrapper[8606]: I1204 22:17:11.217979 8606 generic.go:334] "Generic (PLEG): container finished" podID="3099f7b5-f904-4d15-aedb-f4e558b813e4" containerID="f9af7ae05881c66c990776ea5e9ecae6917372ad2e83deed7c505b583fa9da46" exitCode=0 Dec 04 22:17:11.218100 master-0 kubenswrapper[8606]: I1204 22:17:11.218032 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"3099f7b5-f904-4d15-aedb-f4e558b813e4","Type":"ContainerDied","Data":"f9af7ae05881c66c990776ea5e9ecae6917372ad2e83deed7c505b583fa9da46"} Dec 04 22:17:11.406289 master-0 kubenswrapper[8606]: I1204 22:17:11.406183 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e09e2af7200e6f9be469dbfd9bb1127" path="/var/lib/kubelet/pods/5e09e2af7200e6f9be469dbfd9bb1127/volumes" Dec 04 22:17:11.406922 master-0 kubenswrapper[8606]: I1204 22:17:11.406636 8606 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 04 22:17:11.459612 master-0 kubenswrapper[8606]: I1204 22:17:11.459022 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 04 22:17:11.459612 master-0 kubenswrapper[8606]: I1204 22:17:11.459065 8606 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="6962b66d-b878-429c-bf53-6e343e1255a0" Dec 04 22:17:11.468532 master-0 kubenswrapper[8606]: I1204 22:17:11.464546 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 04 22:17:11.468532 master-0 kubenswrapper[8606]: I1204 22:17:11.464861 8606 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="6962b66d-b878-429c-bf53-6e343e1255a0" Dec 04 22:17:11.959710 master-0 kubenswrapper[8606]: I1204 22:17:11.944443 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:11.959710 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:11.959710 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:11.959710 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:11.959710 master-0 kubenswrapper[8606]: I1204 22:17:11.944590 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:12.232486 master-0 kubenswrapper[8606]: I1204 22:17:12.232417 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"5d13043a38ae0fb09211cd5a587bc3304b77f315cf7a3d95f4c81e25cbe2aabc"} Dec 04 22:17:12.232486 master-0 kubenswrapper[8606]: I1204 22:17:12.232484 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"b9c2a59d2b2d384d9b6ed01768b63f9f489ffd4ed0753bd5fb34a22342dcc2b9"} Dec 04 22:17:12.232486 master-0 kubenswrapper[8606]: I1204 22:17:12.232498 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"d0d775c28298e632a37100becb71b0843ebb15158d232ea527d7de5420ce8047"} Dec 04 22:17:12.232869 master-0 kubenswrapper[8606]: I1204 22:17:12.232811 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:17:12.258645 master-0 kubenswrapper[8606]: I1204 22:17:12.258458 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.2584344290000002 podStartE2EDuration="2.258434429s" podCreationTimestamp="2025-12-04 22:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:17:12.253002798 +0000 UTC m=+997.063305043" watchObservedRunningTime="2025-12-04 22:17:12.258434429 +0000 UTC m=+997.068736654" Dec 04 22:17:12.647330 master-0 kubenswrapper[8606]: I1204 22:17:12.647230 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 04 22:17:12.789615 master-0 kubenswrapper[8606]: I1204 22:17:12.789481 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-kubelet-dir\") pod \"3099f7b5-f904-4d15-aedb-f4e558b813e4\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " Dec 04 22:17:12.789615 master-0 kubenswrapper[8606]: I1204 22:17:12.789590 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-var-lock\") pod \"3099f7b5-f904-4d15-aedb-f4e558b813e4\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " Dec 04 22:17:12.789981 master-0 kubenswrapper[8606]: I1204 22:17:12.789620 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3099f7b5-f904-4d15-aedb-f4e558b813e4" (UID: "3099f7b5-f904-4d15-aedb-f4e558b813e4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:12.789981 master-0 kubenswrapper[8606]: I1204 22:17:12.789758 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3099f7b5-f904-4d15-aedb-f4e558b813e4-kube-api-access\") pod \"3099f7b5-f904-4d15-aedb-f4e558b813e4\" (UID: \"3099f7b5-f904-4d15-aedb-f4e558b813e4\") " Dec 04 22:17:12.789981 master-0 kubenswrapper[8606]: I1204 22:17:12.789781 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-var-lock" (OuterVolumeSpecName: "var-lock") pod "3099f7b5-f904-4d15-aedb-f4e558b813e4" (UID: "3099f7b5-f904-4d15-aedb-f4e558b813e4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:12.790651 master-0 kubenswrapper[8606]: I1204 22:17:12.790574 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:12.790651 master-0 kubenswrapper[8606]: I1204 22:17:12.790640 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3099f7b5-f904-4d15-aedb-f4e558b813e4-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:12.794121 master-0 kubenswrapper[8606]: I1204 22:17:12.794087 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3099f7b5-f904-4d15-aedb-f4e558b813e4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3099f7b5-f904-4d15-aedb-f4e558b813e4" (UID: "3099f7b5-f904-4d15-aedb-f4e558b813e4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:17:12.892318 master-0 kubenswrapper[8606]: I1204 22:17:12.892222 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3099f7b5-f904-4d15-aedb-f4e558b813e4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:12.945203 master-0 kubenswrapper[8606]: I1204 22:17:12.945097 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:12.945203 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:12.945203 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:12.945203 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:12.945662 master-0 kubenswrapper[8606]: I1204 22:17:12.945201 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:13.250419 master-0 kubenswrapper[8606]: I1204 22:17:13.250242 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"3099f7b5-f904-4d15-aedb-f4e558b813e4","Type":"ContainerDied","Data":"bc39063be03324c773b296bb527536f85c03a71f7444ce95b585b37a77beb76b"} Dec 04 22:17:13.250419 master-0 kubenswrapper[8606]: I1204 22:17:13.250302 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc39063be03324c773b296bb527536f85c03a71f7444ce95b585b37a77beb76b" Dec 04 22:17:13.250419 master-0 kubenswrapper[8606]: I1204 22:17:13.250341 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 04 22:17:13.945035 master-0 kubenswrapper[8606]: I1204 22:17:13.944898 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:13.945035 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:13.945035 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:13.945035 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:13.945475 master-0 kubenswrapper[8606]: I1204 22:17:13.945043 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:14.944814 master-0 kubenswrapper[8606]: I1204 22:17:14.944685 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:14.944814 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:14.944814 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:14.944814 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:14.945946 master-0 kubenswrapper[8606]: I1204 22:17:14.944816 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:15.944380 master-0 kubenswrapper[8606]: I1204 22:17:15.944240 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:15.944380 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:15.944380 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:15.944380 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:15.944380 master-0 kubenswrapper[8606]: I1204 22:17:15.944370 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:16.944867 master-0 kubenswrapper[8606]: I1204 22:17:16.944764 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:16.944867 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:16.944867 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:16.944867 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:16.946154 master-0 kubenswrapper[8606]: I1204 22:17:16.944882 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:17.944054 master-0 kubenswrapper[8606]: I1204 22:17:17.943928 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:17.944054 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:17.944054 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:17.944054 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:17.944054 master-0 kubenswrapper[8606]: I1204 22:17:17.944008 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:18.944742 master-0 kubenswrapper[8606]: I1204 22:17:18.944632 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:18.944742 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:18.944742 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:18.944742 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:18.944742 master-0 kubenswrapper[8606]: I1204 22:17:18.944732 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:19.944863 master-0 kubenswrapper[8606]: I1204 22:17:19.944701 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:19.944863 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:19.944863 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:19.944863 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:19.944863 master-0 kubenswrapper[8606]: I1204 22:17:19.944865 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:21.090035 master-0 kubenswrapper[8606]: I1204 22:17:21.089918 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:21.090035 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:21.090035 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:21.090035 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:21.090981 master-0 kubenswrapper[8606]: I1204 22:17:21.090053 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" 
podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:21.457526 master-0 kubenswrapper[8606]: I1204 22:17:21.457300 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zx64w"] Dec 04 22:17:21.457874 master-0 kubenswrapper[8606]: E1204 22:17:21.457690 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3099f7b5-f904-4d15-aedb-f4e558b813e4" containerName="installer" Dec 04 22:17:21.457874 master-0 kubenswrapper[8606]: I1204 22:17:21.457708 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="3099f7b5-f904-4d15-aedb-f4e558b813e4" containerName="installer" Dec 04 22:17:21.457874 master-0 kubenswrapper[8606]: I1204 22:17:21.457874 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="3099f7b5-f904-4d15-aedb-f4e558b813e4" containerName="installer" Dec 04 22:17:21.458563 master-0 kubenswrapper[8606]: I1204 22:17:21.458464 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.460790 master-0 kubenswrapper[8606]: I1204 22:17:21.460705 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 04 22:17:21.464615 master-0 kubenswrapper[8606]: I1204 22:17:21.464552 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-smdpr" Dec 04 22:17:21.595465 master-0 kubenswrapper[8606]: I1204 22:17:21.595357 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20e042ef-169e-4928-a98d-236282fe83a5-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.595846 master-0 kubenswrapper[8606]: I1204 22:17:21.595495 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20e042ef-169e-4928-a98d-236282fe83a5-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.595846 master-0 kubenswrapper[8606]: I1204 22:17:21.595550 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6znn\" (UniqueName: \"kubernetes.io/projected/20e042ef-169e-4928-a98d-236282fe83a5-kube-api-access-z6znn\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.595846 master-0 kubenswrapper[8606]: I1204 22:17:21.595589 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/20e042ef-169e-4928-a98d-236282fe83a5-ready\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.697036 master-0 kubenswrapper[8606]: I1204 22:17:21.696961 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20e042ef-169e-4928-a98d-236282fe83a5-cni-sysctl-allowlist\") pod 
\"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.697363 master-0 kubenswrapper[8606]: I1204 22:17:21.697312 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20e042ef-169e-4928-a98d-236282fe83a5-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.697446 master-0 kubenswrapper[8606]: I1204 22:17:21.697379 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6znn\" (UniqueName: \"kubernetes.io/projected/20e042ef-169e-4928-a98d-236282fe83a5-kube-api-access-z6znn\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.697549 master-0 kubenswrapper[8606]: I1204 22:17:21.697441 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/20e042ef-169e-4928-a98d-236282fe83a5-ready\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.697549 master-0 kubenswrapper[8606]: I1204 22:17:21.697530 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20e042ef-169e-4928-a98d-236282fe83a5-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.698385 master-0 kubenswrapper[8606]: I1204 22:17:21.698321 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/20e042ef-169e-4928-a98d-236282fe83a5-ready\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.698385 master-0 kubenswrapper[8606]: I1204 22:17:21.698348 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20e042ef-169e-4928-a98d-236282fe83a5-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.715273 master-0 kubenswrapper[8606]: I1204 22:17:21.715137 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6znn\" (UniqueName: \"kubernetes.io/projected/20e042ef-169e-4928-a98d-236282fe83a5-kube-api-access-z6znn\") pod \"cni-sysctl-allowlist-ds-zx64w\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.779534 master-0 kubenswrapper[8606]: I1204 22:17:21.779359 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:21.819032 master-0 kubenswrapper[8606]: W1204 22:17:21.818795 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice/crio-8918274b67ec97aefc6e3c15e824021c73fe5c571b20580a3bbcad9aa8fc5b27 WatchSource:0}: Error finding container 8918274b67ec97aefc6e3c15e824021c73fe5c571b20580a3bbcad9aa8fc5b27: Status 404 returned error can't find the container with id 8918274b67ec97aefc6e3c15e824021c73fe5c571b20580a3bbcad9aa8fc5b27 Dec 04 22:17:21.944639 master-0 kubenswrapper[8606]: I1204 22:17:21.944547 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:21.944639 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:21.944639 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:21.944639 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:21.945061 master-0 kubenswrapper[8606]: I1204 22:17:21.944645 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:22.337625 master-0 kubenswrapper[8606]: I1204 22:17:22.337418 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" event={"ID":"20e042ef-169e-4928-a98d-236282fe83a5","Type":"ContainerStarted","Data":"ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af"} Dec 04 22:17:22.337625 master-0 kubenswrapper[8606]: I1204 22:17:22.337533 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" event={"ID":"20e042ef-169e-4928-a98d-236282fe83a5","Type":"ContainerStarted","Data":"8918274b67ec97aefc6e3c15e824021c73fe5c571b20580a3bbcad9aa8fc5b27"} Dec 04 22:17:22.338586 master-0 kubenswrapper[8606]: I1204 22:17:22.337863 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:22.362210 master-0 kubenswrapper[8606]: I1204 22:17:22.362076 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" podStartSLOduration=1.3620525620000001 podStartE2EDuration="1.362052562s" podCreationTimestamp="2025-12-04 22:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:17:22.360336145 +0000 UTC m=+1007.170638370" watchObservedRunningTime="2025-12-04 22:17:22.362052562 +0000 UTC m=+1007.172354787" Dec 04 22:17:22.944848 master-0 kubenswrapper[8606]: I1204 22:17:22.944732 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:22.944848 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:22.944848 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:22.944848 master-0 kubenswrapper[8606]: healthz check failed Dec 04 
22:17:22.945419 master-0 kubenswrapper[8606]: I1204 22:17:22.944848 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:23.402559 master-0 kubenswrapper[8606]: I1204 22:17:23.402479 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:23.946359 master-0 kubenswrapper[8606]: I1204 22:17:23.945726 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:23.946359 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:23.946359 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:23.946359 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:23.946359 master-0 kubenswrapper[8606]: I1204 22:17:23.945976 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:24.444685 master-0 kubenswrapper[8606]: I1204 22:17:24.444579 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zx64w"] Dec 04 22:17:24.943730 master-0 kubenswrapper[8606]: I1204 22:17:24.943649 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:24.943730 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:24.943730 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:24.943730 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:24.943730 master-0 kubenswrapper[8606]: I1204 22:17:24.943738 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:25.378991 master-0 kubenswrapper[8606]: I1204 22:17:25.378862 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" podUID="20e042ef-169e-4928-a98d-236282fe83a5" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" gracePeriod=30 Dec 04 22:17:25.944538 master-0 kubenswrapper[8606]: I1204 22:17:25.944383 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:25.944538 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:25.944538 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:25.944538 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:25.945685 master-0 kubenswrapper[8606]: I1204 22:17:25.944547 8606 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:26.945173 master-0 kubenswrapper[8606]: I1204 22:17:26.945081 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:26.945173 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:26.945173 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:26.945173 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:26.945173 master-0 kubenswrapper[8606]: I1204 22:17:26.945169 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:27.944665 master-0 kubenswrapper[8606]: I1204 22:17:27.944571 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:27.944665 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:27.944665 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:27.944665 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:27.946295 master-0 kubenswrapper[8606]: I1204 22:17:27.944701 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:28.945536 master-0 kubenswrapper[8606]: I1204 22:17:28.945408 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:28.945536 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:28.945536 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:28.945536 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:28.945536 master-0 kubenswrapper[8606]: I1204 22:17:28.945518 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:29.944582 master-0 kubenswrapper[8606]: I1204 22:17:29.944447 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:29.944582 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:29.944582 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:29.944582 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:29.945069 master-0 kubenswrapper[8606]: I1204 22:17:29.944604 8606 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:30.944703 master-0 kubenswrapper[8606]: I1204 22:17:30.944596 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:30.944703 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:30.944703 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:30.944703 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:30.945724 master-0 kubenswrapper[8606]: I1204 22:17:30.944705 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:31.022474 master-0 kubenswrapper[8606]: I1204 22:17:31.022367 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8dbbb5754-c9fx2"] Dec 04 22:17:31.024052 master-0 kubenswrapper[8606]: I1204 22:17:31.024001 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:17:31.027485 master-0 kubenswrapper[8606]: I1204 22:17:31.027442 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-496f9" Dec 04 22:17:31.045035 master-0 kubenswrapper[8606]: I1204 22:17:31.044966 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8dbbb5754-c9fx2"] Dec 04 22:17:31.175283 master-0 kubenswrapper[8606]: I1204 22:17:31.175166 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq55v\" (UniqueName: \"kubernetes.io/projected/34cad3de-8f3f-48cd-bd39-8745fad19e65-kube-api-access-tq55v\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:17:31.175283 master-0 kubenswrapper[8606]: I1204 22:17:31.175291 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:17:31.278016 master-0 kubenswrapper[8606]: I1204 22:17:31.277957 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq55v\" (UniqueName: \"kubernetes.io/projected/34cad3de-8f3f-48cd-bd39-8745fad19e65-kube-api-access-tq55v\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:17:31.278016 master-0 kubenswrapper[8606]: I1204 22:17:31.278022 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:17:31.282437 master-0 kubenswrapper[8606]: I1204 22:17:31.282399 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:17:31.300478 master-0 kubenswrapper[8606]: I1204 22:17:31.300424 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq55v\" (UniqueName: \"kubernetes.io/projected/34cad3de-8f3f-48cd-bd39-8745fad19e65-kube-api-access-tq55v\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:17:31.361450 master-0 kubenswrapper[8606]: I1204 22:17:31.361360 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:17:31.784721 master-0 kubenswrapper[8606]: E1204 22:17:31.782948 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:31.785322 master-0 kubenswrapper[8606]: E1204 22:17:31.785224 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:31.788198 master-0 kubenswrapper[8606]: E1204 22:17:31.788093 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:31.788352 master-0 kubenswrapper[8606]: E1204 22:17:31.788213 8606 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" podUID="20e042ef-169e-4928-a98d-236282fe83a5" containerName="kube-multus-additional-cni-plugins" Dec 04 22:17:31.893752 master-0 kubenswrapper[8606]: I1204 22:17:31.893682 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8dbbb5754-c9fx2"] Dec 04 22:17:31.897583 master-0 kubenswrapper[8606]: W1204 22:17:31.897492 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cad3de_8f3f_48cd_bd39_8745fad19e65.slice/crio-8322a14628afc897780b45b61f17a00da7b18029f93a7a74b52c8e380031ff4f WatchSource:0}: Error finding container 
8322a14628afc897780b45b61f17a00da7b18029f93a7a74b52c8e380031ff4f: Status 404 returned error can't find the container with id 8322a14628afc897780b45b61f17a00da7b18029f93a7a74b52c8e380031ff4f Dec 04 22:17:31.954350 master-0 kubenswrapper[8606]: I1204 22:17:31.952725 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:31.954350 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:31.954350 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:31.954350 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:31.954350 master-0 kubenswrapper[8606]: I1204 22:17:31.952837 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:32.441747 master-0 kubenswrapper[8606]: I1204 22:17:32.441573 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" event={"ID":"34cad3de-8f3f-48cd-bd39-8745fad19e65","Type":"ContainerStarted","Data":"bbf9fb5a77c001a00a8ba9089cd2dbff84e9018cac8414c0fa2ee4f2f5ac52a2"} Dec 04 22:17:32.441747 master-0 kubenswrapper[8606]: I1204 22:17:32.441645 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" event={"ID":"34cad3de-8f3f-48cd-bd39-8745fad19e65","Type":"ContainerStarted","Data":"8322a14628afc897780b45b61f17a00da7b18029f93a7a74b52c8e380031ff4f"} Dec 04 22:17:32.944049 master-0 kubenswrapper[8606]: I1204 22:17:32.943959 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:32.944049 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:32.944049 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:32.944049 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:32.944658 master-0 kubenswrapper[8606]: I1204 22:17:32.944051 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:33.459894 master-0 kubenswrapper[8606]: I1204 22:17:33.459786 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" event={"ID":"34cad3de-8f3f-48cd-bd39-8745fad19e65","Type":"ContainerStarted","Data":"522ebc4422c9f26169d8e98928a3c5499603a3d90a45136b87f723bed13e8748"} Dec 04 22:17:33.502678 master-0 kubenswrapper[8606]: I1204 22:17:33.500823 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" podStartSLOduration=3.500792657 podStartE2EDuration="3.500792657s" podCreationTimestamp="2025-12-04 22:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:17:33.496476928 +0000 UTC m=+1018.306779183" 
watchObservedRunningTime="2025-12-04 22:17:33.500792657 +0000 UTC m=+1018.311094902" Dec 04 22:17:33.552459 master-0 kubenswrapper[8606]: I1204 22:17:33.549750 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb"] Dec 04 22:17:33.552459 master-0 kubenswrapper[8606]: I1204 22:17:33.550133 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerName="multus-admission-controller" containerID="cri-o://7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587" gracePeriod=30 Dec 04 22:17:33.552459 master-0 kubenswrapper[8606]: I1204 22:17:33.550792 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerName="kube-rbac-proxy" containerID="cri-o://815be3bb78086065271ecb4d4b9b7c7f847598761d2c9ee58e7b745732e5f4f4" gracePeriod=30 Dec 04 22:17:33.944884 master-0 kubenswrapper[8606]: I1204 22:17:33.944703 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:33.944884 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:33.944884 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:33.944884 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:33.945211 master-0 kubenswrapper[8606]: I1204 22:17:33.944908 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:34.472006 master-0 kubenswrapper[8606]: I1204 22:17:34.471873 8606 generic.go:334] "Generic (PLEG): container finished" podID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerID="815be3bb78086065271ecb4d4b9b7c7f847598761d2c9ee58e7b745732e5f4f4" exitCode=0 Dec 04 22:17:34.473219 master-0 kubenswrapper[8606]: I1204 22:17:34.472028 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" event={"ID":"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf","Type":"ContainerDied","Data":"815be3bb78086065271ecb4d4b9b7c7f847598761d2c9ee58e7b745732e5f4f4"} Dec 04 22:17:34.944123 master-0 kubenswrapper[8606]: I1204 22:17:34.944025 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 04 22:17:34.945882 master-0 kubenswrapper[8606]: I1204 22:17:34.945027 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:34.945882 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:34.945882 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:34.945882 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:34.945882 master-0 kubenswrapper[8606]: I1204 22:17:34.945134 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" 
podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:34.945882 master-0 kubenswrapper[8606]: I1204 22:17:34.945658 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:34.949913 master-0 kubenswrapper[8606]: I1204 22:17:34.949837 8606 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-ckmlf" Dec 04 22:17:34.950561 master-0 kubenswrapper[8606]: I1204 22:17:34.950484 8606 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 22:17:34.971291 master-0 kubenswrapper[8606]: I1204 22:17:34.971211 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 04 22:17:35.071632 master-0 kubenswrapper[8606]: I1204 22:17:35.071567 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.072217 master-0 kubenswrapper[8606]: I1204 22:17:35.072146 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49c55f04-2c89-4439-8171-6a586acc4db2-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.072358 master-0 kubenswrapper[8606]: I1204 22:17:35.072319 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.174229 master-0 kubenswrapper[8606]: I1204 22:17:35.174131 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.174606 master-0 kubenswrapper[8606]: I1204 22:17:35.174275 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.174606 master-0 kubenswrapper[8606]: I1204 22:17:35.174384 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.175127 master-0 kubenswrapper[8606]: I1204 22:17:35.174761 8606 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49c55f04-2c89-4439-8171-6a586acc4db2-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.175263 master-0 kubenswrapper[8606]: I1204 22:17:35.175228 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.199321 master-0 kubenswrapper[8606]: I1204 22:17:35.199187 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49c55f04-2c89-4439-8171-6a586acc4db2-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.288759 master-0 kubenswrapper[8606]: I1204 22:17:35.288647 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:17:35.733906 master-0 kubenswrapper[8606]: I1204 22:17:35.733812 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 04 22:17:35.946073 master-0 kubenswrapper[8606]: I1204 22:17:35.945915 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:35.946073 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:35.946073 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:35.946073 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:35.946073 master-0 kubenswrapper[8606]: I1204 22:17:35.946043 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:36.494446 master-0 kubenswrapper[8606]: I1204 22:17:36.494329 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"49c55f04-2c89-4439-8171-6a586acc4db2","Type":"ContainerStarted","Data":"4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6"} Dec 04 22:17:36.494446 master-0 kubenswrapper[8606]: I1204 22:17:36.494419 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"49c55f04-2c89-4439-8171-6a586acc4db2","Type":"ContainerStarted","Data":"d19066723bc19dac944f997eba7b40ba7eb1046f1ea930eac6393550ed81b491"} Dec 04 22:17:36.522485 master-0 kubenswrapper[8606]: I1204 22:17:36.522395 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=2.522363446 podStartE2EDuration="2.522363446s" podCreationTimestamp="2025-12-04 22:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 22:17:36.522023187 +0000 UTC m=+1021.332325472" watchObservedRunningTime="2025-12-04 22:17:36.522363446 +0000 UTC m=+1021.332665711" Dec 04 22:17:36.944468 master-0 kubenswrapper[8606]: I1204 22:17:36.944372 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:36.944468 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:36.944468 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:36.944468 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:36.945538 master-0 kubenswrapper[8606]: I1204 22:17:36.944485 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:37.944925 master-0 kubenswrapper[8606]: I1204 22:17:37.944450 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:37.944925 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:37.944925 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:37.944925 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:37.944925 master-0 kubenswrapper[8606]: I1204 22:17:37.944595 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:38.944077 master-0 kubenswrapper[8606]: I1204 22:17:38.944007 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:38.944077 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:38.944077 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:38.944077 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:38.944918 master-0 kubenswrapper[8606]: I1204 22:17:38.944098 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:39.944705 master-0 kubenswrapper[8606]: I1204 22:17:39.944595 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:39.944705 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:39.944705 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:39.944705 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:39.945886 master-0 kubenswrapper[8606]: I1204 22:17:39.944728 8606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:40.783071 master-0 kubenswrapper[8606]: I1204 22:17:40.782964 8606 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:17:40.783560 master-0 kubenswrapper[8606]: I1204 22:17:40.783449 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f" gracePeriod=30 Dec 04 22:17:40.783862 master-0 kubenswrapper[8606]: I1204 22:17:40.783592 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671" gracePeriod=30 Dec 04 22:17:40.783985 master-0 kubenswrapper[8606]: I1204 22:17:40.783618 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager" containerID="cri-o://4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa" gracePeriod=30 Dec 04 22:17:40.784079 master-0 kubenswrapper[8606]: I1204 22:17:40.783618 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" containerID="cri-o://318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628" gracePeriod=30 Dec 04 22:17:40.785646 master-0 kubenswrapper[8606]: I1204 22:17:40.785184 8606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:17:40.785759 master-0 kubenswrapper[8606]: E1204 22:17:40.785707 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager" Dec 04 22:17:40.785759 master-0 kubenswrapper[8606]: I1204 22:17:40.785738 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager" Dec 04 22:17:40.785897 master-0 kubenswrapper[8606]: E1204 22:17:40.785775 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-recovery-controller" Dec 04 22:17:40.785897 master-0 kubenswrapper[8606]: I1204 22:17:40.785792 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-recovery-controller" Dec 04 22:17:40.785897 master-0 kubenswrapper[8606]: E1204 22:17:40.785816 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-cert-syncer" Dec 04 22:17:40.785897 master-0 kubenswrapper[8606]: I1204 22:17:40.785833 8606 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-cert-syncer" Dec 04 22:17:40.785897 master-0 kubenswrapper[8606]: E1204 22:17:40.785865 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.785897 master-0 kubenswrapper[8606]: I1204 22:17:40.785880 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.785897 master-0 kubenswrapper[8606]: E1204 22:17:40.785898 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager" Dec 04 22:17:40.786490 master-0 kubenswrapper[8606]: I1204 22:17:40.785916 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager" Dec 04 22:17:40.786490 master-0 kubenswrapper[8606]: E1204 22:17:40.785940 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.786490 master-0 kubenswrapper[8606]: I1204 22:17:40.785956 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.786490 master-0 kubenswrapper[8606]: E1204 22:17:40.785978 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.786490 master-0 kubenswrapper[8606]: I1204 22:17:40.785997 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.786490 master-0 kubenswrapper[8606]: E1204 22:17:40.786042 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-cert-syncer" Dec 04 22:17:40.786490 master-0 kubenswrapper[8606]: I1204 22:17:40.786059 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-cert-syncer" Dec 04 22:17:40.786968 master-0 kubenswrapper[8606]: I1204 22:17:40.786530 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.786968 master-0 kubenswrapper[8606]: I1204 22:17:40.786561 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager" Dec 04 22:17:40.786968 master-0 kubenswrapper[8606]: I1204 22:17:40.786598 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.786968 master-0 kubenswrapper[8606]: I1204 22:17:40.786619 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-cert-syncer" Dec 04 22:17:40.786968 master-0 kubenswrapper[8606]: I1204 22:17:40.786648 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager" Dec 04 22:17:40.786968 master-0 kubenswrapper[8606]: I1204 22:17:40.786680 8606 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-cert-syncer" Dec 04 22:17:40.786968 master-0 kubenswrapper[8606]: I1204 22:17:40.786703 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.786968 master-0 kubenswrapper[8606]: I1204 22:17:40.786724 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="kube-controller-manager-recovery-controller" Dec 04 22:17:40.787452 master-0 kubenswrapper[8606]: E1204 22:17:40.786990 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.787452 master-0 kubenswrapper[8606]: I1204 22:17:40.787017 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.787452 master-0 kubenswrapper[8606]: E1204 22:17:40.787043 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.787452 master-0 kubenswrapper[8606]: I1204 22:17:40.787060 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.787452 master-0 kubenswrapper[8606]: I1204 22:17:40.787400 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.787452 master-0 kubenswrapper[8606]: I1204 22:17:40.787448 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad55397ac8e23f218f25cb714ea5b2b" containerName="cluster-policy-controller" Dec 04 22:17:40.896644 master-0 kubenswrapper[8606]: I1204 22:17:40.896557 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:40.896986 master-0 kubenswrapper[8606]: I1204 22:17:40.896697 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:40.944690 master-0 kubenswrapper[8606]: I1204 22:17:40.944586 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:40.944690 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:40.944690 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:40.944690 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:40.945939 master-0 kubenswrapper[8606]: I1204 22:17:40.944705 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:40.999155 master-0 kubenswrapper[8606]: I1204 22:17:40.999071 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:40.999334 master-0 kubenswrapper[8606]: I1204 22:17:40.999197 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:40.999334 master-0 kubenswrapper[8606]: I1204 22:17:40.999277 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:40.999334 master-0 kubenswrapper[8606]: I1204 22:17:40.999217 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:41.066966 master-0 kubenswrapper[8606]: I1204 22:17:41.066785 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager-cert-syncer/1.log" Dec 04 22:17:41.068990 master-0 kubenswrapper[8606]: I1204 22:17:41.068921 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/3.log" Dec 04 22:17:41.072338 master-0 kubenswrapper[8606]: I1204 22:17:41.072165 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager-cert-syncer/0.log" Dec 04 22:17:41.073871 master-0 kubenswrapper[8606]: I1204 22:17:41.073772 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:17:41.074110 master-0 kubenswrapper[8606]: I1204 22:17:41.073964 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:41.083311 master-0 kubenswrapper[8606]: I1204 22:17:41.083027 8606 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="fad55397ac8e23f218f25cb714ea5b2b" podUID="5859424d8ea4459c5b854f1ae5fd942c" Dec 04 22:17:41.101254 master-0 kubenswrapper[8606]: I1204 22:17:41.101148 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-resource-dir\") pod \"fad55397ac8e23f218f25cb714ea5b2b\" (UID: \"fad55397ac8e23f218f25cb714ea5b2b\") " Dec 04 22:17:41.101594 master-0 kubenswrapper[8606]: I1204 22:17:41.101300 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-cert-dir\") pod \"fad55397ac8e23f218f25cb714ea5b2b\" (UID: \"fad55397ac8e23f218f25cb714ea5b2b\") " Dec 04 22:17:41.101594 master-0 kubenswrapper[8606]: I1204 22:17:41.101369 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "fad55397ac8e23f218f25cb714ea5b2b" (UID: "fad55397ac8e23f218f25cb714ea5b2b"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:41.101594 master-0 kubenswrapper[8606]: I1204 22:17:41.101572 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "fad55397ac8e23f218f25cb714ea5b2b" (UID: "fad55397ac8e23f218f25cb714ea5b2b"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:41.102032 master-0 kubenswrapper[8606]: I1204 22:17:41.101978 8606 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:41.102032 master-0 kubenswrapper[8606]: I1204 22:17:41.102019 8606 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fad55397ac8e23f218f25cb714ea5b2b-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:41.345520 master-0 kubenswrapper[8606]: I1204 22:17:41.345303 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 04 22:17:41.345786 master-0 kubenswrapper[8606]: I1204 22:17:41.345660 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podUID="49c55f04-2c89-4439-8171-6a586acc4db2" containerName="installer" containerID="cri-o://4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6" gracePeriod=30 Dec 04 22:17:41.404339 master-0 kubenswrapper[8606]: I1204 22:17:41.404255 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fad55397ac8e23f218f25cb714ea5b2b" path="/var/lib/kubelet/pods/fad55397ac8e23f218f25cb714ea5b2b/volumes" Dec 04 22:17:41.549191 master-0 kubenswrapper[8606]: I1204 22:17:41.549105 8606 generic.go:334] "Generic (PLEG): container finished" podID="986a4de7-3a54-48dc-9599-49cf19ba0ad5" containerID="42c592fcd97dd09de62f2c07d511eb1f7fedb875ba01c50a76d8a639e15849ae" exitCode=0 Dec 04 22:17:41.549596 master-0 kubenswrapper[8606]: I1204 22:17:41.549238 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"986a4de7-3a54-48dc-9599-49cf19ba0ad5","Type":"ContainerDied","Data":"42c592fcd97dd09de62f2c07d511eb1f7fedb875ba01c50a76d8a639e15849ae"} Dec 04 22:17:41.553273 master-0 kubenswrapper[8606]: I1204 22:17:41.553220 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager-cert-syncer/1.log" Dec 04 22:17:41.553920 master-0 kubenswrapper[8606]: I1204 22:17:41.553883 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/cluster-policy-controller/3.log" Dec 04 22:17:41.556981 master-0 kubenswrapper[8606]: I1204 22:17:41.556906 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager-cert-syncer/0.log" Dec 04 22:17:41.557783 master-0 kubenswrapper[8606]: I1204 22:17:41.557731 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/0.log" Dec 04 22:17:41.557905 master-0 kubenswrapper[8606]: I1204 22:17:41.557812 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628" exitCode=0 Dec 04 22:17:41.557905 master-0 kubenswrapper[8606]: I1204 22:17:41.557845 8606 generic.go:334] "Generic (PLEG): container finished" 
podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671" exitCode=2 Dec 04 22:17:41.557905 master-0 kubenswrapper[8606]: I1204 22:17:41.557859 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa" exitCode=0 Dec 04 22:17:41.557905 master-0 kubenswrapper[8606]: I1204 22:17:41.557875 8606 generic.go:334] "Generic (PLEG): container finished" podID="fad55397ac8e23f218f25cb714ea5b2b" containerID="7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f" exitCode=0 Dec 04 22:17:41.558177 master-0 kubenswrapper[8606]: I1204 22:17:41.557953 8606 scope.go:117] "RemoveContainer" containerID="318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628" Dec 04 22:17:41.558177 master-0 kubenswrapper[8606]: I1204 22:17:41.558060 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:41.582110 master-0 kubenswrapper[8606]: I1204 22:17:41.581990 8606 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="fad55397ac8e23f218f25cb714ea5b2b" podUID="5859424d8ea4459c5b854f1ae5fd942c" Dec 04 22:17:41.593611 master-0 kubenswrapper[8606]: I1204 22:17:41.593546 8606 scope.go:117] "RemoveContainer" containerID="51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671" Dec 04 22:17:41.619832 master-0 kubenswrapper[8606]: I1204 22:17:41.619760 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:17:41.639959 master-0 kubenswrapper[8606]: I1204 22:17:41.639897 8606 scope.go:117] "RemoveContainer" containerID="4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa" Dec 04 22:17:41.657726 master-0 kubenswrapper[8606]: I1204 22:17:41.657671 8606 scope.go:117] "RemoveContainer" containerID="7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f" Dec 04 22:17:41.675524 master-0 kubenswrapper[8606]: I1204 22:17:41.675439 8606 scope.go:117] "RemoveContainer" containerID="f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7" Dec 04 22:17:41.694561 master-0 kubenswrapper[8606]: I1204 22:17:41.694476 8606 scope.go:117] "RemoveContainer" containerID="dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5" Dec 04 22:17:41.725869 master-0 kubenswrapper[8606]: I1204 22:17:41.725818 8606 scope.go:117] "RemoveContainer" containerID="318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628" Dec 04 22:17:41.726385 master-0 kubenswrapper[8606]: E1204 22:17:41.726330 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": container with ID starting with 318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628 not found: ID does not exist" containerID="318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628" Dec 04 22:17:41.726465 master-0 kubenswrapper[8606]: I1204 22:17:41.726383 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628"} err="failed to get container status 
\"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": rpc error: code = NotFound desc = could not find container \"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": container with ID starting with 318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628 not found: ID does not exist" Dec 04 22:17:41.726465 master-0 kubenswrapper[8606]: I1204 22:17:41.726421 8606 scope.go:117] "RemoveContainer" containerID="51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671" Dec 04 22:17:41.727261 master-0 kubenswrapper[8606]: E1204 22:17:41.727202 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": container with ID starting with 51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671 not found: ID does not exist" containerID="51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671" Dec 04 22:17:41.727343 master-0 kubenswrapper[8606]: I1204 22:17:41.727258 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671"} err="failed to get container status \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": rpc error: code = NotFound desc = could not find container \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": container with ID starting with 51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671 not found: ID does not exist" Dec 04 22:17:41.727343 master-0 kubenswrapper[8606]: I1204 22:17:41.727293 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:17:41.727791 master-0 kubenswrapper[8606]: E1204 22:17:41.727741 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": container with ID starting with 7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75 not found: ID does not exist" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:17:41.727880 master-0 kubenswrapper[8606]: I1204 22:17:41.727786 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75"} err="failed to get container status \"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": rpc error: code = NotFound desc = could not find container \"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": container with ID starting with 7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75 not found: ID does not exist" Dec 04 22:17:41.727880 master-0 kubenswrapper[8606]: I1204 22:17:41.727817 8606 scope.go:117] "RemoveContainer" containerID="4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa" Dec 04 22:17:41.728159 master-0 kubenswrapper[8606]: E1204 22:17:41.728117 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": container with ID starting with 4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa not found: ID does not exist" containerID="4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa" Dec 04 
22:17:41.728159 master-0 kubenswrapper[8606]: I1204 22:17:41.728147 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa"} err="failed to get container status \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": rpc error: code = NotFound desc = could not find container \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": container with ID starting with 4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa not found: ID does not exist" Dec 04 22:17:41.728296 master-0 kubenswrapper[8606]: I1204 22:17:41.728164 8606 scope.go:117] "RemoveContainer" containerID="7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f" Dec 04 22:17:41.728539 master-0 kubenswrapper[8606]: E1204 22:17:41.728467 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": container with ID starting with 7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f not found: ID does not exist" containerID="7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f" Dec 04 22:17:41.728539 master-0 kubenswrapper[8606]: I1204 22:17:41.728521 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f"} err="failed to get container status \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": rpc error: code = NotFound desc = could not find container \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": container with ID starting with 7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f not found: ID does not exist" Dec 04 22:17:41.728691 master-0 kubenswrapper[8606]: I1204 22:17:41.728543 8606 scope.go:117] "RemoveContainer" containerID="f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7" Dec 04 22:17:41.728917 master-0 kubenswrapper[8606]: E1204 22:17:41.728865 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": container with ID starting with f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7 not found: ID does not exist" containerID="f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7" Dec 04 22:17:41.728987 master-0 kubenswrapper[8606]: I1204 22:17:41.728910 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7"} err="failed to get container status \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": rpc error: code = NotFound desc = could not find container \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": container with ID starting with f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7 not found: ID does not exist" Dec 04 22:17:41.728987 master-0 kubenswrapper[8606]: I1204 22:17:41.728936 8606 scope.go:117] "RemoveContainer" containerID="dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5" Dec 04 22:17:41.729457 master-0 kubenswrapper[8606]: E1204 22:17:41.729414 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": container with ID starting with dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5 not found: ID does not exist" containerID="dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5" Dec 04 22:17:41.729457 master-0 kubenswrapper[8606]: I1204 22:17:41.729443 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5"} err="failed to get container status \"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": rpc error: code = NotFound desc = could not find container \"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": container with ID starting with dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5 not found: ID does not exist" Dec 04 22:17:41.729457 master-0 kubenswrapper[8606]: I1204 22:17:41.729459 8606 scope.go:117] "RemoveContainer" containerID="318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628" Dec 04 22:17:41.732394 master-0 kubenswrapper[8606]: I1204 22:17:41.732310 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628"} err="failed to get container status \"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": rpc error: code = NotFound desc = could not find container \"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": container with ID starting with 318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628 not found: ID does not exist" Dec 04 22:17:41.732394 master-0 kubenswrapper[8606]: I1204 22:17:41.732387 8606 scope.go:117] "RemoveContainer" containerID="51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671" Dec 04 22:17:41.732984 master-0 kubenswrapper[8606]: I1204 22:17:41.732894 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671"} err="failed to get container status \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": rpc error: code = NotFound desc = could not find container \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": container with ID starting with 51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671 not found: ID does not exist" Dec 04 22:17:41.732984 master-0 kubenswrapper[8606]: I1204 22:17:41.732926 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:17:41.733425 master-0 kubenswrapper[8606]: I1204 22:17:41.733351 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75"} err="failed to get container status \"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": rpc error: code = NotFound desc = could not find container \"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": container with ID starting with 7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75 not found: ID does not exist" Dec 04 22:17:41.733425 master-0 kubenswrapper[8606]: I1204 22:17:41.733411 8606 scope.go:117] "RemoveContainer" containerID="4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa" Dec 04 22:17:41.734018 master-0 kubenswrapper[8606]: I1204 22:17:41.733967 8606 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa"} err="failed to get container status \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": rpc error: code = NotFound desc = could not find container \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": container with ID starting with 4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa not found: ID does not exist" Dec 04 22:17:41.734018 master-0 kubenswrapper[8606]: I1204 22:17:41.733994 8606 scope.go:117] "RemoveContainer" containerID="7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f" Dec 04 22:17:41.734346 master-0 kubenswrapper[8606]: I1204 22:17:41.734290 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f"} err="failed to get container status \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": rpc error: code = NotFound desc = could not find container \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": container with ID starting with 7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f not found: ID does not exist" Dec 04 22:17:41.734346 master-0 kubenswrapper[8606]: I1204 22:17:41.734332 8606 scope.go:117] "RemoveContainer" containerID="f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7" Dec 04 22:17:41.734662 master-0 kubenswrapper[8606]: I1204 22:17:41.734607 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7"} err="failed to get container status \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": rpc error: code = NotFound desc = could not find container \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": container with ID starting with f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7 not found: ID does not exist" Dec 04 22:17:41.734662 master-0 kubenswrapper[8606]: I1204 22:17:41.734646 8606 scope.go:117] "RemoveContainer" containerID="dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5" Dec 04 22:17:41.735007 master-0 kubenswrapper[8606]: I1204 22:17:41.734965 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5"} err="failed to get container status \"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": rpc error: code = NotFound desc = could not find container \"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": container with ID starting with dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5 not found: ID does not exist" Dec 04 22:17:41.735007 master-0 kubenswrapper[8606]: I1204 22:17:41.734989 8606 scope.go:117] "RemoveContainer" containerID="318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628" Dec 04 22:17:41.735296 master-0 kubenswrapper[8606]: I1204 22:17:41.735253 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628"} err="failed to get container status \"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": rpc error: code = NotFound desc = could not find container 
\"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": container with ID starting with 318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628 not found: ID does not exist" Dec 04 22:17:41.735296 master-0 kubenswrapper[8606]: I1204 22:17:41.735278 8606 scope.go:117] "RemoveContainer" containerID="51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671" Dec 04 22:17:41.735732 master-0 kubenswrapper[8606]: I1204 22:17:41.735660 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671"} err="failed to get container status \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": rpc error: code = NotFound desc = could not find container \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": container with ID starting with 51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671 not found: ID does not exist" Dec 04 22:17:41.735732 master-0 kubenswrapper[8606]: I1204 22:17:41.735724 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:17:41.736460 master-0 kubenswrapper[8606]: I1204 22:17:41.736375 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75"} err="failed to get container status \"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": rpc error: code = NotFound desc = could not find container \"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": container with ID starting with 7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75 not found: ID does not exist" Dec 04 22:17:41.736460 master-0 kubenswrapper[8606]: I1204 22:17:41.736418 8606 scope.go:117] "RemoveContainer" containerID="4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa" Dec 04 22:17:41.737022 master-0 kubenswrapper[8606]: I1204 22:17:41.736958 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa"} err="failed to get container status \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": rpc error: code = NotFound desc = could not find container \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": container with ID starting with 4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa not found: ID does not exist" Dec 04 22:17:41.737022 master-0 kubenswrapper[8606]: I1204 22:17:41.737008 8606 scope.go:117] "RemoveContainer" containerID="7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f" Dec 04 22:17:41.737371 master-0 kubenswrapper[8606]: I1204 22:17:41.737332 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f"} err="failed to get container status \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": rpc error: code = NotFound desc = could not find container \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": container with ID starting with 7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f not found: ID does not exist" Dec 04 22:17:41.737371 master-0 kubenswrapper[8606]: I1204 22:17:41.737356 8606 scope.go:117] "RemoveContainer" 
containerID="f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7" Dec 04 22:17:41.737749 master-0 kubenswrapper[8606]: I1204 22:17:41.737695 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7"} err="failed to get container status \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": rpc error: code = NotFound desc = could not find container \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": container with ID starting with f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7 not found: ID does not exist" Dec 04 22:17:41.737749 master-0 kubenswrapper[8606]: I1204 22:17:41.737736 8606 scope.go:117] "RemoveContainer" containerID="dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5" Dec 04 22:17:41.738309 master-0 kubenswrapper[8606]: I1204 22:17:41.738257 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5"} err="failed to get container status \"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": rpc error: code = NotFound desc = could not find container \"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": container with ID starting with dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5 not found: ID does not exist" Dec 04 22:17:41.738309 master-0 kubenswrapper[8606]: I1204 22:17:41.738296 8606 scope.go:117] "RemoveContainer" containerID="318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628" Dec 04 22:17:41.738739 master-0 kubenswrapper[8606]: I1204 22:17:41.738675 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628"} err="failed to get container status \"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": rpc error: code = NotFound desc = could not find container \"318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628\": container with ID starting with 318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628 not found: ID does not exist" Dec 04 22:17:41.738739 master-0 kubenswrapper[8606]: I1204 22:17:41.738732 8606 scope.go:117] "RemoveContainer" containerID="51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671" Dec 04 22:17:41.739211 master-0 kubenswrapper[8606]: I1204 22:17:41.739159 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671"} err="failed to get container status \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": rpc error: code = NotFound desc = could not find container \"51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671\": container with ID starting with 51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671 not found: ID does not exist" Dec 04 22:17:41.739211 master-0 kubenswrapper[8606]: I1204 22:17:41.739198 8606 scope.go:117] "RemoveContainer" containerID="7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75" Dec 04 22:17:41.739820 master-0 kubenswrapper[8606]: I1204 22:17:41.739718 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75"} err="failed to get container status 
\"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": rpc error: code = NotFound desc = could not find container \"7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75\": container with ID starting with 7899c7c998937f57e4c4fb9c72b85ef7142c758e70918fd85b5266b1c68ffd75 not found: ID does not exist" Dec 04 22:17:41.739913 master-0 kubenswrapper[8606]: I1204 22:17:41.739816 8606 scope.go:117] "RemoveContainer" containerID="4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa" Dec 04 22:17:41.740327 master-0 kubenswrapper[8606]: I1204 22:17:41.740264 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa"} err="failed to get container status \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": rpc error: code = NotFound desc = could not find container \"4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa\": container with ID starting with 4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa not found: ID does not exist" Dec 04 22:17:41.740327 master-0 kubenswrapper[8606]: I1204 22:17:41.740308 8606 scope.go:117] "RemoveContainer" containerID="7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f" Dec 04 22:17:41.740771 master-0 kubenswrapper[8606]: I1204 22:17:41.740715 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f"} err="failed to get container status \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": rpc error: code = NotFound desc = could not find container \"7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f\": container with ID starting with 7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f not found: ID does not exist" Dec 04 22:17:41.740771 master-0 kubenswrapper[8606]: I1204 22:17:41.740753 8606 scope.go:117] "RemoveContainer" containerID="f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7" Dec 04 22:17:41.741143 master-0 kubenswrapper[8606]: I1204 22:17:41.741086 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7"} err="failed to get container status \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": rpc error: code = NotFound desc = could not find container \"f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7\": container with ID starting with f2fe770cd51459e5d5291fb85d0ada58b5f7c6e4801ba62d59e5cc22d426fed7 not found: ID does not exist" Dec 04 22:17:41.741143 master-0 kubenswrapper[8606]: I1204 22:17:41.741130 8606 scope.go:117] "RemoveContainer" containerID="dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5" Dec 04 22:17:41.741557 master-0 kubenswrapper[8606]: I1204 22:17:41.741477 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5"} err="failed to get container status \"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": rpc error: code = NotFound desc = could not find container \"dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5\": container with ID starting with dcdf1fc380843af39a658804578812c3c889c53ed6646984fb0eb3f7f086eec5 not found: ID does not exist" Dec 04 22:17:41.782565 master-0 
kubenswrapper[8606]: E1204 22:17:41.782423 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:41.784277 master-0 kubenswrapper[8606]: E1204 22:17:41.784187 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:41.786393 master-0 kubenswrapper[8606]: E1204 22:17:41.786315 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:41.786552 master-0 kubenswrapper[8606]: E1204 22:17:41.786392 8606 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" podUID="20e042ef-169e-4928-a98d-236282fe83a5" containerName="kube-multus-additional-cni-plugins" Dec 04 22:17:41.944181 master-0 kubenswrapper[8606]: I1204 22:17:41.943989 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:41.944181 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:41.944181 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:41.944181 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:41.944181 master-0 kubenswrapper[8606]: I1204 22:17:41.944101 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:42.945191 master-0 kubenswrapper[8606]: I1204 22:17:42.944948 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:42.945191 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:42.945191 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:42.945191 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:42.945191 master-0 kubenswrapper[8606]: I1204 22:17:42.945074 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:43.054920 master-0 kubenswrapper[8606]: I1204 22:17:43.054843 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:43.140621 master-0 kubenswrapper[8606]: I1204 22:17:43.140522 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kubelet-dir\") pod \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " Dec 04 22:17:43.140621 master-0 kubenswrapper[8606]: I1204 22:17:43.140599 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-var-lock\") pod \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " Dec 04 22:17:43.140621 master-0 kubenswrapper[8606]: I1204 22:17:43.140641 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kube-api-access\") pod \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\" (UID: \"986a4de7-3a54-48dc-9599-49cf19ba0ad5\") " Dec 04 22:17:43.141068 master-0 kubenswrapper[8606]: I1204 22:17:43.140732 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "986a4de7-3a54-48dc-9599-49cf19ba0ad5" (UID: "986a4de7-3a54-48dc-9599-49cf19ba0ad5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:43.141068 master-0 kubenswrapper[8606]: I1204 22:17:43.140840 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-var-lock" (OuterVolumeSpecName: "var-lock") pod "986a4de7-3a54-48dc-9599-49cf19ba0ad5" (UID: "986a4de7-3a54-48dc-9599-49cf19ba0ad5"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:43.141341 master-0 kubenswrapper[8606]: I1204 22:17:43.141274 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:43.141434 master-0 kubenswrapper[8606]: I1204 22:17:43.141337 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/986a4de7-3a54-48dc-9599-49cf19ba0ad5-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:43.146291 master-0 kubenswrapper[8606]: I1204 22:17:43.146232 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "986a4de7-3a54-48dc-9599-49cf19ba0ad5" (UID: "986a4de7-3a54-48dc-9599-49cf19ba0ad5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:17:43.242241 master-0 kubenswrapper[8606]: I1204 22:17:43.242099 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/986a4de7-3a54-48dc-9599-49cf19ba0ad5-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:43.584297 master-0 kubenswrapper[8606]: I1204 22:17:43.584214 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"986a4de7-3a54-48dc-9599-49cf19ba0ad5","Type":"ContainerDied","Data":"1fee06d19aff5ef21eb8427a31bd857aa51bbbcb2fe5924a93729689e0a74832"} Dec 04 22:17:43.584297 master-0 kubenswrapper[8606]: I1204 22:17:43.584296 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fee06d19aff5ef21eb8427a31bd857aa51bbbcb2fe5924a93729689e0a74832" Dec 04 22:17:43.584819 master-0 kubenswrapper[8606]: I1204 22:17:43.584772 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:17:43.740069 master-0 kubenswrapper[8606]: I1204 22:17:43.740020 8606 scope.go:117] "RemoveContainer" containerID="b2de34afcf16d55af0ab629ff305e02bc4e8c470038e92112248dabc18c8bf30" Dec 04 22:17:43.944778 master-0 kubenswrapper[8606]: I1204 22:17:43.944599 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:43.944778 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:43.944778 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:43.944778 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:43.944778 master-0 kubenswrapper[8606]: I1204 22:17:43.944733 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:44.945076 master-0 kubenswrapper[8606]: I1204 22:17:44.944977 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:44.945076 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:44.945076 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:44.945076 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:44.946277 master-0 kubenswrapper[8606]: I1204 22:17:44.945117 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:45.942533 master-0 kubenswrapper[8606]: I1204 22:17:45.942311 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 04 22:17:45.942892 master-0 kubenswrapper[8606]: E1204 22:17:45.942855 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986a4de7-3a54-48dc-9599-49cf19ba0ad5" containerName="installer" Dec 04 22:17:45.942991 master-0 kubenswrapper[8606]: I1204 
22:17:45.942891 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="986a4de7-3a54-48dc-9599-49cf19ba0ad5" containerName="installer" Dec 04 22:17:45.943165 master-0 kubenswrapper[8606]: I1204 22:17:45.943136 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="986a4de7-3a54-48dc-9599-49cf19ba0ad5" containerName="installer" Dec 04 22:17:45.943908 master-0 kubenswrapper[8606]: I1204 22:17:45.943878 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:45.945388 master-0 kubenswrapper[8606]: I1204 22:17:45.945333 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:45.945388 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:45.945388 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:45.945388 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:45.946293 master-0 kubenswrapper[8606]: I1204 22:17:45.945404 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:45.964308 master-0 kubenswrapper[8606]: I1204 22:17:45.963617 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 04 22:17:45.990229 master-0 kubenswrapper[8606]: I1204 22:17:45.990102 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fd513d-caef-41da-8fa2-f08fca029805-kube-api-access\") pod \"installer-2-master-0\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:45.990491 master-0 kubenswrapper[8606]: I1204 22:17:45.990293 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-var-lock\") pod \"installer-2-master-0\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:45.990491 master-0 kubenswrapper[8606]: I1204 22:17:45.990368 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:46.092037 master-0 kubenswrapper[8606]: I1204 22:17:46.091935 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:46.092344 master-0 kubenswrapper[8606]: I1204 22:17:46.092132 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-kubelet-dir\") pod \"installer-2-master-0\" (UID: 
\"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:46.092344 master-0 kubenswrapper[8606]: I1204 22:17:46.092315 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fd513d-caef-41da-8fa2-f08fca029805-kube-api-access\") pod \"installer-2-master-0\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:46.092611 master-0 kubenswrapper[8606]: I1204 22:17:46.092567 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-var-lock\") pod \"installer-2-master-0\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:46.092611 master-0 kubenswrapper[8606]: I1204 22:17:46.092428 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-var-lock\") pod \"installer-2-master-0\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:46.127937 master-0 kubenswrapper[8606]: I1204 22:17:46.127839 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fd513d-caef-41da-8fa2-f08fca029805-kube-api-access\") pod \"installer-2-master-0\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:46.285169 master-0 kubenswrapper[8606]: I1204 22:17:46.285079 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:46.840140 master-0 kubenswrapper[8606]: I1204 22:17:46.840075 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 04 22:17:46.945541 master-0 kubenswrapper[8606]: I1204 22:17:46.945465 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:46.945541 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:46.945541 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:46.945541 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:46.946326 master-0 kubenswrapper[8606]: I1204 22:17:46.945570 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:47.624059 master-0 kubenswrapper[8606]: I1204 22:17:47.623955 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"37fd513d-caef-41da-8fa2-f08fca029805","Type":"ContainerStarted","Data":"c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9"} Dec 04 22:17:47.624059 master-0 kubenswrapper[8606]: I1204 22:17:47.624043 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"37fd513d-caef-41da-8fa2-f08fca029805","Type":"ContainerStarted","Data":"2cc8aa5e2f3d465e0999004c3c83b7cae125246800f954e25db5ec674a7483b5"} Dec 04 22:17:47.655987 master-0 kubenswrapper[8606]: I1204 22:17:47.655156 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.655118674 podStartE2EDuration="2.655118674s" podCreationTimestamp="2025-12-04 22:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:17:47.650929838 +0000 UTC m=+1032.461232133" watchObservedRunningTime="2025-12-04 22:17:47.655118674 +0000 UTC m=+1032.465420929" Dec 04 22:17:47.945567 master-0 kubenswrapper[8606]: I1204 22:17:47.945338 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:47.945567 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:47.945567 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:47.945567 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:47.945567 master-0 kubenswrapper[8606]: I1204 22:17:47.945445 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:48.944708 master-0 kubenswrapper[8606]: I1204 22:17:48.944606 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:48.944708 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:48.944708 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:48.944708 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:48.945099 master-0 kubenswrapper[8606]: I1204 22:17:48.944770 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:49.945433 master-0 kubenswrapper[8606]: I1204 22:17:49.945370 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:49.945433 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:49.945433 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:49.945433 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:49.946669 master-0 kubenswrapper[8606]: I1204 22:17:49.946614 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:50.945133 master-0 kubenswrapper[8606]: I1204 22:17:50.945010 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:50.945133 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:50.945133 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:50.945133 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:50.945133 master-0 kubenswrapper[8606]: I1204 22:17:50.945117 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:51.783455 master-0 kubenswrapper[8606]: E1204 22:17:51.783340 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:51.786373 master-0 kubenswrapper[8606]: E1204 22:17:51.786250 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:51.788943 master-0 kubenswrapper[8606]: E1204 22:17:51.788859 8606 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 22:17:51.789106 master-0 kubenswrapper[8606]: E1204 22:17:51.788937 8606 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" podUID="20e042ef-169e-4928-a98d-236282fe83a5" containerName="kube-multus-additional-cni-plugins" Dec 04 22:17:51.945653 master-0 kubenswrapper[8606]: I1204 22:17:51.945532 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:51.945653 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:51.945653 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:51.945653 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:51.946693 master-0 kubenswrapper[8606]: I1204 22:17:51.945656 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:52.946231 master-0 kubenswrapper[8606]: I1204 22:17:52.946152 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:52.946231 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:52.946231 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:52.946231 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:52.947584 master-0 kubenswrapper[8606]: I1204 22:17:52.946234 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:53.391229 master-0 kubenswrapper[8606]: I1204 22:17:53.391121 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:53.424168 master-0 kubenswrapper[8606]: I1204 22:17:53.424092 8606 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="4b4a3b35-91d4-4c61-b051-a09da91443a7" Dec 04 22:17:53.424168 master-0 kubenswrapper[8606]: I1204 22:17:53.424153 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="4b4a3b35-91d4-4c61-b051-a09da91443a7" Dec 04 22:17:53.450118 master-0 kubenswrapper[8606]: I1204 22:17:53.450024 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:17:53.451340 master-0 kubenswrapper[8606]: I1204 22:17:53.451273 8606 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:53.464265 master-0 kubenswrapper[8606]: I1204 22:17:53.464160 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:17:53.471986 master-0 kubenswrapper[8606]: I1204 22:17:53.471936 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:17:53.480127 master-0 kubenswrapper[8606]: I1204 22:17:53.480061 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:17:53.684004 master-0 kubenswrapper[8606]: I1204 22:17:53.683926 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"4f2b4d655e0bf0ff49528fd7206e3f4bbadf9c2ae53c24cb531027c7c4811ac5"} Dec 04 22:17:53.949078 master-0 kubenswrapper[8606]: I1204 22:17:53.948890 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:53.949078 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:53.949078 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:53.949078 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:53.949078 master-0 kubenswrapper[8606]: I1204 22:17:53.948996 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:54.699831 master-0 kubenswrapper[8606]: I1204 22:17:54.699746 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"aab39ce7c056462df6f1a5933a3a5e925b99a0bd484dd0b16b296ab5327006ba"} Dec 04 22:17:54.700060 master-0 kubenswrapper[8606]: I1204 22:17:54.699844 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"51dfa6423a699c653fb4188616f00305edb215a14ee4fd1dcde5706013f4ee8d"} Dec 04 22:17:54.700060 master-0 kubenswrapper[8606]: I1204 22:17:54.699877 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"669f49b80171e40aea73e838597bed75920e67751d5f839f6934dbce1fedc710"} Dec 04 22:17:54.943964 master-0 kubenswrapper[8606]: I1204 22:17:54.943765 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:54.943964 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:54.943964 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:54.943964 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:54.943964 master-0 kubenswrapper[8606]: I1204 22:17:54.943886 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:55.543148 master-0 kubenswrapper[8606]: I1204 22:17:55.543097 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-zx64w_20e042ef-169e-4928-a98d-236282fe83a5/kube-multus-additional-cni-plugins/0.log" Dec 04 22:17:55.543705 master-0 kubenswrapper[8606]: I1204 22:17:55.543187 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:55.658839 master-0 kubenswrapper[8606]: I1204 22:17:55.658697 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/20e042ef-169e-4928-a98d-236282fe83a5-ready\") pod \"20e042ef-169e-4928-a98d-236282fe83a5\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " Dec 04 22:17:55.659048 master-0 kubenswrapper[8606]: I1204 22:17:55.658928 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20e042ef-169e-4928-a98d-236282fe83a5-tuning-conf-dir\") pod \"20e042ef-169e-4928-a98d-236282fe83a5\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " Dec 04 22:17:55.659048 master-0 kubenswrapper[8606]: I1204 22:17:55.659018 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20e042ef-169e-4928-a98d-236282fe83a5-cni-sysctl-allowlist\") pod \"20e042ef-169e-4928-a98d-236282fe83a5\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " Dec 04 22:17:55.659140 master-0 kubenswrapper[8606]: I1204 22:17:55.659050 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20e042ef-169e-4928-a98d-236282fe83a5-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "20e042ef-169e-4928-a98d-236282fe83a5" (UID: "20e042ef-169e-4928-a98d-236282fe83a5"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:55.659140 master-0 kubenswrapper[8606]: I1204 22:17:55.659085 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6znn\" (UniqueName: \"kubernetes.io/projected/20e042ef-169e-4928-a98d-236282fe83a5-kube-api-access-z6znn\") pod \"20e042ef-169e-4928-a98d-236282fe83a5\" (UID: \"20e042ef-169e-4928-a98d-236282fe83a5\") " Dec 04 22:17:55.659331 master-0 kubenswrapper[8606]: I1204 22:17:55.659221 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e042ef-169e-4928-a98d-236282fe83a5-ready" (OuterVolumeSpecName: "ready") pod "20e042ef-169e-4928-a98d-236282fe83a5" (UID: "20e042ef-169e-4928-a98d-236282fe83a5"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:17:55.659482 master-0 kubenswrapper[8606]: I1204 22:17:55.659443 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e042ef-169e-4928-a98d-236282fe83a5-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "20e042ef-169e-4928-a98d-236282fe83a5" (UID: "20e042ef-169e-4928-a98d-236282fe83a5"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:17:55.659724 master-0 kubenswrapper[8606]: I1204 22:17:55.659694 8606 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/20e042ef-169e-4928-a98d-236282fe83a5-ready\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:55.659724 master-0 kubenswrapper[8606]: I1204 22:17:55.659715 8606 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20e042ef-169e-4928-a98d-236282fe83a5-tuning-conf-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:55.659862 master-0 kubenswrapper[8606]: I1204 22:17:55.659727 8606 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/20e042ef-169e-4928-a98d-236282fe83a5-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:55.662015 master-0 kubenswrapper[8606]: I1204 22:17:55.661955 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e042ef-169e-4928-a98d-236282fe83a5-kube-api-access-z6znn" (OuterVolumeSpecName: "kube-api-access-z6znn") pod "20e042ef-169e-4928-a98d-236282fe83a5" (UID: "20e042ef-169e-4928-a98d-236282fe83a5"). InnerVolumeSpecName "kube-api-access-z6znn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:17:55.709465 master-0 kubenswrapper[8606]: I1204 22:17:55.709380 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-zx64w_20e042ef-169e-4928-a98d-236282fe83a5/kube-multus-additional-cni-plugins/0.log" Dec 04 22:17:55.709695 master-0 kubenswrapper[8606]: I1204 22:17:55.709476 8606 generic.go:334] "Generic (PLEG): container finished" podID="20e042ef-169e-4928-a98d-236282fe83a5" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" exitCode=137 Dec 04 22:17:55.709695 master-0 kubenswrapper[8606]: I1204 22:17:55.709587 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" event={"ID":"20e042ef-169e-4928-a98d-236282fe83a5","Type":"ContainerDied","Data":"ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af"} Dec 04 22:17:55.709695 master-0 kubenswrapper[8606]: I1204 22:17:55.709622 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" event={"ID":"20e042ef-169e-4928-a98d-236282fe83a5","Type":"ContainerDied","Data":"8918274b67ec97aefc6e3c15e824021c73fe5c571b20580a3bbcad9aa8fc5b27"} Dec 04 22:17:55.709695 master-0 kubenswrapper[8606]: I1204 22:17:55.709639 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-zx64w" Dec 04 22:17:55.709868 master-0 kubenswrapper[8606]: I1204 22:17:55.709652 8606 scope.go:117] "RemoveContainer" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" Dec 04 22:17:55.714285 master-0 kubenswrapper[8606]: I1204 22:17:55.714221 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"f07a7e680c95366ba4f1d333748a37d99d6fcd0c588c7658749adf9e44cb7229"} Dec 04 22:17:55.733716 master-0 kubenswrapper[8606]: I1204 22:17:55.733601 8606 scope.go:117] "RemoveContainer" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" Dec 04 22:17:55.734485 master-0 kubenswrapper[8606]: E1204 22:17:55.734447 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af\": container with ID starting with ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af not found: ID does not exist" containerID="ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af" Dec 04 22:17:55.734593 master-0 kubenswrapper[8606]: I1204 22:17:55.734485 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af"} err="failed to get container status \"ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af\": rpc error: code = NotFound desc = could not find container \"ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af\": container with ID starting with ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af not found: ID does not exist" Dec 04 22:17:55.745003 master-0 kubenswrapper[8606]: I1204 22:17:55.744951 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.74493229 
podStartE2EDuration="2.74493229s" podCreationTimestamp="2025-12-04 22:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:17:55.743563321 +0000 UTC m=+1040.553865546" watchObservedRunningTime="2025-12-04 22:17:55.74493229 +0000 UTC m=+1040.555234515" Dec 04 22:17:55.764422 master-0 kubenswrapper[8606]: I1204 22:17:55.764364 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6znn\" (UniqueName: \"kubernetes.io/projected/20e042ef-169e-4928-a98d-236282fe83a5-kube-api-access-z6znn\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:55.768259 master-0 kubenswrapper[8606]: I1204 22:17:55.767946 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zx64w"] Dec 04 22:17:55.772668 master-0 kubenswrapper[8606]: I1204 22:17:55.772372 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-zx64w"] Dec 04 22:17:55.944358 master-0 kubenswrapper[8606]: I1204 22:17:55.943895 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:55.944358 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:55.944358 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:55.944358 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:55.944358 master-0 kubenswrapper[8606]: I1204 22:17:55.944013 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:56.945542 master-0 kubenswrapper[8606]: I1204 22:17:56.945409 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:56.945542 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:56.945542 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:56.945542 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:56.946739 master-0 kubenswrapper[8606]: I1204 22:17:56.945553 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:57.402927 master-0 kubenswrapper[8606]: I1204 22:17:57.402840 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e042ef-169e-4928-a98d-236282fe83a5" path="/var/lib/kubelet/pods/20e042ef-169e-4928-a98d-236282fe83a5/volumes" Dec 04 22:17:57.943787 master-0 kubenswrapper[8606]: I1204 22:17:57.943708 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:57.943787 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:57.943787 master-0 kubenswrapper[8606]: [+]process-running ok 
Dec 04 22:17:57.943787 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:57.944274 master-0 kubenswrapper[8606]: I1204 22:17:57.943795 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:58.171031 master-0 kubenswrapper[8606]: I1204 22:17:58.170890 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 04 22:17:58.172793 master-0 kubenswrapper[8606]: I1204 22:17:58.171328 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="37fd513d-caef-41da-8fa2-f08fca029805" containerName="installer" containerID="cri-o://c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9" gracePeriod=30 Dec 04 22:17:58.717563 master-0 kubenswrapper[8606]: I1204 22:17:58.717194 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_37fd513d-caef-41da-8fa2-f08fca029805/installer/0.log" Dec 04 22:17:58.717563 master-0 kubenswrapper[8606]: I1204 22:17:58.717294 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:58.762186 master-0 kubenswrapper[8606]: I1204 22:17:58.761716 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fd513d-caef-41da-8fa2-f08fca029805-kube-api-access\") pod \"37fd513d-caef-41da-8fa2-f08fca029805\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " Dec 04 22:17:58.762963 master-0 kubenswrapper[8606]: I1204 22:17:58.762854 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-kubelet-dir\") pod \"37fd513d-caef-41da-8fa2-f08fca029805\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " Dec 04 22:17:58.763088 master-0 kubenswrapper[8606]: I1204 22:17:58.763010 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-var-lock\") pod \"37fd513d-caef-41da-8fa2-f08fca029805\" (UID: \"37fd513d-caef-41da-8fa2-f08fca029805\") " Dec 04 22:17:58.763237 master-0 kubenswrapper[8606]: I1204 22:17:58.762897 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "37fd513d-caef-41da-8fa2-f08fca029805" (UID: "37fd513d-caef-41da-8fa2-f08fca029805"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:58.763444 master-0 kubenswrapper[8606]: I1204 22:17:58.763370 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-var-lock" (OuterVolumeSpecName: "var-lock") pod "37fd513d-caef-41da-8fa2-f08fca029805" (UID: "37fd513d-caef-41da-8fa2-f08fca029805"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:17:58.764539 master-0 kubenswrapper[8606]: I1204 22:17:58.764443 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:58.764539 master-0 kubenswrapper[8606]: I1204 22:17:58.764534 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37fd513d-caef-41da-8fa2-f08fca029805-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:58.793973 master-0 kubenswrapper[8606]: I1204 22:17:58.793365 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37fd513d-caef-41da-8fa2-f08fca029805-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "37fd513d-caef-41da-8fa2-f08fca029805" (UID: "37fd513d-caef-41da-8fa2-f08fca029805"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:17:58.800031 master-0 kubenswrapper[8606]: I1204 22:17:58.799973 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_37fd513d-caef-41da-8fa2-f08fca029805/installer/0.log" Dec 04 22:17:58.800031 master-0 kubenswrapper[8606]: I1204 22:17:58.800233 8606 generic.go:334] "Generic (PLEG): container finished" podID="37fd513d-caef-41da-8fa2-f08fca029805" containerID="c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9" exitCode=1 Dec 04 22:17:58.800447 master-0 kubenswrapper[8606]: I1204 22:17:58.800287 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"37fd513d-caef-41da-8fa2-f08fca029805","Type":"ContainerDied","Data":"c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9"} Dec 04 22:17:58.800447 master-0 kubenswrapper[8606]: I1204 22:17:58.800334 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"37fd513d-caef-41da-8fa2-f08fca029805","Type":"ContainerDied","Data":"2cc8aa5e2f3d465e0999004c3c83b7cae125246800f954e25db5ec674a7483b5"} Dec 04 22:17:58.800447 master-0 kubenswrapper[8606]: I1204 22:17:58.800336 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 04 22:17:58.800744 master-0 kubenswrapper[8606]: I1204 22:17:58.800356 8606 scope.go:117] "RemoveContainer" containerID="c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9" Dec 04 22:17:58.831224 master-0 kubenswrapper[8606]: I1204 22:17:58.831169 8606 scope.go:117] "RemoveContainer" containerID="c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9" Dec 04 22:17:58.831715 master-0 kubenswrapper[8606]: E1204 22:17:58.831661 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9\": container with ID starting with c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9 not found: ID does not exist" containerID="c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9" Dec 04 22:17:58.831812 master-0 kubenswrapper[8606]: I1204 22:17:58.831718 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9"} err="failed to get container status \"c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9\": rpc error: code = NotFound desc = could not find container \"c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9\": container with ID starting with c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9 not found: ID does not exist" Dec 04 22:17:58.856402 master-0 kubenswrapper[8606]: I1204 22:17:58.855770 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 04 22:17:58.862162 master-0 kubenswrapper[8606]: I1204 22:17:58.862115 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 04 22:17:58.865783 master-0 kubenswrapper[8606]: I1204 22:17:58.865728 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37fd513d-caef-41da-8fa2-f08fca029805-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:17:58.944876 master-0 kubenswrapper[8606]: I1204 22:17:58.944713 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:58.944876 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:58.944876 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:58.944876 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:58.944876 master-0 kubenswrapper[8606]: I1204 22:17:58.944814 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:17:59.411926 master-0 kubenswrapper[8606]: I1204 22:17:59.411793 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37fd513d-caef-41da-8fa2-f08fca029805" path="/var/lib/kubelet/pods/37fd513d-caef-41da-8fa2-f08fca029805/volumes" Dec 04 22:17:59.944193 master-0 kubenswrapper[8606]: I1204 22:17:59.944102 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:17:59.944193 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:17:59.944193 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:17:59.944193 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:17:59.944631 master-0 kubenswrapper[8606]: I1204 22:17:59.944220 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:00.688119 master-0 kubenswrapper[8606]: I1204 22:18:00.687934 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:18:00.827049 master-0 kubenswrapper[8606]: I1204 22:18:00.826905 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/5.log" Dec 04 22:18:00.828272 master-0 kubenswrapper[8606]: I1204 22:18:00.828171 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/4.log" Dec 04 22:18:00.829129 master-0 kubenswrapper[8606]: I1204 22:18:00.829064 8606 generic.go:334] "Generic (PLEG): container finished" podID="addddaac-a31a-4dbf-b78f-87225b11b463" containerID="35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e" exitCode=1 Dec 04 22:18:00.829260 master-0 kubenswrapper[8606]: I1204 22:18:00.829131 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerDied","Data":"35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e"} Dec 04 22:18:00.829260 master-0 kubenswrapper[8606]: I1204 22:18:00.829199 8606 scope.go:117] "RemoveContainer" containerID="dc3414f4621b229cda612586441e27b5df397606b90be0a37237d6080487c0a6" Dec 04 22:18:00.830113 master-0 kubenswrapper[8606]: I1204 22:18:00.830052 8606 scope.go:117] "RemoveContainer" containerID="35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e" Dec 04 22:18:00.830635 master-0 kubenswrapper[8606]: E1204 22:18:00.830578 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:18:00.945749 master-0 kubenswrapper[8606]: I1204 22:18:00.945557 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:00.945749 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:00.945749 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:00.945749 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:00.945749 master-0 kubenswrapper[8606]: I1204 22:18:00.945672 8606 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:01.840704 master-0 kubenswrapper[8606]: I1204 22:18:01.840606 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/5.log" Dec 04 22:18:01.948969 master-0 kubenswrapper[8606]: I1204 22:18:01.948868 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:01.948969 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:01.948969 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:01.948969 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:01.949622 master-0 kubenswrapper[8606]: I1204 22:18:01.948991 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:01.956305 master-0 kubenswrapper[8606]: I1204 22:18:01.955867 8606 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 04 22:18:01.956936 master-0 kubenswrapper[8606]: E1204 22:18:01.956880 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37fd513d-caef-41da-8fa2-f08fca029805" containerName="installer" Dec 04 22:18:01.957000 master-0 kubenswrapper[8606]: I1204 22:18:01.956936 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="37fd513d-caef-41da-8fa2-f08fca029805" containerName="installer" Dec 04 22:18:01.957000 master-0 kubenswrapper[8606]: E1204 22:18:01.956984 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e042ef-169e-4928-a98d-236282fe83a5" containerName="kube-multus-additional-cni-plugins" Dec 04 22:18:01.957109 master-0 kubenswrapper[8606]: I1204 22:18:01.957003 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e042ef-169e-4928-a98d-236282fe83a5" containerName="kube-multus-additional-cni-plugins" Dec 04 22:18:01.957578 master-0 kubenswrapper[8606]: I1204 22:18:01.957496 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="37fd513d-caef-41da-8fa2-f08fca029805" containerName="installer" Dec 04 22:18:01.957655 master-0 kubenswrapper[8606]: I1204 22:18:01.957609 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e042ef-169e-4928-a98d-236282fe83a5" containerName="kube-multus-additional-cni-plugins" Dec 04 22:18:01.958439 master-0 kubenswrapper[8606]: I1204 22:18:01.958396 8606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:01.987874 master-0 kubenswrapper[8606]: I1204 22:18:01.987775 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 04 22:18:02.022410 master-0 kubenswrapper[8606]: I1204 22:18:02.022337 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.022873 master-0 kubenswrapper[8606]: I1204 22:18:02.022814 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.023034 master-0 kubenswrapper[8606]: I1204 22:18:02.022982 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.124450 master-0 kubenswrapper[8606]: I1204 22:18:02.124291 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.124711 master-0 kubenswrapper[8606]: I1204 22:18:02.124557 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.124711 master-0 kubenswrapper[8606]: I1204 22:18:02.124612 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.124949 master-0 kubenswrapper[8606]: I1204 22:18:02.124864 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.125009 master-0 kubenswrapper[8606]: I1204 22:18:02.124983 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.151189 master-0 kubenswrapper[8606]: I1204 22:18:02.151138 8606 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.268298 master-0 kubenswrapper[8606]: E1204 22:18:02.268212 8606 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/dc29b93736ce6f134751ecd0ceb0f0d086c542e1fd238eab464d5f405bcbfcda/diff" to get inode usage: stat /var/lib/containers/storage/overlay/dc29b93736ce6f134751ecd0ceb0f0d086c542e1fd238eab464d5f405bcbfcda/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/1.log" to get inode usage: stat /var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_fad55397ac8e23f218f25cb714ea5b2b/kube-controller-manager/1.log: no such file or directory Dec 04 22:18:02.307280 master-0 kubenswrapper[8606]: I1204 22:18:02.307169 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:02.834523 master-0 kubenswrapper[8606]: I1204 22:18:02.834427 8606 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 04 22:18:02.854195 master-0 kubenswrapper[8606]: I1204 22:18:02.854114 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"dbe54b09-0399-4fbe-9f84-dd9dede0ab96","Type":"ContainerStarted","Data":"d377be8d0441b589958c5adc3aad9974e2610bf718707f9842352d0cb595d25f"} Dec 04 22:18:02.945066 master-0 kubenswrapper[8606]: I1204 22:18:02.944988 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:02.945066 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:02.945066 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:02.945066 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:02.945456 master-0 kubenswrapper[8606]: I1204 22:18:02.945084 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:03.473475 master-0 kubenswrapper[8606]: I1204 22:18:03.472532 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:18:03.473475 master-0 kubenswrapper[8606]: I1204 22:18:03.472604 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:18:03.473475 master-0 kubenswrapper[8606]: I1204 22:18:03.472626 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:18:03.473475 master-0 kubenswrapper[8606]: I1204 22:18:03.472636 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:18:03.478828 master-0 kubenswrapper[8606]: I1204 22:18:03.478786 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:18:03.480339 master-0 kubenswrapper[8606]: I1204 22:18:03.480279 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:18:03.867755 master-0 kubenswrapper[8606]: I1204 22:18:03.867645 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"dbe54b09-0399-4fbe-9f84-dd9dede0ab96","Type":"ContainerStarted","Data":"5a4d99a6b7149fd4133c1e3efcfd35582ffcb1582acaa62e903eb008119e1624"} Dec 04 22:18:03.878980 master-0 kubenswrapper[8606]: I1204 22:18:03.878928 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-7dfc5b745f-nk4gb_5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf/multus-admission-controller/0.log" Dec 04 22:18:03.879156 master-0 kubenswrapper[8606]: I1204 22:18:03.879028 8606 generic.go:334] "Generic (PLEG): container finished" podID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerID="7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587" exitCode=137 Dec 04 22:18:03.879231 master-0 kubenswrapper[8606]: I1204 22:18:03.879171 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" event={"ID":"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf","Type":"ContainerDied","Data":"7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587"} Dec 04 22:18:03.888217 master-0 kubenswrapper[8606]: I1204 22:18:03.888144 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:18:03.900639 master-0 kubenswrapper[8606]: I1204 22:18:03.897695 8606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.89766119 podStartE2EDuration="2.89766119s" podCreationTimestamp="2025-12-04 22:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:18:03.89481178 +0000 UTC m=+1048.705114055" watchObservedRunningTime="2025-12-04 22:18:03.89766119 +0000 UTC m=+1048.707963435" Dec 04 22:18:03.945180 master-0 kubenswrapper[8606]: I1204 22:18:03.945071 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:03.945180 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:03.945180 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:03.945180 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:03.945549 master-0 kubenswrapper[8606]: I1204 22:18:03.945177 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:04.004867 master-0 kubenswrapper[8606]: I1204 22:18:04.004751 8606 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-admission-controller-7dfc5b745f-nk4gb_5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf/multus-admission-controller/0.log" Dec 04 22:18:04.004867 master-0 kubenswrapper[8606]: I1204 22:18:04.004872 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:18:04.057567 master-0 kubenswrapper[8606]: I1204 22:18:04.057383 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") pod \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " Dec 04 22:18:04.057567 master-0 kubenswrapper[8606]: I1204 22:18:04.057558 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt\") pod \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\" (UID: \"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf\") " Dec 04 22:18:04.063781 master-0 kubenswrapper[8606]: I1204 22:18:04.063685 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:18:04.064104 master-0 kubenswrapper[8606]: I1204 22:18:04.063854 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt" (OuterVolumeSpecName: "kube-api-access-8wqqt") pod "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" (UID: "5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf"). InnerVolumeSpecName "kube-api-access-8wqqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:18:04.159869 master-0 kubenswrapper[8606]: I1204 22:18:04.159651 8606 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-webhook-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:04.159869 master-0 kubenswrapper[8606]: I1204 22:18:04.159729 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wqqt\" (UniqueName: \"kubernetes.io/projected/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf-kube-api-access-8wqqt\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:04.889714 master-0 kubenswrapper[8606]: I1204 22:18:04.889641 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-7dfc5b745f-nk4gb_5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf/multus-admission-controller/0.log" Dec 04 22:18:04.890902 master-0 kubenswrapper[8606]: I1204 22:18:04.889766 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" event={"ID":"5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf","Type":"ContainerDied","Data":"a96deeb0726472a631562696723fe7dacd1bdfdf5107c1d25582d7805e92f14c"} Dec 04 22:18:04.890902 master-0 kubenswrapper[8606]: I1204 22:18:04.889860 8606 scope.go:117] "RemoveContainer" containerID="815be3bb78086065271ecb4d4b9b7c7f847598761d2c9ee58e7b745732e5f4f4" Dec 04 22:18:04.890902 master-0 kubenswrapper[8606]: I1204 22:18:04.890137 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb" Dec 04 22:18:04.899693 master-0 kubenswrapper[8606]: I1204 22:18:04.899612 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:18:04.917716 master-0 kubenswrapper[8606]: I1204 22:18:04.917580 8606 scope.go:117] "RemoveContainer" containerID="7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587" Dec 04 22:18:04.944601 master-0 kubenswrapper[8606]: I1204 22:18:04.944486 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:04.944601 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:04.944601 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:04.944601 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:04.945112 master-0 kubenswrapper[8606]: I1204 22:18:04.944644 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:04.967220 master-0 kubenswrapper[8606]: I1204 22:18:04.967140 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb"] Dec 04 22:18:04.977546 master-0 kubenswrapper[8606]: I1204 22:18:04.977429 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-nk4gb"] Dec 04 22:18:05.418108 master-0 kubenswrapper[8606]: I1204 22:18:05.417998 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" path="/var/lib/kubelet/pods/5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf/volumes" Dec 04 22:18:05.944088 master-0 kubenswrapper[8606]: I1204 22:18:05.943991 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:05.944088 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:05.944088 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:05.944088 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:05.945125 master-0 kubenswrapper[8606]: I1204 22:18:05.944113 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:06.944382 master-0 kubenswrapper[8606]: I1204 22:18:06.944252 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:06.944382 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:06.944382 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:06.944382 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:06.944382 master-0 kubenswrapper[8606]: I1204 22:18:06.944370 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:07.354239 master-0 kubenswrapper[8606]: W1204 22:18:07.354131 8606 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-pod37fd513d_caef_41da_8fa2_f08fca029805.slice/crio-2cc8aa5e2f3d465e0999004c3c83b7cae125246800f954e25db5ec674a7483b5": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-pod37fd513d_caef_41da_8fa2_f08fca029805.slice/crio-2cc8aa5e2f3d465e0999004c3c83b7cae125246800f954e25db5ec674a7483b5: no such file or directory Dec 04 22:18:07.354239 master-0 kubenswrapper[8606]: W1204 22:18:07.354216 8606 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-pod37fd513d_caef_41da_8fa2_f08fca029805.slice/crio-conmon-c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-pod37fd513d_caef_41da_8fa2_f08fca029805.slice/crio-conmon-c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9.scope: no such file or directory Dec 04 22:18:07.354239 master-0 kubenswrapper[8606]: W1204 22:18:07.354234 8606 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-pod37fd513d_caef_41da_8fa2_f08fca029805.slice/crio-c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-pod37fd513d_caef_41da_8fa2_f08fca029805.slice/crio-c7833134edff1806b32d3ea6c53acbcc3770595ebd009cf7a56334e613cb4ee9.scope: no such file or directory Dec 04 22:18:07.454562 master-0 kubenswrapper[8606]: E1204 
22:18:07.454435 8606 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod49c55f04_2c89_4439_8171_6a586acc4db2.slice/crio-4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-conmon-4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaddddaac_a31a_4dbf_b78f_87225b11b463.slice/crio-35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod986a4de7_3a54_48dc_9599_49cf19ba0ad5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod986a4de7_3a54_48dc_9599_49cf19ba0ad5.slice/crio-1fee06d19aff5ef21eb8427a31bd857aa51bbbcb2fe5924a93729689e0a74832\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice/crio-ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice/crio-conmon-7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-conmon-7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice/crio-a96deeb0726472a631562696723fe7dacd1bdfdf5107c1d25582d7805e92f14c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-conmon-51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod986a4de7_3a54_48dc_9599_49cf19ba0ad5.slice/crio-conmon-42c592fcd97dd09de62f2c07d511eb1f7fedb875ba01c50a76d8a639e15849ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice/crio-8918274b67ec97aefc6e3c15e824021c73fe5c571b20580a3bbcad9aa8fc5b27\": RecentStats: unable to find 
data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-conmon-318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod986a4de7_3a54_48dc_9599_49cf19ba0ad5.slice/crio-42c592fcd97dd09de62f2c07d511eb1f7fedb875ba01c50a76d8a639e15849ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod49c55f04_2c89_4439_8171_6a586acc4db2.slice/crio-conmon-4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaddddaac_a31a_4dbf_b78f_87225b11b463.slice/crio-conmon-35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-9ccb30a6243f7a894b4b3551e9274749c93d506bae1a70db70653ccadedbb5f2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice/crio-7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice/crio-conmon-ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:18:07.455131 master-0 kubenswrapper[8606]: E1204 22:18:07.454825 8606 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod986a4de7_3a54_48dc_9599_49cf19ba0ad5.slice/crio-conmon-42c592fcd97dd09de62f2c07d511eb1f7fedb875ba01c50a76d8a639e15849ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice/crio-conmon-ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-9ccb30a6243f7a894b4b3551e9274749c93d506bae1a70db70653ccadedbb5f2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod986a4de7_3a54_48dc_9599_49cf19ba0ad5.slice/crio-1fee06d19aff5ef21eb8427a31bd857aa51bbbcb2fe5924a93729689e0a74832\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod986a4de7_3a54_48dc_9599_49cf19ba0ad5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaddddaac_a31a_4dbf_b78f_87225b11b463.slice/crio-conmon-35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod49c55f04_2c89_4439_8171_6a586acc4db2.slice/crio-conmon-4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-conmon-4a3ed1dd2d3cda33660642f2cfc51f686d2e52f60ffc0ec1e8c51962166d27fa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice/crio-7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-conmon-318a3ff7594f660cf050bba444029afa96b10917b770be043ec586455fc79628.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-conmon-51460e752788acf62244ed5eb96a6ac93508d35fb2a46a43225bd291982ad671.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice/crio-conmon-7e4aac9f5c23b83a163652977c7ded79014cb793917982a18845f9835af09587.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice/crio-ed31aae0c715dd65997ff47ea4a1a9293914e847342b61ed1723ff612235a6af.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6bd2f1_3f34_40c2_b2cf_7e3881ba51bf.slice/crio-a96deeb0726472a631562696723fe7dacd1bdfdf5107c1d25582d7805e92f14c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e042ef_169e_4928_a98d_236282fe83a5.slice/crio-8918274b67ec97aefc6e3c15e824021c73fe5c571b20580a3bbcad9aa8fc5b27\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad55397ac8e23f218f25cb714ea5b2b.slice/crio-conmon-7734290aa3ab3b7fab88de258c9e3739e2101b3009b0d065d343f81b69e3221f.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:18:07.787849 master-0 kubenswrapper[8606]: I1204 22:18:07.787756 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_49c55f04-2c89-4439-8171-6a586acc4db2/installer/0.log" Dec 04 22:18:07.788097 master-0 kubenswrapper[8606]: I1204 22:18:07.787911 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:18:07.827487 master-0 kubenswrapper[8606]: I1204 22:18:07.827374 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49c55f04-2c89-4439-8171-6a586acc4db2-kube-api-access\") pod \"49c55f04-2c89-4439-8171-6a586acc4db2\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " Dec 04 22:18:07.827487 master-0 kubenswrapper[8606]: I1204 22:18:07.827496 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-var-lock\") pod \"49c55f04-2c89-4439-8171-6a586acc4db2\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " Dec 04 22:18:07.828247 master-0 kubenswrapper[8606]: I1204 22:18:07.827646 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-kubelet-dir\") pod \"49c55f04-2c89-4439-8171-6a586acc4db2\" (UID: \"49c55f04-2c89-4439-8171-6a586acc4db2\") " Dec 04 22:18:07.828247 master-0 kubenswrapper[8606]: I1204 22:18:07.827755 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-var-lock" (OuterVolumeSpecName: "var-lock") pod "49c55f04-2c89-4439-8171-6a586acc4db2" (UID: "49c55f04-2c89-4439-8171-6a586acc4db2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:07.828247 master-0 kubenswrapper[8606]: I1204 22:18:07.827967 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "49c55f04-2c89-4439-8171-6a586acc4db2" (UID: "49c55f04-2c89-4439-8171-6a586acc4db2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:07.828690 master-0 kubenswrapper[8606]: I1204 22:18:07.828639 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:07.828690 master-0 kubenswrapper[8606]: I1204 22:18:07.828674 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49c55f04-2c89-4439-8171-6a586acc4db2-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:07.842305 master-0 kubenswrapper[8606]: I1204 22:18:07.842186 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c55f04-2c89-4439-8171-6a586acc4db2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "49c55f04-2c89-4439-8171-6a586acc4db2" (UID: "49c55f04-2c89-4439-8171-6a586acc4db2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:18:07.929956 master-0 kubenswrapper[8606]: I1204 22:18:07.929851 8606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_49c55f04-2c89-4439-8171-6a586acc4db2/installer/0.log" Dec 04 22:18:07.930409 master-0 kubenswrapper[8606]: I1204 22:18:07.929984 8606 generic.go:334] "Generic (PLEG): container finished" podID="49c55f04-2c89-4439-8171-6a586acc4db2" containerID="4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6" exitCode=1 Dec 04 22:18:07.930409 master-0 kubenswrapper[8606]: I1204 22:18:07.930030 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49c55f04-2c89-4439-8171-6a586acc4db2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:07.930409 master-0 kubenswrapper[8606]: I1204 22:18:07.930050 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"49c55f04-2c89-4439-8171-6a586acc4db2","Type":"ContainerDied","Data":"4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6"} Dec 04 22:18:07.930409 master-0 kubenswrapper[8606]: I1204 22:18:07.930127 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"49c55f04-2c89-4439-8171-6a586acc4db2","Type":"ContainerDied","Data":"d19066723bc19dac944f997eba7b40ba7eb1046f1ea930eac6393550ed81b491"} Dec 04 22:18:07.930409 master-0 kubenswrapper[8606]: I1204 22:18:07.930138 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Dec 04 22:18:07.930409 master-0 kubenswrapper[8606]: I1204 22:18:07.930169 8606 scope.go:117] "RemoveContainer" containerID="4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6" Dec 04 22:18:07.944490 master-0 kubenswrapper[8606]: I1204 22:18:07.944417 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:07.944490 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:07.944490 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:07.944490 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:07.945526 master-0 kubenswrapper[8606]: I1204 22:18:07.944526 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:07.967955 master-0 kubenswrapper[8606]: I1204 22:18:07.967893 8606 scope.go:117] "RemoveContainer" containerID="4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6" Dec 04 22:18:07.968975 master-0 kubenswrapper[8606]: E1204 22:18:07.968916 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6\": container with ID starting with 4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6 not found: ID does not exist" containerID="4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6" Dec 04 22:18:07.969108 master-0 kubenswrapper[8606]: I1204 22:18:07.969002 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6"} err="failed to get container status \"4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6\": rpc error: code = NotFound desc = could not find container \"4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6\": container with ID starting with 4baabdcb1f501bcc7e204eccbda6b35b454d7bf7116e16d1b6eab124e1fec8d6 not found: ID does not exist" Dec 04 22:18:07.995170 master-0 kubenswrapper[8606]: I1204 22:18:07.994983 8606 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 04 22:18:08.001264 master-0 kubenswrapper[8606]: I1204 22:18:08.001199 8606 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Dec 04 22:18:08.944082 master-0 kubenswrapper[8606]: I1204 22:18:08.943953 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:08.944082 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:08.944082 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:08.944082 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:08.944082 master-0 kubenswrapper[8606]: I1204 22:18:08.944060 8606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:09.404448 master-0 kubenswrapper[8606]: I1204 22:18:09.404273 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c55f04-2c89-4439-8171-6a586acc4db2" path="/var/lib/kubelet/pods/49c55f04-2c89-4439-8171-6a586acc4db2/volumes" Dec 04 22:18:09.948808 master-0 kubenswrapper[8606]: I1204 22:18:09.948756 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:09.948808 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:09.948808 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:09.948808 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:09.949471 master-0 kubenswrapper[8606]: I1204 22:18:09.948828 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:10.947039 master-0 kubenswrapper[8606]: I1204 22:18:10.946951 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:10.947039 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:10.947039 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:10.947039 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:10.947531 master-0 kubenswrapper[8606]: I1204 22:18:10.947067 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:11.943852 master-0 kubenswrapper[8606]: I1204 22:18:11.943768 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:11.943852 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:11.943852 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:11.943852 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:11.944927 master-0 kubenswrapper[8606]: I1204 22:18:11.944676 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:12.392359 master-0 kubenswrapper[8606]: I1204 22:18:12.392229 8606 scope.go:117] "RemoveContainer" containerID="35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e" Dec 04 22:18:12.393059 master-0 kubenswrapper[8606]: E1204 22:18:12.392768 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s 
restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:18:12.944532 master-0 kubenswrapper[8606]: I1204 22:18:12.944420 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:12.944532 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:12.944532 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:12.944532 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:12.945391 master-0 kubenswrapper[8606]: I1204 22:18:12.944555 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:13.945020 master-0 kubenswrapper[8606]: I1204 22:18:13.944924 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:13.945020 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:13.945020 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:13.945020 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:13.946376 master-0 kubenswrapper[8606]: I1204 22:18:13.945049 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:14.943669 master-0 kubenswrapper[8606]: I1204 22:18:14.943598 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:14.943669 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:14.943669 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:14.943669 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:14.944033 master-0 kubenswrapper[8606]: I1204 22:18:14.943699 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:15.944842 master-0 kubenswrapper[8606]: I1204 22:18:15.944738 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:15.944842 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:15.944842 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:15.944842 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:15.944842 master-0 kubenswrapper[8606]: 
I1204 22:18:15.944836 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:16.944433 master-0 kubenswrapper[8606]: I1204 22:18:16.944324 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:16.944433 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:16.944433 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:16.944433 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:16.945792 master-0 kubenswrapper[8606]: I1204 22:18:16.944441 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:17.944375 master-0 kubenswrapper[8606]: I1204 22:18:17.944264 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:17.944375 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:17.944375 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:17.944375 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:17.945661 master-0 kubenswrapper[8606]: I1204 22:18:17.944406 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:18.944029 master-0 kubenswrapper[8606]: I1204 22:18:18.943941 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:18.944029 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:18.944029 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:18.944029 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:18.944627 master-0 kubenswrapper[8606]: I1204 22:18:18.944049 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:19.944741 master-0 kubenswrapper[8606]: I1204 22:18:19.944578 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:19.944741 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:19.944741 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:19.944741 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:19.945861 master-0 
kubenswrapper[8606]: I1204 22:18:19.945654 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:20.944762 master-0 kubenswrapper[8606]: I1204 22:18:20.944604 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:20.944762 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:20.944762 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:20.944762 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:20.945788 master-0 kubenswrapper[8606]: I1204 22:18:20.944854 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:21.944553 master-0 kubenswrapper[8606]: I1204 22:18:21.944438 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:21.944553 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:21.944553 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:21.944553 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:21.945715 master-0 kubenswrapper[8606]: I1204 22:18:21.944557 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:22.945015 master-0 kubenswrapper[8606]: I1204 22:18:22.944946 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:22.945015 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:22.945015 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:22.945015 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:22.946108 master-0 kubenswrapper[8606]: I1204 22:18:22.945673 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:23.944341 master-0 kubenswrapper[8606]: I1204 22:18:23.944211 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:23.944341 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:23.944341 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:23.944341 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:23.944341 
master-0 kubenswrapper[8606]: I1204 22:18:23.944320 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:24.943982 master-0 kubenswrapper[8606]: I1204 22:18:24.943884 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:24.943982 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:24.943982 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:24.943982 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:24.944432 master-0 kubenswrapper[8606]: I1204 22:18:24.943984 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:25.945439 master-0 kubenswrapper[8606]: I1204 22:18:25.945367 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:25.945439 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:25.945439 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:25.945439 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:25.946875 master-0 kubenswrapper[8606]: I1204 22:18:25.945458 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:26.944663 master-0 kubenswrapper[8606]: I1204 22:18:26.944560 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:26.944663 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:26.944663 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:26.944663 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:26.945168 master-0 kubenswrapper[8606]: I1204 22:18:26.944663 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:27.391938 master-0 kubenswrapper[8606]: I1204 22:18:27.391849 8606 scope.go:117] "RemoveContainer" containerID="35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e" Dec 04 22:18:27.392794 master-0 kubenswrapper[8606]: E1204 22:18:27.392234 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" 
pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:18:27.944775 master-0 kubenswrapper[8606]: I1204 22:18:27.944645 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:27.944775 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:27.944775 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:27.944775 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:27.944775 master-0 kubenswrapper[8606]: I1204 22:18:27.944762 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:28.944289 master-0 kubenswrapper[8606]: I1204 22:18:28.944201 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:28.944289 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:28.944289 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:28.944289 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:28.945027 master-0 kubenswrapper[8606]: I1204 22:18:28.944341 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:29.943852 master-0 kubenswrapper[8606]: I1204 22:18:29.943741 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:29.943852 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:29.943852 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:29.943852 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:29.943852 master-0 kubenswrapper[8606]: I1204 22:18:29.943845 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:30.944077 master-0 kubenswrapper[8606]: I1204 22:18:30.944004 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:30.944077 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:30.944077 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:30.944077 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:30.944390 master-0 kubenswrapper[8606]: I1204 22:18:30.944108 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" 
podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:31.945055 master-0 kubenswrapper[8606]: I1204 22:18:31.944938 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:31.945055 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:31.945055 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:31.945055 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:31.946041 master-0 kubenswrapper[8606]: I1204 22:18:31.945024 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:32.944993 master-0 kubenswrapper[8606]: I1204 22:18:32.944871 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:32.944993 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:32.944993 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:32.944993 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:32.945864 master-0 kubenswrapper[8606]: I1204 22:18:32.945013 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:33.945175 master-0 kubenswrapper[8606]: I1204 22:18:33.945078 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:33.945175 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:33.945175 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:33.945175 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:33.946261 master-0 kubenswrapper[8606]: I1204 22:18:33.945186 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:34.946728 master-0 kubenswrapper[8606]: I1204 22:18:34.946562 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:34.946728 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:34.946728 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:34.946728 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:34.946728 master-0 kubenswrapper[8606]: I1204 22:18:34.946719 8606 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:35.944347 master-0 kubenswrapper[8606]: I1204 22:18:35.944285 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:35.944347 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:35.944347 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:35.944347 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:35.945088 master-0 kubenswrapper[8606]: I1204 22:18:35.945047 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:36.943677 master-0 kubenswrapper[8606]: I1204 22:18:36.943596 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:36.943677 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:36.943677 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:36.943677 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:36.944585 master-0 kubenswrapper[8606]: I1204 22:18:36.943693 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:37.943331 master-0 kubenswrapper[8606]: I1204 22:18:37.943246 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:37.943331 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:37.943331 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:37.943331 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:37.943681 master-0 kubenswrapper[8606]: I1204 22:18:37.943349 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:38.944890 master-0 kubenswrapper[8606]: I1204 22:18:38.944760 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:38.944890 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:38.944890 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:38.944890 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:38.945975 master-0 kubenswrapper[8606]: I1204 22:18:38.945638 8606 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:39.392249 master-0 kubenswrapper[8606]: I1204 22:18:39.392139 8606 scope.go:117] "RemoveContainer" containerID="35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e" Dec 04 22:18:39.392654 master-0 kubenswrapper[8606]: E1204 22:18:39.392611 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:18:39.943882 master-0 kubenswrapper[8606]: I1204 22:18:39.943770 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:39.943882 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:39.943882 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:39.943882 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:39.944330 master-0 kubenswrapper[8606]: I1204 22:18:39.943876 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:40.944543 master-0 kubenswrapper[8606]: I1204 22:18:40.944402 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:40.944543 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:40.944543 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:40.944543 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:40.944543 master-0 kubenswrapper[8606]: I1204 22:18:40.944535 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:41.944030 master-0 kubenswrapper[8606]: I1204 22:18:41.943905 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:41.944030 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:41.944030 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:41.944030 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:41.944030 master-0 kubenswrapper[8606]: I1204 22:18:41.944028 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Dec 04 22:18:42.944848 master-0 kubenswrapper[8606]: I1204 22:18:42.944735 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:42.944848 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:42.944848 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:42.944848 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:42.944848 master-0 kubenswrapper[8606]: I1204 22:18:42.944843 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:43.944938 master-0 kubenswrapper[8606]: I1204 22:18:43.944827 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:43.944938 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:43.944938 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:43.944938 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:43.946049 master-0 kubenswrapper[8606]: I1204 22:18:43.944949 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:44.944855 master-0 kubenswrapper[8606]: I1204 22:18:44.944789 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:44.944855 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:44.944855 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:44.944855 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:44.946116 master-0 kubenswrapper[8606]: I1204 22:18:44.945716 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:45.945201 master-0 kubenswrapper[8606]: I1204 22:18:45.945087 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:45.945201 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:45.945201 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:45.945201 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:45.946164 master-0 kubenswrapper[8606]: I1204 22:18:45.945221 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Dec 04 22:18:46.944966 master-0 kubenswrapper[8606]: I1204 22:18:46.944846 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:46.944966 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:46.944966 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:46.944966 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:46.945783 master-0 kubenswrapper[8606]: I1204 22:18:46.944965 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:47.944347 master-0 kubenswrapper[8606]: I1204 22:18:47.944250 8606 patch_prober.go:28] interesting pod/router-default-5465c8b4db-8vm66 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 22:18:47.944347 master-0 kubenswrapper[8606]: [-]has-synced failed: reason withheld Dec 04 22:18:47.944347 master-0 kubenswrapper[8606]: [+]process-running ok Dec 04 22:18:47.944347 master-0 kubenswrapper[8606]: healthz check failed Dec 04 22:18:47.944347 master-0 kubenswrapper[8606]: I1204 22:18:47.944338 8606 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 22:18:47.945120 master-0 kubenswrapper[8606]: I1204 22:18:47.944403 8606 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:18:47.945269 master-0 kubenswrapper[8606]: I1204 22:18:47.945218 8606 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0"} pod="openshift-ingress/router-default-5465c8b4db-8vm66" containerMessage="Container router failed startup probe, will be restarted" Dec 04 22:18:47.945960 master-0 kubenswrapper[8606]: I1204 22:18:47.945272 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-8vm66" podUID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerName="router" containerID="cri-o://0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0" gracePeriod=3600 Dec 04 22:18:51.425818 master-0 kubenswrapper[8606]: I1204 22:18:51.425706 8606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: E1204 22:18:51.426185 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerName="kube-rbac-proxy" Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: I1204 22:18:51.426212 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerName="kube-rbac-proxy" Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: E1204 22:18:51.426262 8606 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="49c55f04-2c89-4439-8171-6a586acc4db2" containerName="installer" Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: I1204 22:18:51.426275 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c55f04-2c89-4439-8171-6a586acc4db2" containerName="installer" Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: E1204 22:18:51.426293 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerName="multus-admission-controller" Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: I1204 22:18:51.426309 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerName="multus-admission-controller" Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: I1204 22:18:51.426582 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerName="kube-rbac-proxy" Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: I1204 22:18:51.426630 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e6bd2f1-3f34-40c2-b2cf-7e3881ba51bf" containerName="multus-admission-controller" Dec 04 22:18:51.426783 master-0 kubenswrapper[8606]: I1204 22:18:51.426703 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c55f04-2c89-4439-8171-6a586acc4db2" containerName="installer" Dec 04 22:18:51.427484 master-0 kubenswrapper[8606]: I1204 22:18:51.427431 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.427632 master-0 kubenswrapper[8606]: I1204 22:18:51.427531 8606 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 04 22:18:51.428077 master-0 kubenswrapper[8606]: I1204 22:18:51.427992 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver" containerID="cri-o://5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10" gracePeriod=15 Dec 04 22:18:51.428077 master-0 kubenswrapper[8606]: I1204 22:18:51.428046 8606 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a" gracePeriod=15 Dec 04 22:18:51.428675 master-0 kubenswrapper[8606]: I1204 22:18:51.428591 8606 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: E1204 22:18:51.429068 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver-insecure-readyz" Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: I1204 22:18:51.429102 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver-insecure-readyz" Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: E1204 22:18:51.429125 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver" Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: I1204 22:18:51.429141 8606 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver" Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: E1204 22:18:51.429204 8606 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="setup" Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: I1204 22:18:51.429221 8606 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="setup" Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: I1204 22:18:51.429488 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="setup" Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: I1204 22:18:51.429568 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver" Dec 04 22:18:51.429691 master-0 kubenswrapper[8606]: I1204 22:18:51.429623 8606 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver-insecure-readyz" Dec 04 22:18:51.433321 master-0 kubenswrapper[8606]: I1204 22:18:51.433243 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.450800 master-0 kubenswrapper[8606]: I1204 22:18:51.450682 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.451031 master-0 kubenswrapper[8606]: I1204 22:18:51.450802 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.451031 master-0 kubenswrapper[8606]: I1204 22:18:51.451017 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.451183 master-0 kubenswrapper[8606]: I1204 22:18:51.451091 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.451285 master-0 kubenswrapper[8606]: I1204 22:18:51.451247 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.451370 master-0 kubenswrapper[8606]: I1204 22:18:51.451323 8606 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.452143 master-0 kubenswrapper[8606]: I1204 22:18:51.451631 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.452143 master-0 kubenswrapper[8606]: I1204 22:18:51.451804 8606 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.517012 master-0 kubenswrapper[8606]: E1204 22:18:51.516896 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.554768 master-0 kubenswrapper[8606]: I1204 22:18:51.554646 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.554768 master-0 kubenswrapper[8606]: I1204 22:18:51.554708 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.554768 master-0 kubenswrapper[8606]: I1204 22:18:51.554744 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.554964 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555053 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555029 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.554986 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555234 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555292 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555421 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555471 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555337 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555625 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555724 8606 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.555765 master-0 kubenswrapper[8606]: I1204 22:18:51.555759 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.556862 master-0 kubenswrapper[8606]: I1204 22:18:51.555850 8606 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.818762 master-0 kubenswrapper[8606]: I1204 22:18:51.818668 8606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:51.849381 master-0 kubenswrapper[8606]: W1204 22:18:51.849318 8606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89698aa356a3bc32694e2b098f9a900.slice/crio-4f7c6491dc22a01b1d98b832f138731bbe081d987cf8c5e14ed87abcbbcb568a WatchSource:0}: Error finding container 4f7c6491dc22a01b1d98b832f138731bbe081d987cf8c5e14ed87abcbbcb568a: Status 404 returned error can't find the container with id 4f7c6491dc22a01b1d98b832f138731bbe081d987cf8c5e14ed87abcbbcb568a Dec 04 22:18:51.855009 master-0 kubenswrapper[8606]: E1204 22:18:51.854695 8606 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.187e2321c0e67fdc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b89698aa356a3bc32694e2b098f9a900,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:18:51.853037532 +0000 UTC m=+1096.663339767,LastTimestamp:2025-12-04 22:18:51.853037532 +0000 UTC m=+1096.663339767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:18:52.337901 master-0 kubenswrapper[8606]: I1204 22:18:52.337664 8606 generic.go:334] "Generic (PLEG): container finished" podID="dbe54b09-0399-4fbe-9f84-dd9dede0ab96" containerID="5a4d99a6b7149fd4133c1e3efcfd35582ffcb1582acaa62e903eb008119e1624" exitCode=0 Dec 04 22:18:52.337901 master-0 kubenswrapper[8606]: I1204 22:18:52.337783 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"dbe54b09-0399-4fbe-9f84-dd9dede0ab96","Type":"ContainerDied","Data":"5a4d99a6b7149fd4133c1e3efcfd35582ffcb1582acaa62e903eb008119e1624"} Dec 04 22:18:52.340348 
master-0 kubenswrapper[8606]: I1204 22:18:52.340256 8606 status_manager.go:851] "Failed to get status for pod" podUID="dbe54b09-0399-4fbe-9f84-dd9dede0ab96" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:18:52.341809 master-0 kubenswrapper[8606]: I1204 22:18:52.341738 8606 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f" exitCode=0 Dec 04 22:18:52.342017 master-0 kubenswrapper[8606]: I1204 22:18:52.341921 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerDied","Data":"84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f"} Dec 04 22:18:52.342139 master-0 kubenswrapper[8606]: I1204 22:18:52.342051 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"4f7c6491dc22a01b1d98b832f138731bbe081d987cf8c5e14ed87abcbbcb568a"} Dec 04 22:18:52.343598 master-0 kubenswrapper[8606]: E1204 22:18:52.343488 8606 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:52.343758 master-0 kubenswrapper[8606]: I1204 22:18:52.343565 8606 status_manager.go:851] "Failed to get status for pod" podUID="dbe54b09-0399-4fbe-9f84-dd9dede0ab96" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:18:52.346373 master-0 kubenswrapper[8606]: I1204 22:18:52.346293 8606 generic.go:334] "Generic (PLEG): container finished" podID="d75143d9bc4a2dc15781dc51ccff632a" containerID="50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a" exitCode=0 Dec 04 22:18:53.365837 master-0 kubenswrapper[8606]: I1204 22:18:53.363928 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321"} Dec 04 22:18:53.365837 master-0 kubenswrapper[8606]: I1204 22:18:53.363990 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c"} Dec 04 22:18:53.778909 master-0 kubenswrapper[8606]: I1204 22:18:53.778859 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:53.783260 master-0 kubenswrapper[8606]: I1204 22:18:53.783240 8606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:18:53.903797 master-0 kubenswrapper[8606]: I1204 22:18:53.903745 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 04 22:18:53.903970 master-0 kubenswrapper[8606]: I1204 22:18:53.903836 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 04 22:18:53.903970 master-0 kubenswrapper[8606]: I1204 22:18:53.903860 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:53.903970 master-0 kubenswrapper[8606]: I1204 22:18:53.903881 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 04 22:18:53.903970 master-0 kubenswrapper[8606]: I1204 22:18:53.903900 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 04 22:18:53.903970 master-0 kubenswrapper[8606]: I1204 22:18:53.903901 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:53.903970 master-0 kubenswrapper[8606]: I1204 22:18:53.903931 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs" (OuterVolumeSpecName: "logs") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:53.903970 master-0 kubenswrapper[8606]: I1204 22:18:53.903938 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " Dec 04 22:18:53.904250 master-0 kubenswrapper[8606]: I1204 22:18:53.903987 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "etc-kubernetes-cloud". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:53.904250 master-0 kubenswrapper[8606]: I1204 22:18:53.904005 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 04 22:18:53.904250 master-0 kubenswrapper[8606]: I1204 22:18:53.904063 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") pod \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " Dec 04 22:18:53.904250 master-0 kubenswrapper[8606]: I1204 22:18:53.904085 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets" (OuterVolumeSpecName: "secrets") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:53.904250 master-0 kubenswrapper[8606]: I1204 22:18:53.904116 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 04 22:18:53.904250 master-0 kubenswrapper[8606]: I1204 22:18:53.904141 8606 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") pod \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " Dec 04 22:18:53.904250 master-0 kubenswrapper[8606]: I1204 22:18:53.904168 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config" (OuterVolumeSpecName: "config") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:53.904250 master-0 kubenswrapper[8606]: I1204 22:18:53.904210 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock" (OuterVolumeSpecName: "var-lock") pod "dbe54b09-0399-4fbe-9f84-dd9dede0ab96" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904246 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dbe54b09-0399-4fbe-9f84-dd9dede0ab96" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904432 8606 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904447 8606 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904456 8606 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904464 8606 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904474 8606 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904485 8606 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904493 8606 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:53.904679 master-0 kubenswrapper[8606]: I1204 22:18:53.904517 8606 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:53.908658 master-0 kubenswrapper[8606]: I1204 22:18:53.908619 8606 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dbe54b09-0399-4fbe-9f84-dd9dede0ab96" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:18:54.005674 master-0 kubenswrapper[8606]: I1204 22:18:54.005624 8606 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:18:54.401442 master-0 kubenswrapper[8606]: I1204 22:18:54.401286 8606 scope.go:117] "RemoveContainer" containerID="35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e" Dec 04 22:18:54.402006 master-0 kubenswrapper[8606]: E1204 22:18:54.401590 8606 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-qlkgh_openshift-ingress-operator(addddaac-a31a-4dbf-b78f-87225b11b463)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" podUID="addddaac-a31a-4dbf-b78f-87225b11b463" Dec 04 22:18:54.403829 master-0 kubenswrapper[8606]: I1204 22:18:54.403767 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"dbe54b09-0399-4fbe-9f84-dd9dede0ab96","Type":"ContainerDied","Data":"d377be8d0441b589958c5adc3aad9974e2610bf718707f9842352d0cb595d25f"} Dec 04 22:18:54.403829 master-0 kubenswrapper[8606]: I1204 22:18:54.403829 8606 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d377be8d0441b589958c5adc3aad9974e2610bf718707f9842352d0cb595d25f" Dec 04 22:18:54.403928 master-0 kubenswrapper[8606]: I1204 22:18:54.403835 8606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:18:54.436533 master-0 kubenswrapper[8606]: I1204 22:18:54.432814 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d"} Dec 04 22:18:54.436533 master-0 kubenswrapper[8606]: I1204 22:18:54.432866 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"5fcdcfec6584d4b81237f292d9431a4c2eccc32a97cb24f1ad54b8dd23d476bb"} Dec 04 22:18:54.436533 master-0 kubenswrapper[8606]: I1204 22:18:54.432880 8606 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65"} Dec 04 22:18:54.436533 master-0 kubenswrapper[8606]: I1204 22:18:54.433240 8606 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:18:54.441368 master-0 kubenswrapper[8606]: I1204 22:18:54.438065 8606 generic.go:334] "Generic (PLEG): container finished" podID="d75143d9bc4a2dc15781dc51ccff632a" containerID="5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10" exitCode=0 Dec 04 22:18:54.441368 master-0 kubenswrapper[8606]: I1204 22:18:54.438130 8606 scope.go:117] "RemoveContainer" containerID="50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a" Dec 04 22:18:54.441368 master-0 kubenswrapper[8606]: I1204 22:18:54.438334 8606 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 04 22:18:54.469365 master-0 kubenswrapper[8606]: I1204 22:18:54.465679 8606 scope.go:117] "RemoveContainer" containerID="5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10" Dec 04 22:18:54.491999 master-0 kubenswrapper[8606]: I1204 22:18:54.491954 8606 scope.go:117] "RemoveContainer" containerID="2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: I1204 22:18:54.559368 8606 scope.go:117] "RemoveContainer" containerID="50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: E1204 22:18:54.559959 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a\": container with ID starting with 50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a not found: ID does not exist" containerID="50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: I1204 22:18:54.560004 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a"} err="failed to get container status \"50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a\": rpc error: code = NotFound desc = could not find container \"50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a\": container with ID starting with 50e628b7a06ec4928364d35ea9548a30bad3878bdefac3ce2a61e7d40c20112a not found: ID does not exist" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: I1204 22:18:54.560036 8606 scope.go:117] "RemoveContainer" containerID="5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: E1204 22:18:54.560617 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10\": container with ID starting with 5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10 not found: ID does not exist" containerID="5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: I1204 22:18:54.560662 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10"} err="failed to get container status \"5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10\": rpc error: code = NotFound desc = could not find container \"5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10\": container with ID starting with 5cc246dee21694d6a91623953ee683137c74bac4e0a5ccbafe52b0787de0fe10 not found: ID does not exist" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: I1204 22:18:54.560693 8606 scope.go:117] "RemoveContainer" containerID="2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: E1204 22:18:54.561002 8606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8\": container with ID starting with 
2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8 not found: ID does not exist" containerID="2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8" Dec 04 22:18:54.562779 master-0 kubenswrapper[8606]: I1204 22:18:54.561024 8606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8"} err="failed to get container status \"2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8\": rpc error: code = NotFound desc = could not find container \"2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8\": container with ID starting with 2247eaf4bf8f7270552116fe6ea7c1a05a8316b535c3216640851ab25df6d1f8 not found: ID does not exist" Dec 04 22:18:55.400897 master-0 kubenswrapper[8606]: I1204 22:18:55.400830 8606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d75143d9bc4a2dc15781dc51ccff632a" path="/var/lib/kubelet/pods/d75143d9bc4a2dc15781dc51ccff632a/volumes" Dec 04 22:18:55.401903 master-0 kubenswrapper[8606]: I1204 22:18:55.401874 8606 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 04 22:18:56.076758 master-0 kubenswrapper[8606]: I1204 22:18:56.076516 8606 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 22:18:56.076896 master-0 systemd[1]: Stopping Kubernetes Kubelet... Dec 04 22:18:56.107013 master-0 systemd[1]: kubelet.service: Deactivated successfully. Dec 04 22:18:56.107662 master-0 systemd[1]: Stopped Kubernetes Kubelet. Dec 04 22:18:56.114021 master-0 systemd[1]: kubelet.service: Consumed 3min 8.218s CPU time. Dec 04 22:18:56.169235 master-0 systemd[1]: Starting Kubernetes Kubelet... Dec 04 22:18:56.339772 master-0 kubenswrapper[33572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 22:18:56.339772 master-0 kubenswrapper[33572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 04 22:18:56.339772 master-0 kubenswrapper[33572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 22:18:56.339772 master-0 kubenswrapper[33572]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 22:18:56.339772 master-0 kubenswrapper[33572]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 04 22:18:56.339772 master-0 kubenswrapper[33572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 04 22:18:56.340842 master-0 kubenswrapper[33572]: I1204 22:18:56.339895 33572 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 04 22:18:56.344616 master-0 kubenswrapper[33572]: W1204 22:18:56.344575 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 22:18:56.344616 master-0 kubenswrapper[33572]: W1204 22:18:56.344597 33572 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 22:18:56.344616 master-0 kubenswrapper[33572]: W1204 22:18:56.344604 33572 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344610 33572 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344658 33572 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344668 33572 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344676 33572 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344682 33572 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344688 33572 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344694 33572 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344701 33572 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344708 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344715 33572 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344720 33572 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344725 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344731 33572 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344736 33572 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344741 33572 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344757 33572 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344762 33572 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344768 33572 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 22:18:56.344940 master-0 kubenswrapper[33572]: W1204 22:18:56.344774 33572 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344779 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344784 33572 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344789 33572 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344794 33572 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344800 33572 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344805 33572 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344810 33572 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344815 33572 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344821 33572 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344826 33572 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344835 33572 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344841 33572 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344847 33572 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344854 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344861 33572 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344868 33572 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344874 33572 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344880 33572 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 22:18:56.346437 master-0 kubenswrapper[33572]: W1204 22:18:56.344886 33572 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344891 33572 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344896 33572 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344901 33572 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344908 33572 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344914 33572 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344919 33572 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344925 33572 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344930 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344935 33572 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344940 33572 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344948 33572 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344954 33572 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344959 33572 feature_gate.go:330] unrecognized feature gate: Example Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344965 33572 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344970 33572 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344975 33572 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344981 33572 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344986 33572 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344992 33572 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 22:18:56.348050 master-0 kubenswrapper[33572]: W1204 22:18:56.344997 33572 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345002 33572 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345007 33572 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345014 33572 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345020 33572 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345025 33572 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345030 33572 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345035 33572 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345041 33572 
feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345046 33572 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345051 33572 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: W1204 22:18:56.345056 33572 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345188 33572 flags.go:64] FLAG: --address="0.0.0.0" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345201 33572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345212 33572 flags.go:64] FLAG: --anonymous-auth="true" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345220 33572 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345228 33572 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345234 33572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345243 33572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345250 33572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345256 33572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345263 33572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 04 22:18:56.349746 master-0 kubenswrapper[33572]: I1204 22:18:56.345270 33572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345276 33572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345282 33572 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345288 33572 flags.go:64] FLAG: --cgroup-root="" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345294 33572 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345300 33572 flags.go:64] FLAG: --client-ca-file="" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345306 33572 flags.go:64] FLAG: --cloud-config="" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345312 33572 flags.go:64] FLAG: --cloud-provider="" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345318 33572 flags.go:64] FLAG: --cluster-dns="[]" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345325 33572 flags.go:64] FLAG: --cluster-domain="" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345331 33572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345337 33572 flags.go:64] FLAG: --config-dir="" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345343 33572 flags.go:64] FLAG: 
--container-hints="/etc/cadvisor/container_hints.json" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345350 33572 flags.go:64] FLAG: --container-log-max-files="5" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345358 33572 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345364 33572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345370 33572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345376 33572 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345382 33572 flags.go:64] FLAG: --contention-profiling="false" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345389 33572 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345395 33572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345401 33572 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345407 33572 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345415 33572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345421 33572 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 04 22:18:56.351076 master-0 kubenswrapper[33572]: I1204 22:18:56.345427 33572 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345433 33572 flags.go:64] FLAG: --enable-load-reader="false" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345441 33572 flags.go:64] FLAG: --enable-server="true" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345447 33572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345458 33572 flags.go:64] FLAG: --event-burst="100" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345464 33572 flags.go:64] FLAG: --event-qps="50" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345470 33572 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345477 33572 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345483 33572 flags.go:64] FLAG: --eviction-hard="" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345491 33572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345497 33572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345507 33572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345514 33572 flags.go:64] FLAG: --eviction-soft="" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345520 33572 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 04 22:18:56.352220 master-0 
kubenswrapper[33572]: I1204 22:18:56.345526 33572 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345556 33572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345567 33572 flags.go:64] FLAG: --experimental-mounter-path="" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345575 33572 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345582 33572 flags.go:64] FLAG: --fail-swap-on="true" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345589 33572 flags.go:64] FLAG: --feature-gates="" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345597 33572 flags.go:64] FLAG: --file-check-frequency="20s" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345603 33572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345609 33572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345616 33572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345622 33572 flags.go:64] FLAG: --healthz-port="10248" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345629 33572 flags.go:64] FLAG: --help="false" Dec 04 22:18:56.352220 master-0 kubenswrapper[33572]: I1204 22:18:56.345635 33572 flags.go:64] FLAG: --hostname-override="" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345641 33572 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345647 33572 flags.go:64] FLAG: --http-check-frequency="20s" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345653 33572 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345659 33572 flags.go:64] FLAG: --image-credential-provider-config="" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345665 33572 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345671 33572 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345677 33572 flags.go:64] FLAG: --image-service-endpoint="" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345683 33572 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345690 33572 flags.go:64] FLAG: --kube-api-burst="100" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345759 33572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345767 33572 flags.go:64] FLAG: --kube-api-qps="50" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345775 33572 flags.go:64] FLAG: --kube-reserved="" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345782 33572 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345787 33572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 04 22:18:56.353023 
master-0 kubenswrapper[33572]: I1204 22:18:56.345793 33572 flags.go:64] FLAG: --kubelet-cgroups="" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345799 33572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345805 33572 flags.go:64] FLAG: --lock-file="" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345812 33572 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345818 33572 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345824 33572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345834 33572 flags.go:64] FLAG: --log-json-split-stream="false" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345841 33572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345848 33572 flags.go:64] FLAG: --log-text-split-stream="false" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345855 33572 flags.go:64] FLAG: --logging-format="text" Dec 04 22:18:56.353023 master-0 kubenswrapper[33572]: I1204 22:18:56.345860 33572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345867 33572 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345873 33572 flags.go:64] FLAG: --manifest-url="" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345879 33572 flags.go:64] FLAG: --manifest-url-header="" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345887 33572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345894 33572 flags.go:64] FLAG: --max-open-files="1000000" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345901 33572 flags.go:64] FLAG: --max-pods="110" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345907 33572 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345913 33572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345919 33572 flags.go:64] FLAG: --memory-manager-policy="None" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345925 33572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345931 33572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345937 33572 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345943 33572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345956 33572 flags.go:64] FLAG: --node-status-max-images="50" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345962 33572 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: 
I1204 22:18:56.345969 33572 flags.go:64] FLAG: --oom-score-adj="-999" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345975 33572 flags.go:64] FLAG: --pod-cidr="" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345981 33572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345989 33572 flags.go:64] FLAG: --pod-manifest-path="" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.345995 33572 flags.go:64] FLAG: --pod-max-pids="-1" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.346001 33572 flags.go:64] FLAG: --pods-per-core="0" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.346007 33572 flags.go:64] FLAG: --port="10250" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.346014 33572 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 04 22:18:56.353806 master-0 kubenswrapper[33572]: I1204 22:18:56.346020 33572 flags.go:64] FLAG: --provider-id="" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346027 33572 flags.go:64] FLAG: --qos-reserved="" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346032 33572 flags.go:64] FLAG: --read-only-port="10255" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346038 33572 flags.go:64] FLAG: --register-node="true" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346044 33572 flags.go:64] FLAG: --register-schedulable="true" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346051 33572 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346065 33572 flags.go:64] FLAG: --registry-burst="10" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346071 33572 flags.go:64] FLAG: --registry-qps="5" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346077 33572 flags.go:64] FLAG: --reserved-cpus="" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346083 33572 flags.go:64] FLAG: --reserved-memory="" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346091 33572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346097 33572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346104 33572 flags.go:64] FLAG: --rotate-certificates="false" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346112 33572 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346118 33572 flags.go:64] FLAG: --runonce="false" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346124 33572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346131 33572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346137 33572 flags.go:64] FLAG: --seccomp-default="false" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346143 33572 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 04 22:18:56.354473 master-0 
kubenswrapper[33572]: I1204 22:18:56.346149 33572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346155 33572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346162 33572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346168 33572 flags.go:64] FLAG: --storage-driver-password="root" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346174 33572 flags.go:64] FLAG: --storage-driver-secure="false" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346180 33572 flags.go:64] FLAG: --storage-driver-table="stats" Dec 04 22:18:56.354473 master-0 kubenswrapper[33572]: I1204 22:18:56.346186 33572 flags.go:64] FLAG: --storage-driver-user="root" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346194 33572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346201 33572 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346207 33572 flags.go:64] FLAG: --system-cgroups="" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346213 33572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346222 33572 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346228 33572 flags.go:64] FLAG: --tls-cert-file="" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346234 33572 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346242 33572 flags.go:64] FLAG: --tls-min-version="" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346248 33572 flags.go:64] FLAG: --tls-private-key-file="" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346254 33572 flags.go:64] FLAG: --topology-manager-policy="none" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346261 33572 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346266 33572 flags.go:64] FLAG: --topology-manager-scope="container" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346275 33572 flags.go:64] FLAG: --v="2" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346283 33572 flags.go:64] FLAG: --version="false" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346291 33572 flags.go:64] FLAG: --vmodule="" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346298 33572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: I1204 22:18:56.346305 33572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: W1204 22:18:56.346514 33572 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
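The flags.go:64 entries above are the kubelet's dump of every command-line flag together with its effective value (--max-pods="110", --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi", and so on). As a hedged illustration only, and assuming an excerpt like this has been saved to a plain-text file (the name kubelet.log below is an assumption), a short Python sketch can pull those pairs out so the effective settings of two restarts can be diffed:

```python
import re
import sys

# Matches the kubelet flag dump, e.g.  flags.go:64] FLAG: --max-pods="110"
FLAG_RE = re.compile(r'FLAG: --(?P<name>[^=\s]+)="(?P<value>[^"]*)"')

def parse_flags(path):
    """Return {flag_name: value} for every FLAG entry in a journal excerpt."""
    flags = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            # finditer copes with several journal entries sharing one
            # physical line, as in this capture.
            for match in FLAG_RE.finditer(line):
                flags[match.group("name")] = match.group("value")
    return flags

if __name__ == "__main__":
    # Usage (hypothetical file name): python parse_flags.py kubelet.log
    for name, value in sorted(parse_flags(sys.argv[1]).items()):
        print(f'--{name} = "{value}"')
```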
Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: W1204 22:18:56.346552 33572 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: W1204 22:18:56.346569 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: W1204 22:18:56.346578 33572 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: W1204 22:18:56.346585 33572 feature_gate.go:330] unrecognized feature gate: Example Dec 04 22:18:56.355134 master-0 kubenswrapper[33572]: W1204 22:18:56.346592 33572 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346597 33572 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346603 33572 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346608 33572 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346614 33572 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346619 33572 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346624 33572 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346630 33572 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346635 33572 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346642 33572 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346649 33572 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346654 33572 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346660 33572 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346666 33572 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346671 33572 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346677 33572 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346682 33572 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346689 33572 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346695 33572 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 22:18:56.355758 master-0 kubenswrapper[33572]: W1204 22:18:56.346702 33572 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346708 33572 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346713 33572 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346721 33572 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346728 33572 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346734 33572 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346740 33572 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346745 33572 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346752 33572 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346758 33572 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346765 33572 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346771 33572 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346776 33572 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346781 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346790 33572 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346875 33572 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346882 33572 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346888 33572 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346893 33572 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346898 33572 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 22:18:56.356316 master-0 kubenswrapper[33572]: W1204 22:18:56.346903 33572 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346909 33572 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 22:18:56.356819 master-0 
kubenswrapper[33572]: W1204 22:18:56.346914 33572 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346919 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346925 33572 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346930 33572 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346936 33572 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346941 33572 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346947 33572 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346953 33572 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346958 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346964 33572 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346969 33572 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346974 33572 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346980 33572 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346987 33572 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346993 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.346998 33572 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.347003 33572 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.347009 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 22:18:56.356819 master-0 kubenswrapper[33572]: W1204 22:18:56.347019 33572 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.347025 33572 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.347032 33572 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.347038 33572 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.347043 33572 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.347048 33572 
feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.347053 33572 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.347059 33572 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: I1204 22:18:56.347067 33572 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: I1204 22:18:56.355975 33572 server.go:491] "Kubelet version" kubeletVersion="v1.31.13" Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: I1204 22:18:56.356029 33572 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.356153 33572 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.356172 33572 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.356184 33572 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 22:18:56.357384 master-0 kubenswrapper[33572]: W1204 22:18:56.356193 33572 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356205 33572 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
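Once the per-gate warnings finish, feature_gate.go:386 prints the effective upstream gate map in Go's map[...] notation (CloudDualStackNodeIPs:true, KMSv1:true, NodeSwap:false, ...), followed by the kubelet version v1.31.13 and the Go runtime settings. A hedged sketch, again assuming the excerpt is available as plain text, for turning that summary back into a Python dict:

```python
import re

# Matches the summary printed by feature_gate.go:386, e.g.
#   feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}
MAP_RE = re.compile(r"feature gates: \{map\[(?P<body>[^\]]*)\]\}")
PAIR_RE = re.compile(r"(?P<gate>[A-Za-z0-9]+):(?P<value>true|false)")

def parse_feature_gates(text):
    """Return {gate: bool} from the first feature-gate summary found in text."""
    summary = MAP_RE.search(text)
    if summary is None:
        return {}
    return {
        m.group("gate"): m.group("value") == "true"
        for m in PAIR_RE.finditer(summary.group("body"))
    }

if __name__ == "__main__":
    sample = ("feature gates: {map[CloudDualStackNodeIPs:true "
              "DisableKubeletCloudCredentialProviders:true NodeSwap:false]}")
    print(parse_feature_gates(sample))
    # {'CloudDualStackNodeIPs': True,
    #  'DisableKubeletCloudCredentialProviders': True, 'NodeSwap': False}
```

MAP_RE only returns the first summary it finds, which is enough to see which upstream gates were pinned on in this capture (CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1, ValidatingAdmissionPolicy) versus left off.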
Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356217 33572 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356226 33572 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356235 33572 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356244 33572 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356253 33572 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356263 33572 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356271 33572 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356279 33572 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356289 33572 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356299 33572 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356307 33572 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356315 33572 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356325 33572 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356333 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356341 33572 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356349 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356359 33572 feature_gate.go:330] unrecognized feature gate: Example Dec 04 22:18:56.357811 master-0 kubenswrapper[33572]: W1204 22:18:56.356368 33572 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356376 33572 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356383 33572 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356392 33572 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356400 33572 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356408 33572 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: 
W1204 22:18:56.356416 33572 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356424 33572 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356431 33572 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356442 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356450 33572 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356459 33572 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356467 33572 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356474 33572 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356482 33572 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356490 33572 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356505 33572 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356513 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356521 33572 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356530 33572 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 22:18:56.358279 master-0 kubenswrapper[33572]: W1204 22:18:56.356559 33572 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356567 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356575 33572 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356583 33572 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356592 33572 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356600 33572 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356608 33572 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356616 33572 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356624 33572 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356632 33572 feature_gate.go:330] unrecognized feature 
gate: MinimumKubeletVersion Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356640 33572 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356648 33572 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356656 33572 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356664 33572 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356672 33572 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356682 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356691 33572 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356702 33572 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356713 33572 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356723 33572 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 22:18:56.358835 master-0 kubenswrapper[33572]: W1204 22:18:56.356731 33572 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356741 33572 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356749 33572 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356757 33572 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356765 33572 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356774 33572 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356781 33572 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356789 33572 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356797 33572 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.356805 33572 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: I1204 22:18:56.356818 33572 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false 
StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.357060 33572 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.357072 33572 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.357080 33572 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.357088 33572 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 22:18:56.359391 master-0 kubenswrapper[33572]: W1204 22:18:56.357098 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357106 33572 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357117 33572 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357128 33572 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357136 33572 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357145 33572 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357155 33572 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357164 33572 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357174 33572 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357182 33572 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357190 33572 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357198 33572 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357206 33572 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357214 33572 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357222 33572 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357229 33572 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357243 33572 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
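The block above is not new configuration: the same unrecognized-gate and GA/deprecated-gate warnings are emitted again, and the feature gates: {map[...]} summary appears three times in this capture, so the gate set is evidently evaluated more than once during startup. As a hedged sketch (the file name is again an assumption), counting how often each gate is warned about makes it easy to confirm the later blocks are repeats rather than new information:

```python
import re
import sys
from collections import Counter

# One warning is logged per pass for every gate the upstream kubelet does not
# recognize, e.g. "feature_gate.go:330] unrecognized feature gate: GCPLabelsTags".
GATE_RE = re.compile(r"unrecognized feature gate: (?P<gate>\w+)")

def count_unrecognized(path):
    """Count warnings per gate name across the whole excerpt."""
    counts = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            counts.update(m.group("gate") for m in GATE_RE.finditer(line))
    return counts

if __name__ == "__main__":
    # Usage (hypothetical file name): python count_gates.py kubelet.log
    for gate, n in sorted(count_unrecognized(sys.argv[1]).items()):
        print(f"{gate}: {n} warning(s)")
```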
Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357252 33572 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357261 33572 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 22:18:56.359847 master-0 kubenswrapper[33572]: W1204 22:18:56.357269 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357278 33572 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357288 33572 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357297 33572 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357305 33572 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357313 33572 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357321 33572 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357329 33572 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357337 33572 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357345 33572 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357353 33572 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357361 33572 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357369 33572 feature_gate.go:330] unrecognized feature gate: Example Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357377 33572 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357385 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357393 33572 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357401 33572 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357409 33572 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357418 33572 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357426 33572 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 22:18:56.360450 master-0 kubenswrapper[33572]: W1204 22:18:56.357434 33572 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 22:18:56.360953 master-0 
kubenswrapper[33572]: W1204 22:18:56.357442 33572 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357450 33572 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357460 33572 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357471 33572 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357480 33572 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357488 33572 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357496 33572 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357510 33572 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357518 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357527 33572 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357556 33572 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357565 33572 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357575 33572 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357583 33572 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357591 33572 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357599 33572 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357607 33572 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357616 33572 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357624 33572 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 22:18:56.360953 master-0 kubenswrapper[33572]: W1204 22:18:56.357632 33572 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: W1204 22:18:56.357640 33572 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: W1204 22:18:56.357648 33572 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: W1204 22:18:56.357656 33572 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 22:18:56.361491 master-0 
kubenswrapper[33572]: W1204 22:18:56.357664 33572 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: W1204 22:18:56.357672 33572 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: W1204 22:18:56.357679 33572 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: W1204 22:18:56.357687 33572 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: W1204 22:18:56.357695 33572 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: I1204 22:18:56.357707 33572 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: I1204 22:18:56.357985 33572 server.go:940] "Client rotation is on, will bootstrap in background" Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: I1204 22:18:56.361327 33572 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 04 22:18:56.361491 master-0 kubenswrapper[33572]: I1204 22:18:56.361449 33572 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
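At this point the kubelet reports that client certificate rotation is on, that the existing kubeconfig is still valid, and that it is loading the cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". As a hedged sketch only (it assumes the third-party cryptography package is installed and that the PEM file is readable on the node), the validity window of that certificate can be inspected directly:

```python
import re
import sys

from cryptography import x509  # third-party package; assumed to be installed

# kubelet-client-current.pem bundles the key and certificate; pick out the
# first CERTIFICATE block so the parser is not handed the private key.
CERT_BLOCK_RE = re.compile(
    rb"-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----", re.S
)

def cert_validity(path):
    """Return (not_before, not_after) for the first certificate in a PEM file."""
    with open(path, "rb") as fh:
        data = fh.read()
    block = CERT_BLOCK_RE.search(data)
    if block is None:
        raise ValueError(f"no CERTIFICATE block found in {path}")
    cert = x509.load_pem_x509_certificate(block.group(0))
    # Newer cryptography releases expose *_utc variants; fall back otherwise.
    not_before = getattr(cert, "not_valid_before_utc", None) or cert.not_valid_before
    not_after = getattr(cert, "not_valid_after_utc", None) or cert.not_valid_after
    return not_before, not_after

if __name__ == "__main__":
    pem = sys.argv[1] if len(sys.argv) > 1 else "/var/lib/kubelet/pki/kubelet-client-current.pem"
    start, end = cert_validity(pem)
    print(f"{pem}: valid from {start} until {end}")
```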
Dec 04 22:18:56.361905 master-0 kubenswrapper[33572]: I1204 22:18:56.361872 33572 server.go:997] "Starting client certificate rotation" Dec 04 22:18:56.361905 master-0 kubenswrapper[33572]: I1204 22:18:56.361899 33572 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 04 22:18:56.362097 master-0 kubenswrapper[33572]: I1204 22:18:56.362041 33572 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-05 21:50:08 +0000 UTC, rotation deadline is 2025-12-05 14:40:58.193111216 +0000 UTC Dec 04 22:18:56.362097 master-0 kubenswrapper[33572]: I1204 22:18:56.362088 33572 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 16h22m1.831024833s for next certificate rotation Dec 04 22:18:56.363790 master-0 kubenswrapper[33572]: I1204 22:18:56.363485 33572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 22:18:56.366887 master-0 kubenswrapper[33572]: I1204 22:18:56.366838 33572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 22:18:56.370687 master-0 kubenswrapper[33572]: I1204 22:18:56.370625 33572 log.go:25] "Validated CRI v1 runtime API" Dec 04 22:18:56.375921 master-0 kubenswrapper[33572]: I1204 22:18:56.375866 33572 log.go:25] "Validated CRI v1 image API" Dec 04 22:18:56.377153 master-0 kubenswrapper[33572]: I1204 22:18:56.377107 33572 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 04 22:18:56.389718 master-0 kubenswrapper[33572]: I1204 22:18:56.389640 33572 fs.go:135] Filesystem UUIDs: map[4c52ad11-dbba-45ec-8a7c-4164b2d3de92:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 04 22:18:56.391241 master-0 kubenswrapper[33572]: I1204 22:18:56.389683 33572 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/09c1c58555576a7a5ad0c40263cb1b24f2532f6f0895e3889a790f41c5622cf5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/09c1c58555576a7a5ad0c40263cb1b24f2532f6f0895e3889a790f41c5622cf5/userdata/shm major:0 minor:943 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0a8ac4004225e98679de5f00828ef4b72b059bfd913e3c0b107c1aef5ccb1667/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0a8ac4004225e98679de5f00828ef4b72b059bfd913e3c0b107c1aef5ccb1667/userdata/shm major:0 minor:1031 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0c849ebda1ef05c2e7568afd8bbf5411d8e51e42f17fd972708d247af11d0983/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0c849ebda1ef05c2e7568afd8bbf5411d8e51e42f17fd972708d247af11d0983/userdata/shm major:0 minor:1484 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/10dda04dd9f1ca247b35249d5e7333d86ebc7e3902573b98ac28839fb9bcb514/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/10dda04dd9f1ca247b35249d5e7333d86ebc7e3902573b98ac28839fb9bcb514/userdata/shm major:0 minor:1404 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1ed0b431491d7769d0a806c2775a07d37b29bbfb434d8d9d3536f46e64b03c26/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1ed0b431491d7769d0a806c2775a07d37b29bbfb434d8d9d3536f46e64b03c26/userdata/shm major:0 minor:1042 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1f07b0c4938e582ee1cf91b0ddf3d74df5a116aa0098efb24e115a5e8116176b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1f07b0c4938e582ee1cf91b0ddf3d74df5a116aa0098efb24e115a5e8116176b/userdata/shm major:0 minor:968 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1f66bf17e6a9a3d1673b91cfae48275a45a440300946a8bf5061bac66c63db97/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1f66bf17e6a9a3d1673b91cfae48275a45a440300946a8bf5061bac66c63db97/userdata/shm major:0 minor:602 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/21f00834ca375d484385e21696e2d8aa4483b916d5393969f0246cfd9dc0471a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/21f00834ca375d484385e21696e2d8aa4483b916d5393969f0246cfd9dc0471a/userdata/shm major:0 minor:761 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/264531cb97973b0deb400a67899ce39a8e7e6bd105e2fd0acd10b7958dc4add3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/264531cb97973b0deb400a67899ce39a8e7e6bd105e2fd0acd10b7958dc4add3/userdata/shm major:0 minor:924 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/28f1828f69f1c4d0d02aa7e938a400a27e1212d35ae3b194af92227cb1a24b54/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/28f1828f69f1c4d0d02aa7e938a400a27e1212d35ae3b194af92227cb1a24b54/userdata/shm major:0 minor:388 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2d1e4c21a00903c83707b4497502b9b73283be3d30ce9a5aaab7e54bf8f72dba/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2d1e4c21a00903c83707b4497502b9b73283be3d30ce9a5aaab7e54bf8f72dba/userdata/shm major:0 minor:928 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0/userdata/shm major:0 minor:341 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/331b7dcdf8e2d87b8a376fb7581bf5553e5209af627a95ddb5944fe5be12cb6d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/331b7dcdf8e2d87b8a376fb7581bf5553e5209af627a95ddb5944fe5be12cb6d/userdata/shm major:0 minor:1020 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3a3d3c261f26f43a69f444c9052f030961843ae0bbf89248b5cd01b597da7064/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3a3d3c261f26f43a69f444c9052f030961843ae0bbf89248b5cd01b597da7064/userdata/shm major:0 minor:369 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3aa682501427a1a26306dd6e6ffbe29276935fb92a5916c957736c383157a162/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3aa682501427a1a26306dd6e6ffbe29276935fb92a5916c957736c383157a162/userdata/shm major:0 minor:132 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/479597d06e399852cde3f4983981240e3b9a935772d2dd22d716d20e734ab158/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/479597d06e399852cde3f4983981240e3b9a935772d2dd22d716d20e734ab158/userdata/shm major:0 minor:601 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4c52307b147fc1f96631f9272147cbdbb3ffe8d871369692fc386dc96586c86f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4c52307b147fc1f96631f9272147cbdbb3ffe8d871369692fc386dc96586c86f/userdata/shm major:0 minor:677 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4d378b74b84c73d0247bb3e1b1ed1084c6d9b778481316b2ed8d047f73fdee53/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4d378b74b84c73d0247bb3e1b1ed1084c6d9b778481316b2ed8d047f73fdee53/userdata/shm major:0 minor:1041 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4e849604a662b099203e2c576ae634e61f44af1ebad609cae720f6f60b5023c0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4e849604a662b099203e2c576ae634e61f44af1ebad609cae720f6f60b5023c0/userdata/shm major:0 minor:1427 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4f2b4d655e0bf0ff49528fd7206e3f4bbadf9c2ae53c24cb531027c7c4811ac5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4f2b4d655e0bf0ff49528fd7206e3f4bbadf9c2ae53c24cb531027c7c4811ac5/userdata/shm major:0 minor:108 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4f7c6491dc22a01b1d98b832f138731bbe081d987cf8c5e14ed87abcbbcb568a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4f7c6491dc22a01b1d98b832f138731bbe081d987cf8c5e14ed87abcbbcb568a/userdata/shm major:0 minor:89 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4fb8b7eb82d3c4d48b5f1cf2512cb01480c4c7ee72d7b27dc0bc0ec05cc4c756/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4fb8b7eb82d3c4d48b5f1cf2512cb01480c4c7ee72d7b27dc0bc0ec05cc4c756/userdata/shm major:0 minor:455 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4fc051e954a566d97cf4dcb3626713517bc5479301f571be1eec860a1f2d884c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4fc051e954a566d97cf4dcb3626713517bc5479301f571be1eec860a1f2d884c/userdata/shm major:0 minor:435 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e/userdata/shm major:0 minor:316 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde/userdata/shm major:0 minor:335 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/56273487d972eb4bce7bb2a2d532d0ecc4790cf320f72d236deb00d6ee12d734/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/56273487d972eb4bce7bb2a2d532d0ecc4790cf320f72d236deb00d6ee12d734/userdata/shm major:0 minor:901 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f/userdata/shm major:0 minor:314 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5af5cfe128eaa351f012440567883f2b0f5ad3e1b0e50ea2b67166561450dd28/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5af5cfe128eaa351f012440567883f2b0f5ad3e1b0e50ea2b67166561450dd28/userdata/shm major:0 minor:554 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/60aac9ad737c32ff74467da7a19ec918b06f0d3f5c0137f4d12c177366392be7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/60aac9ad737c32ff74467da7a19ec918b06f0d3f5c0137f4d12c177366392be7/userdata/shm major:0 minor:373 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/650ca7f20c2d1cb1f57ba5643ad53b21f17eea7d93316d18d3c9ccbd27770c35/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/650ca7f20c2d1cb1f57ba5643ad53b21f17eea7d93316d18d3c9ccbd27770c35/userdata/shm major:0 minor:598 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878/userdata/shm major:0 minor:323 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b/userdata/shm major:0 minor:330 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/73496f020ec19048256b7ee616b5604b8f6faef21ddc2795a2639ad6cafa0a2c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/73496f020ec19048256b7ee616b5604b8f6faef21ddc2795a2639ad6cafa0a2c/userdata/shm major:0 minor:1488 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/773c5e8477795f70534d191a1a57bd8d7b1ab26d3d0db825f97bcaae1e3dd144/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/773c5e8477795f70534d191a1a57bd8d7b1ab26d3d0db825f97bcaae1e3dd144/userdata/shm major:0 minor:1402 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d02c679c1b193ea195c44f77ce5059c11b500930cda814d106399c1a88668f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d02c679c1b193ea195c44f77ce5059c11b500930cda814d106399c1a88668f1/userdata/shm major:0 minor:538 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d740441407a329d552a22d88957f884c55899842a2703505cfb149663a1e6ff/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d740441407a329d552a22d88957f884c55899842a2703505cfb149663a1e6ff/userdata/shm major:0 minor:972 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7e789ca56ff169f6e94ee218684222ae21d64421f7ece2b51fa33799d9ab1ccc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7e789ca56ff169f6e94ee218684222ae21d64421f7ece2b51fa33799d9ab1ccc/userdata/shm major:0 minor:1070 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8322a14628afc897780b45b61f17a00da7b18029f93a7a74b52c8e380031ff4f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8322a14628afc897780b45b61f17a00da7b18029f93a7a74b52c8e380031ff4f/userdata/shm major:0 minor:1673 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/857d5516010228f43de819bef01594930c5dd807f2fd4da2fb1509f797fa1774/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/857d5516010228f43de819bef01594930c5dd807f2fd4da2fb1509f797fa1774/userdata/shm major:0 minor:1490 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe/userdata/shm major:0 minor:1568 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8ccadfcf02bee77f4b3f98d491b1ee8f4b7c03cb21fbe9104543e0f3a4c0e10a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8ccadfcf02bee77f4b3f98d491b1ee8f4b7c03cb21fbe9104543e0f3a4c0e10a/userdata/shm major:0 minor:1132 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8f9c23aa8f546cd0849cf585c0cd6540010999bcb6db0df49677dfec81935af9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8f9c23aa8f546cd0849cf585c0cd6540010999bcb6db0df49677dfec81935af9/userdata/shm major:0 minor:1237 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a/userdata/shm major:0 minor:327 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/99da9d5b3d27d57501f5191969d7c3ca653c3d4bf3252f476bdc359e5ff9e271/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/99da9d5b3d27d57501f5191969d7c3ca653c3d4bf3252f476bdc359e5ff9e271/userdata/shm major:0 minor:476 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9c65ccceea882cd7d898803d924bbc08d4f3c8fef9c388b0db802fee0be2d9fc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9c65ccceea882cd7d898803d924bbc08d4f3c8fef9c388b0db802fee0be2d9fc/userdata/shm major:0 minor:831 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9ff695d5f754bc1ab4c8a5e3ad28cb942f185e2cf70cdd2d8a2eeb6d3f679b39/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9ff695d5f754bc1ab4c8a5e3ad28cb942f185e2cf70cdd2d8a2eeb6d3f679b39/userdata/shm major:0 minor:1400 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376/userdata/shm major:0 minor:333 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aab8d5d7d7caaf80016fb84803d68f187962ac87f50be8e340ea0edecd46547b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aab8d5d7d7caaf80016fb84803d68f187962ac87f50be8e340ea0edecd46547b/userdata/shm major:0 minor:1024 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b/userdata/shm major:0 minor:319 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b5aeee4aff8bc78bf6e36ba938b9609781a2d34417a21d03fdc2c6a101065131/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b5aeee4aff8bc78bf6e36ba938b9609781a2d34417a21d03fdc2c6a101065131/userdata/shm major:0 minor:593 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b9ac4ee53782e9fd4b340ed2b43fd3025db3cb82bd0881252f116248836951ce/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b9ac4ee53782e9fd4b340ed2b43fd3025db3cb82bd0881252f116248836951ce/userdata/shm major:0 minor:600 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c33c6e351b6426a43cd389bbd81cef5f132f38999fc440de5ea48da556537499/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c33c6e351b6426a43cd389bbd81cef5f132f38999fc440de5ea48da556537499/userdata/shm major:0 minor:1111 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c382efd8856765d7e3a7c1f5148c4c397e023bc0d7fa9282b9bc8277f0af2687/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c382efd8856765d7e3a7c1f5148c4c397e023bc0d7fa9282b9bc8277f0af2687/userdata/shm major:0 minor:160 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c83e316239457de6d2cf065ee11c69192c6233457017b9e9bdae1e03d84ad9fc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c83e316239457de6d2cf065ee11c69192c6233457017b9e9bdae1e03d84ad9fc/userdata/shm major:0 minor:536 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c9c2e653f3b9114eb61aa0f9377a8f57b59a4f433c04f9168d0a7788bc429f4a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c9c2e653f3b9114eb61aa0f9377a8f57b59a4f433c04f9168d0a7788bc429f4a/userdata/shm major:0 minor:67 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/caf36cfc8384a756669c5effc9f040f914b8e0fafbb77841a2ef74350bfc51bf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/caf36cfc8384a756669c5effc9f040f914b8e0fafbb77841a2ef74350bfc51bf/userdata/shm major:0 minor:491 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cf7bfab376e6cd33db16a88818ab6aaf3d6cee23a9706c34952502952f7ad2f6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cf7bfab376e6cd33db16a88818ab6aaf3d6cee23a9706c34952502952f7ad2f6/userdata/shm major:0 minor:799 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cf87cc00ba78c6e3cc8680200b1afa8d433e342bc7744db35e1f64b4a3e5a078/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cf87cc00ba78c6e3cc8680200b1afa8d433e342bc7744db35e1f64b4a3e5a078/userdata/shm major:0 minor:540 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d20bd8f7df5ba066210630b0736a496814188135f419c1b214059e7501c8fbf9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d20bd8f7df5ba066210630b0736a496814188135f419c1b214059e7501c8fbf9/userdata/shm major:0 minor:187 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b/userdata/shm major:0 minor:161 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc/userdata/shm major:0 minor:143 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/db48b3f4e1a29b857cceceb534352169660eed12f652161e8b97983c91525c06/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/db48b3f4e1a29b857cceceb534352169660eed12f652161e8b97983c91525c06/userdata/shm major:0 minor:1043 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/de08a4b22951aecb57c029d3a74e638dc3b7212560569cf21e54820113aad20f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/de08a4b22951aecb57c029d3a74e638dc3b7212560569cf21e54820113aad20f/userdata/shm major:0 minor:1397 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df11c4a8f3347747aecb87e080a5126c781276285c92708ad28d5159ae4229dc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df11c4a8f3347747aecb87e080a5126c781276285c92708ad28d5159ae4229dc/userdata/shm major:0 minor:481 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223/userdata/shm major:0 minor:337 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e976d29655bd6a4804e44aefca912f97ea12da492eb732dd0d56f5c8ee61e225/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e976d29655bd6a4804e44aefca912f97ea12da492eb732dd0d56f5c8ee61e225/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2/userdata/shm major:0 minor:322 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ebcf83d7998d4cc60b59e0a4ee1b7d80a2c88668db04583317334cbd65922154/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ebcf83d7998d4cc60b59e0a4ee1b7d80a2c88668db04583317334cbd65922154/userdata/shm major:0 minor:226 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe/userdata/shm major:0 minor:64 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f09554581756c0f7969c0e40141de26184513807dc2c20e4d5041730923d9d0c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f09554581756c0f7969c0e40141de26184513807dc2c20e4d5041730923d9d0c/userdata/shm major:0 minor:747 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f19df2e06dca5a2f80ab8037e49629477ed1cac1328bfc7445b4bdab076568fc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f19df2e06dca5a2f80ab8037e49629477ed1cac1328bfc7445b4bdab076568fc/userdata/shm major:0 minor:993 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0/userdata/shm major:0 minor:339 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fc5560b4f5b75417c091bf8b734a41efa2795ce2d8cceb8a89a66960f1ba3320/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fc5560b4f5b75417c091bf8b734a41efa2795ce2d8cceb8a89a66960f1ba3320/userdata/shm major:0 minor:596 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fe41c35d4fc12b10f7c0380ded0175f838a7cb9e3aad0aa5a08446be17e65126/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fe41c35d4fc12b10f7c0380ded0175f838a7cb9e3aad0aa5a08446be17e65126/userdata/shm major:0 minor:555 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0173b8a7-07b4-407a-80b6-d86754072fd8/volumes/kubernetes.io~projected/kube-api-access-z4cdh:{mountpoint:/var/lib/kubelet/pods/0173b8a7-07b4-407a-80b6-d86754072fd8/volumes/kubernetes.io~projected/kube-api-access-z4cdh major:0 minor:475 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~projected/kube-api-access-k8jmv:{mountpoint:/var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~projected/kube-api-access-k8jmv major:0 minor:1481 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1479 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1475 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~projected/kube-api-access-dvrr5:{mountpoint:/var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~projected/kube-api-access-dvrr5 major:0 minor:311 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:534 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:532 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~projected/kube-api-access-smtnh:{mountpoint:/var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~projected/kube-api-access-smtnh major:0 minor:1482 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config major:0 minor:1477 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1478 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~projected/kube-api-access major:0 minor:329 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~secret/serving-cert major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29828f55-427b-4fe3-8713-03bcd6ac9dec/volumes/kubernetes.io~projected/kube-api-access-t9rxt:{mountpoint:/var/lib/kubelet/pods/29828f55-427b-4fe3-8713-03bcd6ac9dec/volumes/kubernetes.io~projected/kube-api-access-t9rxt major:0 minor:1124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae/volumes/kubernetes.io~projected/kube-api-access-mwx5k:{mountpoint:/var/lib/kubelet/pods/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae/volumes/kubernetes.io~projected/kube-api-access-mwx5k major:0 minor:1120 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2d142201-6e77-4828-b86b-05d4144a2f08/volumes/kubernetes.io~projected/kube-api-access-6cblk:{mountpoint:/var/lib/kubelet/pods/2d142201-6e77-4828-b86b-05d4144a2f08/volumes/kubernetes.io~projected/kube-api-access-6cblk major:0 minor:1029 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2d142201-6e77-4828-b86b-05d4144a2f08/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/2d142201-6e77-4828-b86b-05d4144a2f08/volumes/kubernetes.io~secret/serving-cert major:0 minor:1028 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/34cad3de-8f3f-48cd-bd39-8745fad19e65/volumes/kubernetes.io~projected/kube-api-access-tq55v:{mountpoint:/var/lib/kubelet/pods/34cad3de-8f3f-48cd-bd39-8745fad19e65/volumes/kubernetes.io~projected/kube-api-access-tq55v major:0 minor:1672 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/34cad3de-8f3f-48cd-bd39-8745fad19e65/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/34cad3de-8f3f-48cd-bd39-8745fad19e65/volumes/kubernetes.io~secret/webhook-certs major:0 minor:1671 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:308 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/kube-api-access-m4cct:{mountpoint:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/kube-api-access-m4cct major:0 minor:321 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:589 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~projected/kube-api-access-9wh6b:{mountpoint:/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~projected/kube-api-access-9wh6b major:0 
minor:157 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:156 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~projected/kube-api-access-jwk6f:{mountpoint:/var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~projected/kube-api-access-jwk6f major:0 minor:291 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~secret/serving-cert major:0 minor:288 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~projected/kube-api-access-4mttq:{mountpoint:/var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~projected/kube-api-access-4mttq major:0 minor:293 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:286 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d68dcb1-efe4-425f-9b28-1e5575548a32/volumes/kubernetes.io~projected/kube-api-access-r6vjb:{mountpoint:/var/lib/kubelet/pods/4d68dcb1-efe4-425f-9b28-1e5575548a32/volumes/kubernetes.io~projected/kube-api-access-r6vjb major:0 minor:490 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d68dcb1-efe4-425f-9b28-1e5575548a32/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/4d68dcb1-efe4-425f-9b28-1e5575548a32/volumes/kubernetes.io~secret/signing-key major:0 minor:489 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/volumes/kubernetes.io~projected/kube-api-access-vd6d8:{mountpoint:/var/lib/kubelet/pods/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/volumes/kubernetes.io~projected/kube-api-access-vd6d8 major:0 minor:480 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/510a595a-21bf-48fc-85cd-707bc8f5536f/volumes/kubernetes.io~projected/kube-api-access-gfhgj:{mountpoint:/var/lib/kubelet/pods/510a595a-21bf-48fc-85cd-707bc8f5536f/volumes/kubernetes.io~projected/kube-api-access-gfhgj major:0 minor:408 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~projected/kube-api-access-g5nkh:{mountpoint:/var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~projected/kube-api-access-g5nkh major:0 minor:306 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:586 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5598683a-cd32-486d-8839-205829d55cc2/volumes/kubernetes.io~projected/kube-api-access-s2lwr:{mountpoint:/var/lib/kubelet/pods/5598683a-cd32-486d-8839-205829d55cc2/volumes/kubernetes.io~projected/kube-api-access-s2lwr major:0 minor:990 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5598683a-cd32-486d-8839-205829d55cc2/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/5598683a-cd32-486d-8839-205829d55cc2/volumes/kubernetes.io~secret/cert major:0 minor:989 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/55c4f1e1-1b78-45ec-915d-8055ab3e2786/volumes/kubernetes.io~projected/kube-api-access-b8g99:{mountpoint:/var/lib/kubelet/pods/55c4f1e1-1b78-45ec-915d-8055ab3e2786/volumes/kubernetes.io~projected/kube-api-access-b8g99 major:0 minor:1052 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/55c4f1e1-1b78-45ec-915d-8055ab3e2786/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/55c4f1e1-1b78-45ec-915d-8055ab3e2786/volumes/kubernetes.io~secret/proxy-tls major:0 minor:1051 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~projected/kube-api-access major:0 minor:312 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~secret/serving-cert major:0 minor:295 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~projected/kube-api-access-d4rft:{mountpoint:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~projected/kube-api-access-d4rft major:0 minor:159 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:158 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~projected/kube-api-access-bfcv9:{mountpoint:/var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~projected/kube-api-access-bfcv9 major:0 minor:564 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/encryption-config major:0 minor:562 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/etcd-client major:0 minor:563 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/serving-cert major:0 minor:621 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5dac8e25-0f51-4c04-929c-060479689a9d/volumes/kubernetes.io~projected/kube-api-access-ch6s4:{mountpoint:/var/lib/kubelet/pods/5dac8e25-0f51-4c04-929c-060479689a9d/volumes/kubernetes.io~projected/kube-api-access-ch6s4 major:0 minor:146 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5dac8e25-0f51-4c04-929c-060479689a9d/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/5dac8e25-0f51-4c04-929c-060479689a9d/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:73 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~projected/kube-api-access-xgt75:{mountpoint:/var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~projected/kube-api-access-xgt75 major:0 minor:186 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~secret/webhook-cert major:0 minor:185 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/651c0fad-1577-4a7f-8718-ec2fd2f06c3e/volumes/kubernetes.io~projected/kube-api-access-n8gh2:{mountpoint:/var/lib/kubelet/pods/651c0fad-1577-4a7f-8718-ec2fd2f06c3e/volumes/kubernetes.io~projected/kube-api-access-n8gh2 major:0 minor:387 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/651c0fad-1577-4a7f-8718-ec2fd2f06c3e/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/651c0fad-1577-4a7f-8718-ec2fd2f06c3e/volumes/kubernetes.io~secret/cert major:0 minor:386 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~projected/kube-api-access-qkg8s:{mountpoint:/var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~projected/kube-api-access-qkg8s major:0 minor:1395 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:140 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~projected/kube-api-access-wsxkk:{mountpoint:/var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~projected/kube-api-access-wsxkk major:0 minor:325 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~secret/serving-cert major:0 minor:297 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6c8c45e0-2342-499b-aa6b-339b6a722a87/volumes/kubernetes.io~projected/kube-api-access-gcgg9:{mountpoint:/var/lib/kubelet/pods/6c8c45e0-2342-499b-aa6b-339b6a722a87/volumes/kubernetes.io~projected/kube-api-access-gcgg9 major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~projected/kube-api-access-w82st:{mountpoint:/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~projected/kube-api-access-w82st major:0 minor:1567 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/client-ca-bundle major:0 
minor:1565 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:1564 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:1566 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/74197c50-9a41-40e8-9289-c7e6afbd3737/volumes/kubernetes.io~projected/kube-api-access-hq44d:{mountpoint:/var/lib/kubelet/pods/74197c50-9a41-40e8-9289-c7e6afbd3737/volumes/kubernetes.io~projected/kube-api-access-hq44d major:0 minor:47 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/74197c50-9a41-40e8-9289-c7e6afbd3737/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/74197c50-9a41-40e8-9289-c7e6afbd3737/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:46 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/volumes/kubernetes.io~projected/kube-api-access-4g7n9:{mountpoint:/var/lib/kubelet/pods/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/volumes/kubernetes.io~projected/kube-api-access-4g7n9 major:0 minor:1098 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:1097 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76fd9f44-4365-4271-8772-025655c50334/volumes/kubernetes.io~projected/kube-api-access-9j8fr:{mountpoint:/var/lib/kubelet/pods/76fd9f44-4365-4271-8772-025655c50334/volumes/kubernetes.io~projected/kube-api-access-9j8fr major:0 minor:142 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~projected/kube-api-access-s5t2f:{mountpoint:/var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~projected/kube-api-access-s5t2f major:0 minor:310 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~secret/serving-cert major:0 minor:300 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/800f436c-145d-4281-8d4d-644ba2cb0ebb/volumes/kubernetes.io~projected/kube-api-access-ngkqz:{mountpoint:/var/lib/kubelet/pods/800f436c-145d-4281-8d4d-644ba2cb0ebb/volumes/kubernetes.io~projected/kube-api-access-ngkqz major:0 minor:967 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/800f436c-145d-4281-8d4d-644ba2cb0ebb/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/800f436c-145d-4281-8d4d-644ba2cb0ebb/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:966 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/810c363b-a4c7-428d-a2fb-285adc29f477/volumes/kubernetes.io~projected/kube-api-access-2nrj9:{mountpoint:/var/lib/kubelet/pods/810c363b-a4c7-428d-a2fb-285adc29f477/volumes/kubernetes.io~projected/kube-api-access-2nrj9 major:0 minor:1023 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/810c363b-a4c7-428d-a2fb-285adc29f477/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/810c363b-a4c7-428d-a2fb-285adc29f477/volumes/kubernetes.io~secret/serving-cert major:0 minor:1022 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~projected/kube-api-access-8w592:{mountpoint:/var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~projected/kube-api-access-8w592 major:0 minor:294 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:638 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~projected/kube-api-access-lclkg:{mountpoint:/var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~projected/kube-api-access-lclkg major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~secret/metrics-tls major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~projected/kube-api-access-b9z4k:{mountpoint:/var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~projected/kube-api-access-b9z4k major:0 minor:1483 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1480 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1476 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb/volumes/kubernetes.io~projected/kube-api-access-jc47q:{mountpoint:/var/lib/kubelet/pods/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb/volumes/kubernetes.io~projected/kube-api-access-jc47q major:0 minor:1129 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb/volumes/kubernetes.io~secret/proxy-tls major:0 minor:1128 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~projected/kube-api-access-hsk29:{mountpoint:/var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~projected/kube-api-access-hsk29 major:0 minor:1015 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:1013 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~secret/srv-cert major:0 minor:1014 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~projected/kube-api-access-7fmp4:{mountpoint:/var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~projected/kube-api-access-7fmp4 major:0 minor:627 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/encryption-config major:0 minor:558 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/etcd-client major:0 minor:556 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/serving-cert major:0 minor:559 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a043ea49-97f9-4ae6-83b9-733f12754d94/volumes/kubernetes.io~projected/kube-api-access-6jlvp:{mountpoint:/var/lib/kubelet/pods/a043ea49-97f9-4ae6-83b9-733f12754d94/volumes/kubernetes.io~projected/kube-api-access-6jlvp major:0 minor:1034 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a043ea49-97f9-4ae6-83b9-733f12754d94/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/a043ea49-97f9-4ae6-83b9-733f12754d94/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:1033 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~projected/kube-api-access-pt2jq:{mountpoint:/var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~projected/kube-api-access-pt2jq major:0 minor:1005 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~secret/cert major:0 minor:1003 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:1004 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~projected/kube-api-access-scht6:{mountpoint:/var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~projected/kube-api-access-scht6 major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~secret/serving-cert major:0 minor:302 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a5c2d3b8-41c0-4531-b770-57b7c567fe30/volumes/kubernetes.io~projected/kube-api-access-5vpbl:{mountpoint:/var/lib/kubelet/pods/a5c2d3b8-41c0-4531-b770-57b7c567fe30/volumes/kubernetes.io~projected/kube-api-access-5vpbl major:0 minor:796 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5c2d3b8-41c0-4531-b770-57b7c567fe30/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/a5c2d3b8-41c0-4531-b770-57b7c567fe30/volumes/kubernetes.io~secret/metrics-tls major:0 minor:762 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b/volumes/kubernetes.io~projected/kube-api-access-g2ghk:{mountpoint:/var/lib/kubelet/pods/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b/volumes/kubernetes.io~projected/kube-api-access-g2ghk major:0 minor:971 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:848 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a8636bd7-fa9e-44b9-82df-9d37b398736d/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/a8636bd7-fa9e-44b9-82df-9d37b398736d/volumes/kubernetes.io~projected/kube-api-access major:0 minor:795 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a8636bd7-fa9e-44b9-82df-9d37b398736d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a8636bd7-fa9e-44b9-82df-9d37b398736d/volumes/kubernetes.io~secret/serving-cert major:0 minor:792 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:332 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/kube-api-access-lr65l:{mountpoint:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/kube-api-access-lr65l major:0 minor:317 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~secret/metrics-tls major:0 minor:588 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae107ad4-104c-4264-9844-afb3af28b19e/volumes/kubernetes.io~projected/kube-api-access-9gj4j:{mountpoint:/var/lib/kubelet/pods/ae107ad4-104c-4264-9844-afb3af28b19e/volumes/kubernetes.io~projected/kube-api-access-9gj4j major:0 minor:1127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b86ff0e8-2c72-4dc6-ac55-3c21940d044f/volumes/kubernetes.io~projected/kube-api-access-2vpxd:{mountpoint:/var/lib/kubelet/pods/b86ff0e8-2c72-4dc6-ac55-3c21940d044f/volumes/kubernetes.io~projected/kube-api-access-2vpxd major:0 minor:927 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b86ff0e8-2c72-4dc6-ac55-3c21940d044f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b86ff0e8-2c72-4dc6-ac55-3c21940d044f/volumes/kubernetes.io~secret/serving-cert major:0 minor:926 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b966c210-5415-4fa5-88ab-c85aba979b28/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/b966c210-5415-4fa5-88ab-c85aba979b28/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1394 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/bda1cb0d-26cf-4b94-b359-432492112888/volumes/kubernetes.io~projected/kube-api-access-8r2fn:{mountpoint:/var/lib/kubelet/pods/bda1cb0d-26cf-4b94-b359-432492112888/volumes/kubernetes.io~projected/kube-api-access-8r2fn major:0 minor:1399 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~projected/kube-api-access-pctsn:{mountpoint:/var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~projected/kube-api-access-pctsn major:0 minor:1398 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/default-certificate major:0 minor:1392 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1393 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/stats-auth major:0 minor:1396 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c2279404-fa75-4de2-a302-d7b15ead5232/volumes/kubernetes.io~projected/kube-api-access-dd5zx:{mountpoint:/var/lib/kubelet/pods/c2279404-fa75-4de2-a302-d7b15ead5232/volumes/kubernetes.io~projected/kube-api-access-dd5zx major:0 minor:760 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c3863c74-8f22-4c67-bef5-2d0d39df4abd/volumes/kubernetes.io~projected/kube-api-access-pc4z5:{mountpoint:/var/lib/kubelet/pods/c3863c74-8f22-4c67-bef5-2d0d39df4abd/volumes/kubernetes.io~projected/kube-api-access-pc4z5 major:0 minor:732 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c3863c74-8f22-4c67-bef5-2d0d39df4abd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c3863c74-8f22-4c67-bef5-2d0d39df4abd/volumes/kubernetes.io~secret/serving-cert major:0 minor:506 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~projected/kube-api-access-gpksd:{mountpoint:/var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~projected/kube-api-access-gpksd major:0 minor:1123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:1121 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~secret/webhook-cert major:0 minor:1122 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~projected/kube-api-access-bfklr:{mountpoint:/var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~projected/kube-api-access-bfklr major:0 minor:289 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:587 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/ce6002bb-4948-45ab-bb1d-ed65e86b6466/volumes/kubernetes.io~projected/kube-api-access-87gv4:{mountpoint:/var/lib/kubelet/pods/ce6002bb-4948-45ab-bb1d-ed65e86b6466/volumes/kubernetes.io~projected/kube-api-access-87gv4 major:0 minor:1130 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce6b5a46-172b-4575-ba22-ff3c6ea4207f/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/ce6b5a46-172b-4575-ba22-ff3c6ea4207f/volumes/kubernetes.io~projected/ca-certs major:0 minor:560 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce6b5a46-172b-4575-ba22-ff3c6ea4207f/volumes/kubernetes.io~projected/kube-api-access-2cfhv:{mountpoint:/var/lib/kubelet/pods/ce6b5a46-172b-4575-ba22-ff3c6ea4207f/volumes/kubernetes.io~projected/kube-api-access-2cfhv major:0 minor:626 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~projected/kube-api-access-c7l9n:{mountpoint:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~projected/kube-api-access-c7l9n major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/etcd-client major:0 minor:296 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/serving-cert major:0 minor:298 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c/volumes/kubernetes.io~projected/kube-api-access-w2ndk:{mountpoint:/var/lib/kubelet/pods/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c/volumes/kubernetes.io~projected/kube-api-access-w2ndk major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~projected/kube-api-access-kcw8f:{mountpoint:/var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~projected/kube-api-access-kcw8f major:0 minor:305 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~secret/metrics-tls major:0 minor:590 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~projected/kube-api-access major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~secret/serving-cert major:0 minor:301 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e37d318a-5bf8-46ed-b6de-494102738da7/volumes/kubernetes.io~projected/kube-api-access-r57bb:{mountpoint:/var/lib/kubelet/pods/e37d318a-5bf8-46ed-b6de-494102738da7/volumes/kubernetes.io~projected/kube-api-access-r57bb major:0 minor:292 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~projected/kube-api-access-c7d9j:{mountpoint:/var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~projected/kube-api-access-c7d9j major:0 minor:1030 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:1027 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~secret/srv-cert major:0 minor:1026 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~projected/kube-api-access-2w8vs:{mountpoint:/var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~projected/kube-api-access-2w8vs major:0 minor:1453 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~secret/certs major:0 minor:1452 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1451 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee/volumes/kubernetes.io~projected/kube-api-access-xkrvr:{mountpoint:/var/lib/kubelet/pods/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee/volumes/kubernetes.io~projected/kube-api-access-xkrvr major:0 minor:1236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee/volumes/kubernetes.io~secret/proxy-tls major:0 minor:1235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~projected/kube-api-access-xdbpk:{mountpoint:/var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~projected/kube-api-access-xdbpk major:0 minor:153 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~secret/metrics-certs major:0 minor:592 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f1534e25-7add-46a1-8f4e-0065c232aa4e/volumes/kubernetes.io~projected/kube-api-access-4d7pj:{mountpoint:/var/lib/kubelet/pods/f1534e25-7add-46a1-8f4e-0065c232aa4e/volumes/kubernetes.io~projected/kube-api-access-4d7pj major:0 minor:923 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f1534e25-7add-46a1-8f4e-0065c232aa4e/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/f1534e25-7add-46a1-8f4e-0065c232aa4e/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:922 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~projected/kube-api-access-g8d54:{mountpoint:/var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~projected/kube-api-access-g8d54 major:0 minor:290 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~secret/serving-cert major:0 minor:287 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~projected/ca-certs major:0 minor:561 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~projected/kube-api-access-r4czl:{mountpoint:/var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~projected/kube-api-access-r4czl major:0 minor:625 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:557 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:577 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~empty-dir/tmp major:0 minor:576 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~projected/kube-api-access-c6b4p:{mountpoint:/var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~projected/kube-api-access-c6b4p major:0 minor:578 fsType:tmpfs blockSize:0} overlay_0-1001:{mountpoint:/var/lib/containers/storage/overlay/5f5ea99e8b570ca894ef371020cb7453cb05195581fe5da0ffa377e3863359ac/merged major:0 minor:1001 fsType:overlay blockSize:0} overlay_0-103:{mountpoint:/var/lib/containers/storage/overlay/5ff948d3cf864a88c726cea5545f3c8fc0dd0c3d629e967faaef6847cc8acc6b/merged major:0 minor:103 fsType:overlay blockSize:0} overlay_0-1035:{mountpoint:/var/lib/containers/storage/overlay/c801ba228d4770225e33a36204559b27fc3dfd3a4ace848e349920ec63a16324/merged major:0 minor:1035 fsType:overlay blockSize:0} overlay_0-1037:{mountpoint:/var/lib/containers/storage/overlay/3ce73e612ec1f2beee3c05c90a0000ca8dbf5605600463d6826fc72c2c0792e0/merged major:0 minor:1037 fsType:overlay blockSize:0} overlay_0-1039:{mountpoint:/var/lib/containers/storage/overlay/834d7cdd9adee3b4b9e5d66aa34ccd8fe31fb14480473d9ead95a70c100a2041/merged major:0 minor:1039 fsType:overlay blockSize:0} overlay_0-1047:{mountpoint:/var/lib/containers/storage/overlay/f24b383e7accfbea057e1627b6853551121e62a09e28b62e5faba7f7e5e3fb05/merged major:0 minor:1047 fsType:overlay blockSize:0} overlay_0-1049:{mountpoint:/var/lib/containers/storage/overlay/7f3ac19a3f3cfbfffa62ee7a92dc496c94b8bb8d399bdc0213d12bd45c26c352/merged major:0 minor:1049 fsType:overlay blockSize:0} overlay_0-105:{mountpoint:/var/lib/containers/storage/overlay/d608c5f5ff0229eecd19e9aad38ff91eb0775b298dd2a5be32917daf52e2bc41/merged major:0 minor:105 fsType:overlay blockSize:0} overlay_0-1066:{mountpoint:/var/lib/containers/storage/overlay/6dfdd7c405de2ac209b72e9a0fb0c0e373c265ff9a5fc1481403c74f16bd18de/merged major:0 minor:1066 fsType:overlay blockSize:0} overlay_0-1078:{mountpoint:/var/lib/containers/storage/overlay/ac912a17edc3a6bc6e03ef1b11a2db9d9a770356c11c2be62ba0fae4104691b7/merged major:0 minor:1078 fsType:overlay blockSize:0} overlay_0-1082:{mountpoint:/var/lib/containers/storage/overlay/c5b91ee4f252dcf3e02a7094a4ba854eedcffa9a6f46b6b3cb1c0abd6e61e01d/merged major:0 
minor:1082 fsType:overlay blockSize:0} overlay_0-1084:{mountpoint:/var/lib/containers/storage/overlay/686a29dff8e510bed28bcebeb8c8fe4502e9eb87162dfa1bc98de79e1889313d/merged major:0 minor:1084 fsType:overlay blockSize:0} overlay_0-1086:{mountpoint:/var/lib/containers/storage/overlay/69a41c0cf154b6a2c3d27a60c6b08c62d343dfb189199f44971b4b6113a350ac/merged major:0 minor:1086 fsType:overlay blockSize:0} overlay_0-1088:{mountpoint:/var/lib/containers/storage/overlay/53e807b80de0e83ea3a848e9c47f86f3c5437cb9ee29bde268d28b4e312db21d/merged major:0 minor:1088 fsType:overlay blockSize:0} overlay_0-1099:{mountpoint:/var/lib/containers/storage/overlay/797e7e05baeec3c7ee860402804046230c24166759d1cca513040af94fcbb37d/merged major:0 minor:1099 fsType:overlay blockSize:0} overlay_0-1101:{mountpoint:/var/lib/containers/storage/overlay/2691ff52d6e2107e3caeeeee2e6255a83282fa4a560841b73e70e5a89e65d058/merged major:0 minor:1101 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/33d7a5b61813d5ed96b775a4f10904b11d2117caf39f483d4a468177f5468a6b/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-1110:{mountpoint:/var/lib/containers/storage/overlay/0d27d048cab51b5d07dd100235cc7b586257890b58863e17756a896982b3e9a6/merged major:0 minor:1110 fsType:overlay blockSize:0} overlay_0-112:{mountpoint:/var/lib/containers/storage/overlay/8490b9f24127a33a008717cae4b36f08e13e8cd0fd9611699abb2fba1c7d7b62/merged major:0 minor:112 fsType:overlay blockSize:0} overlay_0-1125:{mountpoint:/var/lib/containers/storage/overlay/e1fcbcfd02ecddafbf8d30d06fa69a6dcbce10b731eff3cbac345fa717e15fd2/merged major:0 minor:1125 fsType:overlay blockSize:0} overlay_0-1139:{mountpoint:/var/lib/containers/storage/overlay/59264c86dcc6c19db7779281bae45bf0845606f7a18a5b49a7f77956b68fb3aa/merged major:0 minor:1139 fsType:overlay blockSize:0} overlay_0-1142:{mountpoint:/var/lib/containers/storage/overlay/8a783d8ba4838f95916cb9db0e21dd47787e1ca4558bb28d7d53e96534e7c125/merged major:0 minor:1142 fsType:overlay blockSize:0} overlay_0-1144:{mountpoint:/var/lib/containers/storage/overlay/810d9c83c35560439e9d383d8801a509874d2b683ae3e4fe30323df5c2bfb909/merged major:0 minor:1144 fsType:overlay blockSize:0} overlay_0-1152:{mountpoint:/var/lib/containers/storage/overlay/023176c792d91c2ad552837bec7947a45dfa2ffd7741864a6b2c852c8f7cc57a/merged major:0 minor:1152 fsType:overlay blockSize:0} overlay_0-1154:{mountpoint:/var/lib/containers/storage/overlay/f44e51ab500912da0469d9700079172a9933c9a4acadef46647d8c271137cc0d/merged major:0 minor:1154 fsType:overlay blockSize:0} overlay_0-1161:{mountpoint:/var/lib/containers/storage/overlay/a45601402a698511bace24bcb5f4964d1cf044e76bed5ba29b6e246b89c2d6a6/merged major:0 minor:1161 fsType:overlay blockSize:0} overlay_0-118:{mountpoint:/var/lib/containers/storage/overlay/263806c8592f3b1fbff96e4f89ccfb0a715ab68c0753bcf02f574128dd2ca286/merged major:0 minor:118 fsType:overlay blockSize:0} overlay_0-1182:{mountpoint:/var/lib/containers/storage/overlay/beb5470be10344615e66afcd9a1dd4aeddb5e1f2092ae9a6bc3dad9b0d4bef04/merged major:0 minor:1182 fsType:overlay blockSize:0} overlay_0-1186:{mountpoint:/var/lib/containers/storage/overlay/d222b2e39fcdb39aa4dd9ccbbb8ca13f619f9839ea0de5c89f720959e1abaf67/merged major:0 minor:1186 fsType:overlay blockSize:0} overlay_0-1191:{mountpoint:/var/lib/containers/storage/overlay/35e1731b2e9f683c96e00cab6c905ae546ae41e172deda71767f00d7bab7ebdd/merged major:0 minor:1191 fsType:overlay blockSize:0} 
overlay_0-120:{mountpoint:/var/lib/containers/storage/overlay/079f0e817f091493dfb34fea597de725f86259ae9132bd64a8d0ddf6504f636a/merged major:0 minor:120 fsType:overlay blockSize:0} overlay_0-1201:{mountpoint:/var/lib/containers/storage/overlay/0afea8b92b174f9081f4e72312ba9d7a9e40921034bd8b8284fc83c39647b912/merged major:0 minor:1201 fsType:overlay blockSize:0} overlay_0-1203:{mountpoint:/var/lib/containers/storage/overlay/6d1ae3ecb1f37b34aa47992ff993d0d78f9f5bfb482417a1f9ea39abfab43fbb/merged major:0 minor:1203 fsType:overlay blockSize:0} overlay_0-125:{mountpoint:/var/lib/containers/storage/overlay/28fdeb3ae1eff8dc7121fd5a2d01f31e7906da051b9b8313f582f757d3de61c6/merged major:0 minor:125 fsType:overlay blockSize:0} overlay_0-1251:{mountpoint:/var/lib/containers/storage/overlay/ea35620cd2e620dc7252d7778c858460e2c1a3c3dd6510ab8a85b4df6ce66baa/merged major:0 minor:1251 fsType:overlay blockSize:0} overlay_0-1264:{mountpoint:/var/lib/containers/storage/overlay/14dcb1737ad4e75c93414e1089f7b085297d6741494f9b30ff196c91cf004025/merged major:0 minor:1264 fsType:overlay blockSize:0} overlay_0-1268:{mountpoint:/var/lib/containers/storage/overlay/a465f19c84b05a1d317f3e3aceb7a9f84fc2aedfb54e0c91e683821f4171e1b8/merged major:0 minor:1268 fsType:overlay blockSize:0} overlay_0-1270:{mountpoint:/var/lib/containers/storage/overlay/b62b992e3bd45bfce63c7e6aeab1b97a9abf444246a735db3ae6cd2579c6d9ce/merged major:0 minor:1270 fsType:overlay blockSize:0} overlay_0-1271:{mountpoint:/var/lib/containers/storage/overlay/f48b5740bd24491f1383b70d336d6a5be408d8bfd155aff210b82b403afdc2d2/merged major:0 minor:1271 fsType:overlay blockSize:0} overlay_0-1273:{mountpoint:/var/lib/containers/storage/overlay/17581e18fc8ef8546e32c4d03190e20a33af56f9494236e509b0476ba8c57109/merged major:0 minor:1273 fsType:overlay blockSize:0} overlay_0-1275:{mountpoint:/var/lib/containers/storage/overlay/ae2ebe827a354b319f554ac7688d7b99fd09ef1ae386c46411c53cb046c5f0e3/merged major:0 minor:1275 fsType:overlay blockSize:0} overlay_0-1277:{mountpoint:/var/lib/containers/storage/overlay/b1f61cc6b9451e9735f9ee5730fc49ed87ccf1475995023adc16d184081f227e/merged major:0 minor:1277 fsType:overlay blockSize:0} overlay_0-1279:{mountpoint:/var/lib/containers/storage/overlay/ac0840f7b6055c8db7b74d861bfdaa3f00260c0de2d046762a1d5abb33271162/merged major:0 minor:1279 fsType:overlay blockSize:0} overlay_0-1281:{mountpoint:/var/lib/containers/storage/overlay/8fbd3322102c85cc14317da5570ab509ffdd1cec75b67e3aa5deb1fa6fcf66bb/merged major:0 minor:1281 fsType:overlay blockSize:0} overlay_0-1288:{mountpoint:/var/lib/containers/storage/overlay/14f9039d0fbc7a3f4d67146264c14feeaf2def4a50489c6c58efd9a1d447d4d1/merged major:0 minor:1288 fsType:overlay blockSize:0} overlay_0-1290:{mountpoint:/var/lib/containers/storage/overlay/e3cae5478c522658ef28cc2f5ed704b6da114bd2a0ba083fe56ca5810cf1b6ba/merged major:0 minor:1290 fsType:overlay blockSize:0} overlay_0-1292:{mountpoint:/var/lib/containers/storage/overlay/aa84670abf802225c1beba32a40a1ded8985af8a7b593ccd87f1237723e26e1f/merged major:0 minor:1292 fsType:overlay blockSize:0} overlay_0-1294:{mountpoint:/var/lib/containers/storage/overlay/3eb53c15f455118939db3cc1bf7d828fef80b97667ca5ef0f6a63624fa618f57/merged major:0 minor:1294 fsType:overlay blockSize:0} overlay_0-130:{mountpoint:/var/lib/containers/storage/overlay/dfc8d550ce331c21a66175be38544deb4f5250e9dd2d631d18072f3cd4ff37ed/merged major:0 minor:130 fsType:overlay blockSize:0} 
overlay_0-1302:{mountpoint:/var/lib/containers/storage/overlay/194eb1ae350a040712b1b9117ebd23de35c9654020405e71b86d332bb8936f6b/merged major:0 minor:1302 fsType:overlay blockSize:0} overlay_0-1304:{mountpoint:/var/lib/containers/storage/overlay/0bae1f43446a2f76d6a271e14c51835ffe001a6992e4cf6f650b44b1e407c3fd/merged major:0 minor:1304 fsType:overlay blockSize:0} overlay_0-1306:{mountpoint:/var/lib/containers/storage/overlay/628273a2df571956911111d2d43ed661d01ae7a7b6eb2c2cd6698a3dc8e956a0/merged major:0 minor:1306 fsType:overlay blockSize:0} overlay_0-1308:{mountpoint:/var/lib/containers/storage/overlay/701b184abea9eec0f1fdf1b6ad0931a6d6ce8a15900c2a81fb77e706f3ac02e9/merged major:0 minor:1308 fsType:overlay blockSize:0} overlay_0-1310:{mountpoint:/var/lib/containers/storage/overlay/7f655b125fff31057e328d6944ae4ce9a493dfc8e7122e0446aceacd7fcf959e/merged major:0 minor:1310 fsType:overlay blockSize:0} overlay_0-1313:{mountpoint:/var/lib/containers/storage/overlay/dbf5c13871c15433f81aa7ec273b046c5516cb8cc9a749b46fc428d3a7866de4/merged major:0 minor:1313 fsType:overlay blockSize:0} overlay_0-1329:{mountpoint:/var/lib/containers/storage/overlay/ec6b4c67428fe7e401a36b2712c6b4edf0357129572e29e8f18a90c6621f9f9a/merged major:0 minor:1329 fsType:overlay blockSize:0} overlay_0-1331:{mountpoint:/var/lib/containers/storage/overlay/e15c0b8511c604670a5b3e75197e4d9905f74e0af22c88fb4c0037c24d0c69aa/merged major:0 minor:1331 fsType:overlay blockSize:0} overlay_0-1333:{mountpoint:/var/lib/containers/storage/overlay/edc19fd2818e3cc46092edfa3091a2bebb5fd4e1300f7dd85995729747cf6a3e/merged major:0 minor:1333 fsType:overlay blockSize:0} overlay_0-1335:{mountpoint:/var/lib/containers/storage/overlay/7eae7b85f3262e8feaf46eb3ba49686ae2aeec71d7734d16e357e2c25b4be33b/merged major:0 minor:1335 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/143be072f31494cac37a0c0b482054aad894eef15bafddbe1c92739645139696/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/5867eae42264c04151f652cbc01027b0a3b8b60aa7855ad52523b8caec306ea8/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-1375:{mountpoint:/var/lib/containers/storage/overlay/a0e54ec3bf6dfa1db95efd7f3cd1b16e76c1b2b9f1ce096b5bfbda92a133b7cc/merged major:0 minor:1375 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/93aa984adf9326a73e761b6329bf2013fea18d93fc3a7f2f7bbe6e30793efe7c/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-1384:{mountpoint:/var/lib/containers/storage/overlay/c4ade1b7599d1772055d6a488899d5fe3564474fb3272937c140f749effdbef8/merged major:0 minor:1384 fsType:overlay blockSize:0} overlay_0-1406:{mountpoint:/var/lib/containers/storage/overlay/c36c68eb26e18d404fde25c7e0ca0ad0a07ae451dd5f24f56900b97c86334c98/merged major:0 minor:1406 fsType:overlay blockSize:0} overlay_0-1408:{mountpoint:/var/lib/containers/storage/overlay/c2bcdfc50b990281787f6068ee193cfee25e5f6242e4ac33439771ad54412b8d/merged major:0 minor:1408 fsType:overlay blockSize:0} overlay_0-1410:{mountpoint:/var/lib/containers/storage/overlay/f1eeea5533ac511846d0889a23ad3df061f9610a1a10f0e3a0f9bba209c91999/merged major:0 minor:1410 fsType:overlay blockSize:0} overlay_0-1412:{mountpoint:/var/lib/containers/storage/overlay/bddb4d463e3e1db2448d697bc18b79bc4d658b697abd512ec995dfd90869f9ab/merged major:0 minor:1412 fsType:overlay blockSize:0} 
overlay_0-1414:{mountpoint:/var/lib/containers/storage/overlay/47756bfe3e976025d7d6e95d5311da752d04b3f4ef91157806671ba8ed173e9a/merged major:0 minor:1414 fsType:overlay blockSize:0} overlay_0-1417:{mountpoint:/var/lib/containers/storage/overlay/e8a848782518cce8f559f24491a2c8ddd7acc5749bccf1bdbd40397ddb05cb07/merged major:0 minor:1417 fsType:overlay blockSize:0} overlay_0-1420:{mountpoint:/var/lib/containers/storage/overlay/b4010edfe9ba97c7f65b64e63a59b58a21d7bc0ce461b35a327a59ee623be417/merged major:0 minor:1420 fsType:overlay blockSize:0} overlay_0-1429:{mountpoint:/var/lib/containers/storage/overlay/dafb7a6cfab593c0b4bf5b45d0318aeea06cb3cd8c62b00b90324e7e9cb3d69b/merged major:0 minor:1429 fsType:overlay blockSize:0} overlay_0-1431:{mountpoint:/var/lib/containers/storage/overlay/31fe4331720a2c384df3697f99d793eeea4060e317d1f68cfa616cacd9a485f0/merged major:0 minor:1431 fsType:overlay blockSize:0} overlay_0-1434:{mountpoint:/var/lib/containers/storage/overlay/510ab06180cde3f5f0b9eef30981d9034f856557d8896fbd6c73712c316828b1/merged major:0 minor:1434 fsType:overlay blockSize:0} overlay_0-1436:{mountpoint:/var/lib/containers/storage/overlay/6ebc14ce985f2e6beb1c06e7cdb2483c26b0d4cb9145ae7ba38d52775ad2a699/merged major:0 minor:1436 fsType:overlay blockSize:0} overlay_0-145:{mountpoint:/var/lib/containers/storage/overlay/4134eea505e316ad2ea5386b1510474205d0e5bc303592975dcff1ea4f042607/merged major:0 minor:145 fsType:overlay blockSize:0} overlay_0-1467:{mountpoint:/var/lib/containers/storage/overlay/dcf5ee2501c36651b5e5054424339d9a2441a164873b05f3220bca08c1151397/merged major:0 minor:1467 fsType:overlay blockSize:0} overlay_0-147:{mountpoint:/var/lib/containers/storage/overlay/b6bc0eb32eca48dd8026ab1fffcc0813342b66bd56cfb93d66a9f90fbd6f04ca/merged major:0 minor:147 fsType:overlay blockSize:0} overlay_0-1486:{mountpoint:/var/lib/containers/storage/overlay/a23c96f78eda25535c6cc9445fda6662db0c335bc75f341d3500ec444ad6c388/merged major:0 minor:1486 fsType:overlay blockSize:0} overlay_0-149:{mountpoint:/var/lib/containers/storage/overlay/6ceb8732b8ed8472175c3adb49111a910ac0312247b04021d5522efc4f610fd3/merged major:0 minor:149 fsType:overlay blockSize:0} overlay_0-1492:{mountpoint:/var/lib/containers/storage/overlay/09d7f5f25862565f31cfd2e5363aa9bd655de0f4ac7ef13f3380acec38ccfe39/merged major:0 minor:1492 fsType:overlay blockSize:0} overlay_0-1494:{mountpoint:/var/lib/containers/storage/overlay/d9df8c451d032cf1ae9d707b0d68549afd026bdeee9ed708d0b34356a6f44401/merged major:0 minor:1494 fsType:overlay blockSize:0} overlay_0-1496:{mountpoint:/var/lib/containers/storage/overlay/013fcf928a89929f31c45ea00aa8a3167f5f99e3638380bc7f126506ba590c5e/merged major:0 minor:1496 fsType:overlay blockSize:0} overlay_0-1505:{mountpoint:/var/lib/containers/storage/overlay/3651a9dd527256021bbba2376047c9fcd232b886104259c60bd346434280f6ac/merged major:0 minor:1505 fsType:overlay blockSize:0} overlay_0-151:{mountpoint:/var/lib/containers/storage/overlay/fee79ac5a938651008db112ee7ca95888119fe41aa822c018f9ed2bd7edbb6cc/merged major:0 minor:151 fsType:overlay blockSize:0} overlay_0-1513:{mountpoint:/var/lib/containers/storage/overlay/6eaed5ee7ce8efb0eaeae6a90e3c05422e89324917e2d2e8f002e53d6dbbe6ab/merged major:0 minor:1513 fsType:overlay blockSize:0} overlay_0-1515:{mountpoint:/var/lib/containers/storage/overlay/9bffe4140528426bc4e90b9da8337cfca66caf3ff4cdf2ea966e1ae5efb1bb9f/merged major:0 minor:1515 fsType:overlay blockSize:0} 
overlay_0-1517:{mountpoint:/var/lib/containers/storage/overlay/9dc09f28b22a9f11d42671ce5015a81ec7dfd17073fca3be2e2cfed4ad034020/merged major:0 minor:1517 fsType:overlay blockSize:0} overlay_0-1519:{mountpoint:/var/lib/containers/storage/overlay/a3eb2d2648f5fa48a6c6bfcf43487ee187f382ade74397734e0642c6a9fbe415/merged major:0 minor:1519 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/178dc1361ce0116e7e7561760bf9a7aadea81b2e34c94af6e28c1cc88da51034/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-1540:{mountpoint:/var/lib/containers/storage/overlay/c4e776bb7f500fbcbe316bdb0d62e82b39e98ea8c193c4058b1bddc8c9c852c1/merged major:0 minor:1540 fsType:overlay blockSize:0} overlay_0-1542:{mountpoint:/var/lib/containers/storage/overlay/b3cd3fa82a94c19b7a8a43eedcd727e01f4fbc11d9ab5b8b1089d7970015af39/merged major:0 minor:1542 fsType:overlay blockSize:0} overlay_0-1556:{mountpoint:/var/lib/containers/storage/overlay/c6d5fd4fdcb0aa25f42741dfa930b92facf76b6452e4fc5e02a421491237e568/merged major:0 minor:1556 fsType:overlay blockSize:0} overlay_0-1570:{mountpoint:/var/lib/containers/storage/overlay/d821912684ddde186dcd234d55712bfa3e3ad0c476a395ec4e0ae34b52c91e8b/merged major:0 minor:1570 fsType:overlay blockSize:0} overlay_0-1572:{mountpoint:/var/lib/containers/storage/overlay/4418c8112abd7b2d32062c16dbf91efd71be1b23a529fb24bcb51fe09bfeb493/merged major:0 minor:1572 fsType:overlay blockSize:0} overlay_0-1588:{mountpoint:/var/lib/containers/storage/overlay/8099930fb8b4ad6939268453c2d215165fef4b0d17bca4ef583944786bb06604/merged major:0 minor:1588 fsType:overlay blockSize:0} overlay_0-1593:{mountpoint:/var/lib/containers/storage/overlay/f6f47482c8cc7fcc32b57d13ee7bd94cc8d2b617b381d83c10055575fd8f9e36/merged major:0 minor:1593 fsType:overlay blockSize:0} overlay_0-1598:{mountpoint:/var/lib/containers/storage/overlay/88e6ba9a0604ea45d6840eaac95107179a4d71115516e372e33e0c481b27cbde/merged major:0 minor:1598 fsType:overlay blockSize:0} overlay_0-1601:{mountpoint:/var/lib/containers/storage/overlay/ec1da8400b65799efff3c48b7fcf96f60afafb06306c3803727b3828ae664e7a/merged major:0 minor:1601 fsType:overlay blockSize:0} overlay_0-1605:{mountpoint:/var/lib/containers/storage/overlay/a9cc24683c966e722cbf1df4867f43de43967dd0e989b8309d0c301d2979b1b1/merged major:0 minor:1605 fsType:overlay blockSize:0} overlay_0-163:{mountpoint:/var/lib/containers/storage/overlay/7025889c9756e7b7a56bc2b62ce9f6ce434447c46280b1dd16b16f8fab5a0b9e/merged major:0 minor:163 fsType:overlay blockSize:0} overlay_0-1630:{mountpoint:/var/lib/containers/storage/overlay/736b6ddba2991f59fe6042aae70af61bf9ec5a840b1693a88bfad4a9fce657ce/merged major:0 minor:1630 fsType:overlay blockSize:0} overlay_0-1636:{mountpoint:/var/lib/containers/storage/overlay/f691ed6758cbbc271ccf2293b57d2277d5f7c132d9781ffea19463fbfe71a496/merged major:0 minor:1636 fsType:overlay blockSize:0} overlay_0-1639:{mountpoint:/var/lib/containers/storage/overlay/e5883abaa272aa4edb4a7c9d0e94c0bc5442886e36cbaafe7c08b86846d4a663/merged major:0 minor:1639 fsType:overlay blockSize:0} overlay_0-1653:{mountpoint:/var/lib/containers/storage/overlay/d48a04b4cfeb43724362659629735d3aad90e2bbe72840949418c7ea8eaf41e2/merged major:0 minor:1653 fsType:overlay blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/e036da790c16d99c571d0ccacf07ff209a590ded05c2c88c351cda4fc6c1f718/merged major:0 minor:166 fsType:overlay blockSize:0} 
overlay_0-1675:{mountpoint:/var/lib/containers/storage/overlay/e242c148b009ec8bd284d1be906dff966010d769b9e079b2135a4558b66fee37/merged major:0 minor:1675 fsType:overlay blockSize:0} overlay_0-1677:{mountpoint:/var/lib/containers/storage/overlay/326e1047178fd1797ff6f535cad4b778e8dcfade1a30ef8993ebb0ec305e13c2/merged major:0 minor:1677 fsType:overlay blockSize:0} overlay_0-168:{mountpoint:/var/lib/containers/storage/overlay/52f4840dd9e5a899a3ee79d39704b79e4edb54be50c17350b49fc9809c4f3095/merged major:0 minor:168 fsType:overlay blockSize:0} overlay_0-1684:{mountpoint:/var/lib/containers/storage/overlay/17ce94a1adb9ba08cc25da951fbd7707309ded631bfe0f8a386ae6798a53a605/merged major:0 minor:1684 fsType:overlay blockSize:0} overlay_0-1688:{mountpoint:/var/lib/containers/storage/overlay/88cfbb54a096432994f1c7e3cd8e39fea408f27114944954dad0869b11e08092/merged major:0 minor:1688 fsType:overlay blockSize:0} overlay_0-170:{mountpoint:/var/lib/containers/storage/overlay/4f133cacce459fa2f412e70fcc6adc4ae73461c58f7611fc17293fe1f6dc2a02/merged major:0 minor:170 fsType:overlay blockSize:0} overlay_0-1706:{mountpoint:/var/lib/containers/storage/overlay/906cc34225648bbc979402fd3dffce8a7a859dd194c6aa737431146a83ac361e/merged major:0 minor:1706 fsType:overlay blockSize:0} overlay_0-172:{mountpoint:/var/lib/containers/storage/overlay/4b66f1eecd742a97bdf220db36e499d95d58a7c91fe0450b37cb9839d96cac49/merged major:0 minor:172 fsType:overlay blockSize:0} overlay_0-183:{mountpoint:/var/lib/containers/storage/overlay/b6fb4a3972573514ee7adda30080134ced7360f824574c3beb772b2d5bf0b2b2/merged major:0 minor:183 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/1de4e36a9fb8c54e483655ce1d9fad13a355fdba43f22f36d87b6496502b064b/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-191:{mountpoint:/var/lib/containers/storage/overlay/238453336d53163b6ad011610393a602302770663bb4fba04782bb4959247937/merged major:0 minor:191 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/b20a4a76220f680e5e829ef9a31223a016d69fe44ad6bc492f5bd6a647847c6c/merged major:0 minor:193 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/4827550be80f141f4fe77e793c97f9d8ffbfbbe90a6f1e93c0d86ce93fcf4c34/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-197:{mountpoint:/var/lib/containers/storage/overlay/fd10eb9c73111aabfa99eb8d934c0c065407fba0f614e0b3789d2dc7145a9d61/merged major:0 minor:197 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/3426e6abfcf11fa2e587a0d33f91b15868e1fadfe224b04cbb5c462525491987/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-205:{mountpoint:/var/lib/containers/storage/overlay/321d620025f8336ddb34a44f22618c0bf228dbd71a420f2dda16297ec37d3ac7/merged major:0 minor:205 fsType:overlay blockSize:0} overlay_0-216:{mountpoint:/var/lib/containers/storage/overlay/9901c0e30efb35c00269bb5de9842fde7d85b7f5e7ad5463c6f885ef6639efc9/merged major:0 minor:216 fsType:overlay blockSize:0} overlay_0-218:{mountpoint:/var/lib/containers/storage/overlay/66dfb4a6f65f3d235dc90f3c9c1e7af4874f21d37c8003abcc6dfbb7ccfac93c/merged major:0 minor:218 fsType:overlay blockSize:0} overlay_0-219:{mountpoint:/var/lib/containers/storage/overlay/cd10eedbc21111dd07a1c752114380b0a3494294861a925ee2a930ee0bf6486e/merged major:0 minor:219 fsType:overlay blockSize:0} 
overlay_0-223:{mountpoint:/var/lib/containers/storage/overlay/ec032cec1ba6d94e756a8e7f070c14cb169f19285e4bcc88bb06948ded909ab9/merged major:0 minor:223 fsType:overlay blockSize:0} overlay_0-225:{mountpoint:/var/lib/containers/storage/overlay/10826892b44494868f9e471c4c2ca00330eea62d5d1e052ef33445f14b1f0b9b/merged major:0 minor:225 fsType:overlay blockSize:0} overlay_0-236:{mountpoint:/var/lib/containers/storage/overlay/8ee3997f740be65c4b819200c7009fd86111958c3b4022066357c77de2b63f4a/merged major:0 minor:236 fsType:overlay blockSize:0} overlay_0-244:{mountpoint:/var/lib/containers/storage/overlay/47caf304b739434b5c3d68259af6b1aaad2d29c64ef44138fcad186441a604d7/merged major:0 minor:244 fsType:overlay blockSize:0} overlay_0-252:{mountpoint:/var/lib/containers/storage/overlay/d6108c34ebd84742ab5d7650dddcb2dc457dbcd66fee7f0a58192a4608223934/merged major:0 minor:252 fsType:overlay blockSize:0} overlay_0-260:{mountpoint:/var/lib/containers/storage/overlay/7fab26e146b5809a5f8981fc47152e23ef8d023f7a51bcc3c07fe2aa2e99d0c9/merged major:0 minor:260 fsType:overlay blockSize:0} overlay_0-268:{mountpoint:/var/lib/containers/storage/overlay/c23e7a1856b497dace33ddd9584837bc5600a59437b9a5195174d18d488c48a5/merged major:0 minor:268 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/c0b3c8faa570433b6642ccd89196235f72b302109a37392b86e6c83e5adcc070/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/91c5135f53c8f2a07ee8b61d091d11a1e72e8d9012f5db0e4ac43d8e0915efee/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-343:{mountpoint:/var/lib/containers/storage/overlay/48b4ffa86c676051747d6e369a64026c0c3e8b0a4b99e9cd95ca257b351ffa14/merged major:0 minor:343 fsType:overlay blockSize:0} overlay_0-345:{mountpoint:/var/lib/containers/storage/overlay/1da5ad97261b9dd9c25181207ab61a55e69fd0f79f5db7fb45cd005231e69f8a/merged major:0 minor:345 fsType:overlay blockSize:0} overlay_0-347:{mountpoint:/var/lib/containers/storage/overlay/2368206871b6a8dd13aecc109feee0e66f00e7f886283538b4e9933be0783c95/merged major:0 minor:347 fsType:overlay blockSize:0} overlay_0-349:{mountpoint:/var/lib/containers/storage/overlay/45ecf6be646a849cec4e2e13cee338ae315048eb379cf33cefd82c689039a2b6/merged major:0 minor:349 fsType:overlay blockSize:0} overlay_0-351:{mountpoint:/var/lib/containers/storage/overlay/6157af232b538c0bdf0224963d07c09c58591db64e7eb7a50883b2cc1ab890b8/merged major:0 minor:351 fsType:overlay blockSize:0} overlay_0-353:{mountpoint:/var/lib/containers/storage/overlay/31f62a2f9b893e038b35620b3fc1a20d382cc0218a5e3bec0ad06a4453546362/merged major:0 minor:353 fsType:overlay blockSize:0} overlay_0-355:{mountpoint:/var/lib/containers/storage/overlay/e6fb70dc932059572abf4d9473fb40d052e35d40dc52d23049d085da25e6da8b/merged major:0 minor:355 fsType:overlay blockSize:0} overlay_0-357:{mountpoint:/var/lib/containers/storage/overlay/5d5425bd31c7ed35089c40dc8b77f1477b7b549064c42d43c048306900105c1f/merged major:0 minor:357 fsType:overlay blockSize:0} overlay_0-359:{mountpoint:/var/lib/containers/storage/overlay/1379fb856aa57ed4051cd51b76f6881b79fab88039881964945e677e20e73353/merged major:0 minor:359 fsType:overlay blockSize:0} overlay_0-361:{mountpoint:/var/lib/containers/storage/overlay/8afc0a3f6e0d2b9bb420a233e1ea5e3e1ae2de777b0fb0a9e6d003a54a72fc61/merged major:0 minor:361 fsType:overlay blockSize:0} overlay_0-363:{mountpoint:/var/lib/containers/storage/overlay/535c981b56b7b8ef1da0d3fd2beedc778545473359fc06161f1fbf1966957626/merged 
major:0 minor:363 fsType:overlay blockSize:0} overlay_0-365:{mountpoint:/var/lib/containers/storage/overlay/028f9f1952e5e1a2dcee26a80c14ba041f08a22223ef6923434bb8be210d0ed2/merged major:0 minor:365 fsType:overlay blockSize:0} overlay_0-367:{mountpoint:/var/lib/containers/storage/overlay/f324571e919c8ae087a65d30ed2ca7127f89a378fc713bb4f9a151d5c34da7ee/merged major:0 minor:367 fsType:overlay blockSize:0} overlay_0-374:{mountpoint:/var/lib/containers/storage/overlay/87ecc6cc6f5f05d0fa913fe725969df179d6a8a3c3c2e06674367228976cbaa0/merged major:0 minor:374 fsType:overlay blockSize:0} overlay_0-378:{mountpoint:/var/lib/containers/storage/overlay/fa4d19e39c5509a2fe96e28adc9c711ec28afd4a995ddfa4cd27c2a64446535b/merged major:0 minor:378 fsType:overlay blockSize:0} overlay_0-380:{mountpoint:/var/lib/containers/storage/overlay/e578eeda600ea079f03ffb8d5885ddbefc09f5650e42c1e170c4e091cc290619/merged major:0 minor:380 fsType:overlay blockSize:0} overlay_0-382:{mountpoint:/var/lib/containers/storage/overlay/79013fd610979b208f9c69375aff45392a0a641076b8b52fd27c3111d49e2458/merged major:0 minor:382 fsType:overlay blockSize:0} overlay_0-384:{mountpoint:/var/lib/containers/storage/overlay/d6a8763b5123dc06ccde4cf75a44620aaef73d15bf4778f15bcc9a08f0dfa665/merged major:0 minor:384 fsType:overlay blockSize:0} overlay_0-390:{mountpoint:/var/lib/containers/storage/overlay/fba5ef1ea66563d5c078a4a8faf1c8905a5a2e1c1f346be9a366e762b7cfca4a/merged major:0 minor:390 fsType:overlay blockSize:0} overlay_0-392:{mountpoint:/var/lib/containers/storage/overlay/8dc1be2b801f5bae3412571ecaba9047fff911052a1d59dc9ca8bd47bf18895f/merged major:0 minor:392 fsType:overlay blockSize:0} overlay_0-395:{mountpoint:/var/lib/containers/storage/overlay/ab27bdd81508bc5f61b69936014825da089c1105c833f3a8845e86e09b1c460c/merged major:0 minor:395 fsType:overlay blockSize:0} overlay_0-397:{mountpoint:/var/lib/containers/storage/overlay/b64fd6c4d59401a29f3219631f49b6ea7ed09631116af93297d3bbce8cc977df/merged major:0 minor:397 fsType:overlay blockSize:0} overlay_0-399:{mountpoint:/var/lib/containers/storage/overlay/a4c888db8573e75c7a437d4a058d1d940ff79326d79d7d4d441f78e3ec700f28/merged major:0 minor:399 fsType:overlay blockSize:0} overlay_0-402:{mountpoint:/var/lib/containers/storage/overlay/c82b3c79da6589dd196e13e3e71a122a989ba0d3d6c478c691c9022bb6767a0c/merged major:0 minor:402 fsType:overlay blockSize:0} overlay_0-404:{mountpoint:/var/lib/containers/storage/overlay/53d0f7a9261c1139004b8457a31737d28da6200820102954b17091aab770db6e/merged major:0 minor:404 fsType:overlay blockSize:0} overlay_0-406:{mountpoint:/var/lib/containers/storage/overlay/9dab10b2bc1560bed68d8f61827e2da7e170a13a0f7f677c91d5303d7577226f/merged major:0 minor:406 fsType:overlay blockSize:0} overlay_0-409:{mountpoint:/var/lib/containers/storage/overlay/a70c1187575110ae1024baa4b3a020d749d718d93ab1656524b20bb1f28cf204/merged major:0 minor:409 fsType:overlay blockSize:0} overlay_0-411:{mountpoint:/var/lib/containers/storage/overlay/3472ce43713fc506031c206683dcb18a0a791d2787e30b089946aa2ae0190ec9/merged major:0 minor:411 fsType:overlay blockSize:0} overlay_0-413:{mountpoint:/var/lib/containers/storage/overlay/866458ef188f83f1a6ee54617c4716f427edbe99c06ced05c9929913579d5b0d/merged major:0 minor:413 fsType:overlay blockSize:0} overlay_0-416:{mountpoint:/var/lib/containers/storage/overlay/c35a8de0d1c1c76a3513ed5c2a29ee7192566e74e72c1b6fd019715d9e9b7fd5/merged major:0 minor:416 fsType:overlay blockSize:0} 
overlay_0-417:{mountpoint:/var/lib/containers/storage/overlay/64e35686e3e6a6c0514fa613e2c9d8168f1f882aec3213893f511c8842b0a371/merged major:0 minor:417 fsType:overlay blockSize:0} overlay_0-420:{mountpoint:/var/lib/containers/storage/overlay/92a2813fc11578f39c31ac54132e8c1ae62b5449a72c8067eff464ef03bb049f/merged major:0 minor:420 fsType:overlay blockSize:0} overlay_0-422:{mountpoint:/var/lib/containers/storage/overlay/054363813b4ae14ab64e2959e7ab5ea665541286343453ddc8b5723b12d8b630/merged major:0 minor:422 fsType:overlay blockSize:0} overlay_0-424:{mountpoint:/var/lib/containers/storage/overlay/c328eb013da9c92b40504f34230045f4bafa41c2db76254f965d360fed432b3d/merged major:0 minor:424 fsType:overlay blockSize:0} overlay_0-425:{mountpoint:/var/lib/containers/storage/overlay/95e108e638cdf4b84cd3fabb88b982759336309e08f07a76a73aab0fabc52606/merged major:0 minor:425 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/9a5f215847a4d1d24951a091674bdf5602f9b6a85ef6127011ed8fbe88a3d7fc/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-431:{mountpoint:/var/lib/containers/storage/overlay/2d0847e2134b95bed2f987709a7190d6fc9ddfb680912f0772212eea17ae7ce8/merged major:0 minor:431 fsType:overlay blockSize:0} overlay_0-442:{mountpoint:/var/lib/containers/storage/overlay/33206feeac51475fd0272d9692d9256a9571a853db431942c3d8cae6c1d1d892/merged major:0 minor:442 fsType:overlay blockSize:0} overlay_0-450:{mountpoint:/var/lib/containers/storage/overlay/a03d3566120c7631cc332d8043334d0365eadfb2c61c7c8117c64a984980205e/merged major:0 minor:450 fsType:overlay blockSize:0} overlay_0-452:{mountpoint:/var/lib/containers/storage/overlay/6553f5daa31cbfdd538716c8bda68676094e9a21f839c3bc93d545cee9e50ea7/merged major:0 minor:452 fsType:overlay blockSize:0} overlay_0-454:{mountpoint:/var/lib/containers/storage/overlay/6c051a5d556822fd7d7bae5fa10fc24316520c410eea400d3b88f52602cb7640/merged major:0 minor:454 fsType:overlay blockSize:0} overlay_0-457:{mountpoint:/var/lib/containers/storage/overlay/57c52ac16feab618cd9a2c2fbfc6f9f09d793008d5414a530de436b60e0fb5ad/merged major:0 minor:457 fsType:overlay blockSize:0} overlay_0-462:{mountpoint:/var/lib/containers/storage/overlay/f16b4d339039bc884e68b980c0eb278698a872242d526eeec9cf055a0b5e0863/merged major:0 minor:462 fsType:overlay blockSize:0} overlay_0-464:{mountpoint:/var/lib/containers/storage/overlay/7846570148cde8287d33d3cf700f062dc5b1e3ef10b0e084426f7a13ab194719/merged major:0 minor:464 fsType:overlay blockSize:0} overlay_0-473:{mountpoint:/var/lib/containers/storage/overlay/2de484efac3d4417c671732b58e86dd57b98f6bb875cb986fd5210c5733bbc2e/merged major:0 minor:473 fsType:overlay blockSize:0} overlay_0-478:{mountpoint:/var/lib/containers/storage/overlay/3bc7262eb3c7451d3dc9f695079cb7bd37268933c1e97d745fb4dc0c02dd801f/merged major:0 minor:478 fsType:overlay blockSize:0} overlay_0-483:{mountpoint:/var/lib/containers/storage/overlay/4dfab9424b64574c781f5bb861ad147621c342c5bd0c4a404d5d9d9a899e8340/merged major:0 minor:483 fsType:overlay blockSize:0} overlay_0-488:{mountpoint:/var/lib/containers/storage/overlay/5980fa0fb9de668c53c85e724f788cbdefe7b70010fbd11c677890e864b7b5a6/merged major:0 minor:488 fsType:overlay blockSize:0} overlay_0-493:{mountpoint:/var/lib/containers/storage/overlay/2f7019160085340fb6eb07706e4de4fb13dc253a774844f7ef6670d9cb891f70/merged major:0 minor:493 fsType:overlay blockSize:0} 
overlay_0-495:{mountpoint:/var/lib/containers/storage/overlay/9f68a44475e9fec8ea6c110639444d95d7933ac466abf0404946778106f0df3e/merged major:0 minor:495 fsType:overlay blockSize:0} overlay_0-504:{mountpoint:/var/lib/containers/storage/overlay/e1d65ba982f48a41d6dba3cc7b3b546e7f0908c6731cd4ad2b28b6be228ad0c9/merged major:0 minor:504 fsType:overlay blockSize:0} overlay_0-511:{mountpoint:/var/lib/containers/storage/overlay/267911f690289bc3a28f83dc594c53417096dba60fd3d097de58083b42334608/merged major:0 minor:511 fsType:overlay blockSize:0} overlay_0-513:{mountpoint:/var/lib/containers/storage/overlay/f52dfca265b4c9c60213623ff7b113b24e299b6d44cee7d90577e030a3b0a4e2/merged major:0 minor:513 fsType:overlay blockSize:0} overlay_0-518:{mountpoint:/var/lib/containers/storage/overlay/8882a2cb21b4f917e87fd6949107447d66927d8a7914a702fe1c78f30c6703db/merged major:0 minor:518 fsType:overlay blockSize:0} overlay_0-520:{mountpoint:/var/lib/containers/storage/overlay/a13079543ee53c1a99f4bb261ed6f2d9d99e644c660ada6ccaa7bb93e8278d47/merged major:0 minor:520 fsType:overlay blockSize:0} overlay_0-522:{mountpoint:/var/lib/containers/storage/overlay/cb608506a9fb6375ee54cbfa01deca55677b3f12e32e7d42630935ac12b03454/merged major:0 minor:522 fsType:overlay blockSize:0} overlay_0-526:{mountpoint:/var/lib/containers/storage/overlay/579dde6ebd40be69937ff2dee7f4ffb291fb5e299544e3b8fec31fc57f742efc/merged major:0 minor:526 fsType:overlay blockSize:0} overlay_0-53:{mountpoint:/var/lib/containers/storage/overlay/9a49669821220e46624c0b7be88aab3134782d8200903c9a4d70f2aef67df2e8/merged major:0 minor:53 fsType:overlay blockSize:0} overlay_0-531:{mountpoint:/var/lib/containers/storage/overlay/961d540c5dcf1bf14576ebd3d053685628723036ea80b365b48feee205fc1562/merged major:0 minor:531 fsType:overlay blockSize:0} overlay_0-541:{mountpoint:/var/lib/containers/storage/overlay/46a5dea95e0a742c2fe1c1198a58db11090528889459aec1bb4c842f51976972/merged major:0 minor:541 fsType:overlay blockSize:0} overlay_0-543:{mountpoint:/var/lib/containers/storage/overlay/f939422ab548179c540e8d3cba0b96524c0bbe8eeae2bdb6ea9b814fec305f04/merged major:0 minor:543 fsType:overlay blockSize:0} overlay_0-544:{mountpoint:/var/lib/containers/storage/overlay/b766f32654cccca3b2dbb5bca591a632226889897f3f406f34df63773b9cf4c8/merged major:0 minor:544 fsType:overlay blockSize:0} overlay_0-546:{mountpoint:/var/lib/containers/storage/overlay/2a2b8995bc8a13462d338afeae5c882b5f61610661f693e5b00c67b702e0db8f/merged major:0 minor:546 fsType:overlay blockSize:0} overlay_0-548:{mountpoint:/var/lib/containers/storage/overlay/1ddb88b7700ac62bc0713e70a824da12655e967759220689e520fbf608106ed4/merged major:0 minor:548 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/8eada09f42d293801f30419a1812d32cb48aa2f43d5571105287346854140adc/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-567:{mountpoint:/var/lib/containers/storage/overlay/19a20463eb246c44fd256de67193ef18518e57f6338fd0384d244f1e486fcca3/merged major:0 minor:567 fsType:overlay blockSize:0} overlay_0-579:{mountpoint:/var/lib/containers/storage/overlay/8924bbe75c89b277fd69be013ed672f8248ae6d3ad58f6b4734715b847810470/merged major:0 minor:579 fsType:overlay blockSize:0} overlay_0-581:{mountpoint:/var/lib/containers/storage/overlay/476b21fdf76e0f9ab01488cc80281dfdd4958ddfbaaa296ee038aff688d7d8f4/merged major:0 minor:581 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/7402445059ddb74cf9554f7d02fca9ae6790f97fbafae5bab9690f419fe0897d/merged 
major:0 minor:60 fsType:overlay blockSize:0} overlay_0-607:{mountpoint:/var/lib/containers/storage/overlay/e12ff699f7065e131c91db1698eee7c6cb3996e50a9be47286a27595647224c1/merged major:0 minor:607 fsType:overlay blockSize:0} overlay_0-609:{mountpoint:/var/lib/containers/storage/overlay/1689eeb29d437975bdb5722453714f3bb55bfee21ff9a4d6818e5c35ba62c6e1/merged major:0 minor:609 fsType:overlay blockSize:0} overlay_0-611:{mountpoint:/var/lib/containers/storage/overlay/634c99095ffe74f3b9c247b59dfd1cd4d6fd9122cff039d4fdedcd95c853d9a3/merged major:0 minor:611 fsType:overlay blockSize:0} overlay_0-613:{mountpoint:/var/lib/containers/storage/overlay/af4eda5120d431e4bad5ddb7df4df2dbecaeff5731cb1264ae68545b11c068ae/merged major:0 minor:613 fsType:overlay blockSize:0} overlay_0-615:{mountpoint:/var/lib/containers/storage/overlay/f2e7e7ede65f4c9e0762df791f2bca45e35675ea0300e41dfe37791e420c0929/merged major:0 minor:615 fsType:overlay blockSize:0} overlay_0-617:{mountpoint:/var/lib/containers/storage/overlay/57993ec41459a732d2aba5c45e3d3205c35afe81978cb56c1aec2f15fafee40d/merged major:0 minor:617 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/a1ff79be2a9af4f05c318a91bea09228927657d6ae5919f810e08bab8d6c4ad0/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-623:{mountpoint:/var/lib/containers/storage/overlay/3eac1da20ab3771cbf349d460c00e44427eced340477e6d2dbfb76ee0c158e16/merged major:0 minor:623 fsType:overlay blockSize:0} overlay_0-624:{mountpoint:/var/lib/containers/storage/overlay/bd12c7dc9cae311b8ecb2035af7341fcc6b78db17aa2b038677cd72f6c5b4256/merged major:0 minor:624 fsType:overlay blockSize:0} overlay_0-632:{mountpoint:/var/lib/containers/storage/overlay/c7232e959b4f2fd92c524dfb49336acf8e7a2a2a4ba728de42d71eea51437514/merged major:0 minor:632 fsType:overlay blockSize:0} overlay_0-635:{mountpoint:/var/lib/containers/storage/overlay/3a563414196cf91227a55bf4f4363ea4cf233127fdf8aa604095a066034ef748/merged major:0 minor:635 fsType:overlay blockSize:0} overlay_0-636:{mountpoint:/var/lib/containers/storage/overlay/feff988f5ffc79f06bea18a298d052bc83065ff9c75ebe1a7df82dadd8c76ecf/merged major:0 minor:636 fsType:overlay blockSize:0} overlay_0-639:{mountpoint:/var/lib/containers/storage/overlay/1eca60d051d1bcfeb67653552bcd980c2f9c2055d87b18eb8f3530ce7fc8dd43/merged major:0 minor:639 fsType:overlay blockSize:0} overlay_0-641:{mountpoint:/var/lib/containers/storage/overlay/02ae421b455edaf2a770ef4bf70fe94913ce92d503d7d56bbe99a010f7a55afa/merged major:0 minor:641 fsType:overlay blockSize:0} overlay_0-644:{mountpoint:/var/lib/containers/storage/overlay/5e12ca8414e00b9bfb1173b49e42130d6f2adcec36f2163c5680e6c8b33c8ba0/merged major:0 minor:644 fsType:overlay blockSize:0} overlay_0-646:{mountpoint:/var/lib/containers/storage/overlay/490cfcfe260c35ccaaa97dbf4bc8dd014cce7953d8751eac7cf3e9fac747f3b5/merged major:0 minor:646 fsType:overlay blockSize:0} overlay_0-65:{mountpoint:/var/lib/containers/storage/overlay/04d26b5714c48da763e99523da66571d77c39a1185a7e3e60a068fc6ca4cc36c/merged major:0 minor:65 fsType:overlay blockSize:0} overlay_0-663:{mountpoint:/var/lib/containers/storage/overlay/e46c84d705dc788dad192f5ec6cfbe2ed729c5621b9e0057198ecdf99d0aaaea/merged major:0 minor:663 fsType:overlay blockSize:0} overlay_0-675:{mountpoint:/var/lib/containers/storage/overlay/e70700bb9f00aa4a005245645411b5a13b90295146a9d9a7a3095e56f105e801/merged major:0 minor:675 fsType:overlay blockSize:0} 
overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/4222256fbfe4eaeefbd850a616cfca241f79833bf57f47beb24aa456e7349911/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-706:{mountpoint:/var/lib/containers/storage/overlay/34e9b5c20aa7aafcc8c6db300c8d98996e2fd1729d37cc907d0a5d17254e5d02/merged major:0 minor:706 fsType:overlay blockSize:0} overlay_0-708:{mountpoint:/var/lib/containers/storage/overlay/03523dba8eee3dcccc58223100e9ee6315be682bfa134a1ed97f97ef3f231b85/merged major:0 minor:708 fsType:overlay blockSize:0} overlay_0-71:{mountpoint:/var/lib/containers/storage/overlay/e95dbd6184475f7b258d3c8a1bbf6d91bc3850e1a186b132e018cca64c65b3c0/merged major:0 minor:71 fsType:overlay blockSize:0} overlay_0-710:{mountpoint:/var/lib/containers/storage/overlay/d157a6ccb17ab70d0cc2c8ba208e53892d55f4c00c8be8466def59205e19fbf8/merged major:0 minor:710 fsType:overlay blockSize:0} overlay_0-712:{mountpoint:/var/lib/containers/storage/overlay/157b9f113e4bc35911a6eb8e2b74c744df9e2d95c473588550b4f084f5509d9d/merged major:0 minor:712 fsType:overlay blockSize:0} overlay_0-714:{mountpoint:/var/lib/containers/storage/overlay/19b204ff1a858ce4c6c90eb2791fa72c0a658a4553eab870d183972333ae4ecf/merged major:0 minor:714 fsType:overlay blockSize:0} overlay_0-716:{mountpoint:/var/lib/containers/storage/overlay/198f13cf072f26c10851645794b1b5532e48bf7de7cbf43b048f46e55389d496/merged major:0 minor:716 fsType:overlay blockSize:0} overlay_0-720:{mountpoint:/var/lib/containers/storage/overlay/a07ee3d6b8caeae095846a1710c77fbe8fbae6303ab0223afd21b6f0d6550f0a/merged major:0 minor:720 fsType:overlay blockSize:0} overlay_0-722:{mountpoint:/var/lib/containers/storage/overlay/d3d6b793a4c167584947f744fae16fccc6c31712ec5981ae7df2614f5397b126/merged major:0 minor:722 fsType:overlay blockSize:0} overlay_0-723:{mountpoint:/var/lib/containers/storage/overlay/39fe69514ec688f9da445ed55761c7810f66f2361b277f4097b9da2c98d98a1f/merged major:0 minor:723 fsType:overlay blockSize:0} overlay_0-725:{mountpoint:/var/lib/containers/storage/overlay/1f5ecbd07f9a0591d77f8581d883e12bceff4666768d16be211baa8f5a5cf845/merged major:0 minor:725 fsType:overlay blockSize:0} overlay_0-728:{mountpoint:/var/lib/containers/storage/overlay/c0e71200024f309d3dd17fab27321113b021573aa099ee1fd8394033ccfb2279/merged major:0 minor:728 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/7a92082568a31aa030052556ae5c934f8ff064d70dcbe32c7a3e80064db32939/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-733:{mountpoint:/var/lib/containers/storage/overlay/bd9aa8de314c5327e5b16a518f49b5525d9db9d3cedac9cb5cac7c87c7cce53b/merged major:0 minor:733 fsType:overlay blockSize:0} overlay_0-741:{mountpoint:/var/lib/containers/storage/overlay/941afb26bc10e63e11b11e658e97aec3d519d2c709c2ac0312cb469d988b0649/merged major:0 minor:741 fsType:overlay blockSize:0} overlay_0-745:{mountpoint:/var/lib/containers/storage/overlay/9b6a030691f844199c18e8aab4673bc17895efc049713eb539c6ed42413652f8/merged major:0 minor:745 fsType:overlay blockSize:0} overlay_0-774:{mountpoint:/var/lib/containers/storage/overlay/a84a503dd8a9a589ed460a7a9502b4aec3d6d2866b7353278170e4cd8904a11c/merged major:0 minor:774 fsType:overlay blockSize:0} overlay_0-776:{mountpoint:/var/lib/containers/storage/overlay/049619b3178fbb9106c63b4b8b90b6502be641cd972ab219a18b9b66f6537e69/merged major:0 minor:776 fsType:overlay blockSize:0} overlay_0-778:{mountpoint:/var/lib/containers/storage/overlay/614e748730c63fc2bd7968cce9055c289290352ed083fa6ba60eb77f307f7035/merged 
major:0 minor:778 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/945939c9af2e852759978803ed4225df0b5c235488fd30ea11b54fce057469e6/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-788:{mountpoint:/var/lib/containers/storage/overlay/0ed2323269bdc9faa08e43301524c678672d2cf8808ebec28570a664c78624f5/merged major:0 minor:788 fsType:overlay blockSize:0} overlay_0-801:{mountpoint:/var/lib/containers/storage/overlay/98cae696d8ea6bf837edb74a412d99088fcd59d1b96fab08229b5844fa0538fe/merged major:0 minor:801 fsType:overlay blockSize:0} overlay_0-803:{mountpoint:/var/lib/containers/storage/overlay/89e44df5dd9f7f67967434bcdaf692ea188f9ec6f4bc0f0d822bd3bf74a4136c/merged major:0 minor:803 fsType:overlay blockSize:0} overlay_0-827:{mountpoint:/var/lib/containers/storage/overlay/55f09068520801008f6e8eadd4df0c3b1bcf7ad7c0f55acd36fd0b5ca625a81f/merged major:0 minor:827 fsType:overlay blockSize:0} overlay_0-829:{mountpoint:/var/lib/containers/storage/overlay/b0966b157eb1beb9caaae60cbfe0a5d7aac93b1515bd3aad9b3d518e01d2ae37/merged major:0 minor:829 fsType:overlay blockSize:0} overlay_0-833:{mountpoint:/var/lib/containers/storage/overlay/0e82d0c8feb0fca24b7d17dca0345edba083190b6dccc2fe1068fbd2d636a61d/merged major:0 minor:833 fsType:overlay blockSize:0} overlay_0-840:{mountpoint:/var/lib/containers/storage/overlay/0fdc3691e2c758bd973ca91c8418466b80b9a6b6f9c1f8863616b77738026fc2/merged major:0 minor:840 fsType:overlay blockSize:0} overlay_0-843:{mountpoint:/var/lib/containers/storage/overlay/16a35359e1cdb536d3865f7e945eee895e9d87c3c8a8aa181ca070adbf4d52f8/merged major:0 minor:843 fsType:overlay blockSize:0} overlay_0-844:{mountpoint:/var/lib/containers/storage/overlay/0ad13189687176d6c906c589c87a251c84d0752e61bed7372d652be55bd1e3e0/merged major:0 minor:844 fsType:overlay blockSize:0} overlay_0-846:{mountpoint:/var/lib/containers/storage/overlay/d6e6b48facaf2b5bb156b441bed064299c31bd42f5c30d6fbe2ea1f17e6e93ec/merged major:0 minor:846 fsType:overlay blockSize:0} overlay_0-849:{mountpoint:/var/lib/containers/storage/overlay/74875b35470e656f47301816fa3abca86dee445a1661d0b66d5882a390eecc8c/merged major:0 minor:849 fsType:overlay blockSize:0} overlay_0-864:{mountpoint:/var/lib/containers/storage/overlay/70a38b2288cbc59fad806fa5c1eda3de7171dca651c07702e707a1b3ed1d6f17/merged major:0 minor:864 fsType:overlay blockSize:0} overlay_0-866:{mountpoint:/var/lib/containers/storage/overlay/eac9c789a10e5d4f7802716ead7d3d02ca0e1c2cd646e01b8181cbcc48285626/merged major:0 minor:866 fsType:overlay blockSize:0} overlay_0-868:{mountpoint:/var/lib/containers/storage/overlay/a445e2fe261c029921dcce508b329a91fe5592eb9e45c116a6b7c76f68f52c6f/merged major:0 minor:868 fsType:overlay blockSize:0} overlay_0-870:{mountpoint:/var/lib/containers/storage/overlay/d59f84da9ba009e025bdef07b53823afeaac5be6a825bf005e04271bcdbf4a4f/merged major:0 minor:870 fsType:overlay blockSize:0} overlay_0-88:{mountpoint:/var/lib/containers/storage/overlay/ceb23b0d5e9bafc9aeda612f851daa3be7eec7a186aa63f6dce5fb768ab8c4a2/merged major:0 minor:88 fsType:overlay blockSize:0} overlay_0-889:{mountpoint:/var/lib/containers/storage/overlay/523e0ae46426887d9e5eee096fbcc6e546fd9826390e78a8aa7afa6164bcde3a/merged major:0 minor:889 fsType:overlay blockSize:0} overlay_0-903:{mountpoint:/var/lib/containers/storage/overlay/089b1861c639dbbe9d881edccd80cbf3b15a4562a6f07e187a53195d3d03a964/merged major:0 minor:903 fsType:overlay blockSize:0} 
overlay_0-907:{mountpoint:/var/lib/containers/storage/overlay/9c02858346d9828c4488a756bdb6c6d70fe7e1d539541db6ae4b867dcd71b4f0/merged major:0 minor:907 fsType:overlay blockSize:0} overlay_0-912:{mountpoint:/var/lib/containers/storage/overlay/b1a03ffe7bb548a60ac3fc5caeebc9168b24e6eeca2ae97f8fd89622b0e94b82/merged major:0 minor:912 fsType:overlay blockSize:0} overlay_0-92:{mountpoint:/var/lib/containers/storage/overlay/5d0653deca705fc60b5da5f31925714d733bf52231dcc7f5545317a9f45dcbcb/merged major:0 minor:92 fsType:overlay blockSize:0} overlay_0-930:{mountpoint:/var/lib/containers/storage/overlay/cec62b93b02ddbe48f730803581981b3f79b4467348ca6054863e5e9090f372e/merged major:0 minor:930 fsType:overlay blockSize:0} overlay_0-932:{mountpoint:/var/lib/containers/storage/overlay/bba636199bc6ea355569a35fdc7137d97f6308384f57d05dde87fd6050051051/merged major:0 minor:932 fsType:overlay blockSize:0} overlay_0-934:{mountpoint:/var/lib/containers/storage/overlay/79ac01dadd17e1d5b6d98ee87221f3f5eef39cf648fb0bbf54cedee1de385261/merged major:0 minor:934 fsType:overlay blockSize:0} overlay_0-945:{mountpoint:/var/lib/containers/storage/overlay/8da0cfc6382be52ae85aca714ea911fba1d032e76e3838e0e105bc8065854aa4/merged major:0 minor:945 fsType:overlay blockSize:0} overlay_0-947:{mountpoint:/var/lib/containers/storage/overlay/ad32e22d2b4ef1d6e17029d0d148db4191ba40352a71c59c0d9855de6fbac0ca/merged major:0 minor:947 fsType:overlay blockSize:0} overlay_0-949:{mountpoint:/var/lib/containers/storage/overlay/29309ca10746b711974a8af67113a137b908666933b03e0b4dbe447e50a24990/merged major:0 minor:949 fsType:overlay blockSize:0} overlay_0-951:{mountpoint:/var/lib/containers/storage/overlay/74f755359989f5bd86c93e382dc3b05c84560f70468316d7560fb661e8da87c1/merged major:0 minor:951 fsType:overlay blockSize:0} overlay_0-953:{mountpoint:/var/lib/containers/storage/overlay/ab4fc3c87ffdb18100c5b8e6f38d91184305340c55e24614b3d04ea8bc46a6f6/merged major:0 minor:953 fsType:overlay blockSize:0} overlay_0-955:{mountpoint:/var/lib/containers/storage/overlay/cc37b127ea9cebece083c148e5c1b8c00c5279967c0e44afea4a1bca57778a55/merged major:0 minor:955 fsType:overlay blockSize:0} overlay_0-964:{mountpoint:/var/lib/containers/storage/overlay/18dbe25db66b5a337b848c4f36d771946e746352767b4b3f35e6ba198b53f83e/merged major:0 minor:964 fsType:overlay blockSize:0} overlay_0-982:{mountpoint:/var/lib/containers/storage/overlay/7e8f2c73e0b734265cd8e2ac6f13811a6209c2640f9d9af6178e830ebbf20488/merged major:0 minor:982 fsType:overlay blockSize:0} overlay_0-992:{mountpoint:/var/lib/containers/storage/overlay/122bec111c78ff6d21f34833b1651ea531220c94f7595fa90347f2b15e20bd26/merged major:0 minor:992 fsType:overlay blockSize:0} overlay_0-995:{mountpoint:/var/lib/containers/storage/overlay/dc08c07972cb1ab2aa461ce2e6851416f735dd0bfb22f58f6212f18eb285c62b/merged major:0 minor:995 fsType:overlay blockSize:0} overlay_0-997:{mountpoint:/var/lib/containers/storage/overlay/46164e4c1ffb5fc8aa6452e1e8fcf95971153553090c208b2d4e426f7d9d45fa/merged major:0 minor:997 fsType:overlay blockSize:0} overlay_0-999:{mountpoint:/var/lib/containers/storage/overlay/46de3b49e40cb199c013f1c9b3cdd505ca85cf6f8fe25c889f018c6bf20e6024/merged major:0 minor:999 fsType:overlay blockSize:0}] Dec 04 22:18:56.436924 master-0 kubenswrapper[33572]: I1204 22:18:56.434408 33572 manager.go:217] Machine: {Timestamp:2025-12-04 22:18:56.432564111 +0000 UTC m=+0.160089810 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514153472 SwapCapacity:0 
MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:58e57637271046a9a49cd83dda54d0eb SystemUUID:58e57637-2710-46a9-a49c-d83dda54d0eb BootID:4d17516d-34b9-4c3d-aaa6-c745ecd06d22 Filesystems:[{Device:overlay_0-349 DeviceMajor:0 DeviceMinor:349 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0a8ac4004225e98679de5f00828ef4b72b059bfd913e3c0b107c1aef5ccb1667/userdata/shm DeviceMajor:0 DeviceMinor:1031 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc/userdata/shm DeviceMajor:0 DeviceMinor:143 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:1128 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/6c8c45e0-2342-499b-aa6b-339b6a722a87/volumes/kubernetes.io~projected/kube-api-access-gcgg9 DeviceMajor:0 DeviceMinor:126 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:558 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1393 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:297 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/510a595a-21bf-48fc-85cd-707bc8f5536f/volumes/kubernetes.io~projected/kube-api-access-gfhgj DeviceMajor:0 DeviceMinor:408 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-932 DeviceMajor:0 DeviceMinor:932 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fe41c35d4fc12b10f7c0380ded0175f838a7cb9e3aad0aa5a08446be17e65126/userdata/shm DeviceMajor:0 DeviceMinor:555 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:1235 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:313 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1f66bf17e6a9a3d1673b91cfae48275a45a440300946a8bf5061bac66c63db97/userdata/shm DeviceMajor:0 DeviceMinor:602 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1144 DeviceMajor:0 DeviceMinor:1144 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-147 DeviceMajor:0 DeviceMinor:147 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~projected/kube-api-access-lclkg DeviceMajor:0 DeviceMinor:127 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-343 DeviceMajor:0 DeviceMinor:343 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-617 DeviceMajor:0 DeviceMinor:617 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:1566 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1605 DeviceMajor:0 DeviceMinor:1605 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-840 DeviceMajor:0 DeviceMinor:840 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-543 DeviceMajor:0 DeviceMinor:543 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~projected/kube-api-access-bfklr DeviceMajor:0 DeviceMinor:289 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:563 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/810c363b-a4c7-428d-a2fb-285adc29f477/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:1022 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~projected/kube-api-access-gpksd DeviceMajor:0 DeviceMinor:1123 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1475 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e37d318a-5bf8-46ed-b6de-494102738da7/volumes/kubernetes.io~projected/kube-api-access-r57bb DeviceMajor:0 DeviceMinor:292 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-347 DeviceMajor:0 DeviceMinor:347 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-581 DeviceMajor:0 DeviceMinor:581 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-844 DeviceMajor:0 DeviceMinor:844 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-997 DeviceMajor:0 DeviceMinor:997 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1049 DeviceMajor:0 DeviceMinor:1049 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1451 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-390 DeviceMajor:0 DeviceMinor:390 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~projected/kube-api-access-k8jmv DeviceMajor:0 DeviceMinor:1481 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-522 DeviceMajor:0 DeviceMinor:522 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-706 DeviceMajor:0 DeviceMinor:706 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1414 DeviceMajor:0 DeviceMinor:1414 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-216 DeviceMajor:0 DeviceMinor:216 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-1001 DeviceMajor:0 DeviceMinor:1001 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-118 DeviceMajor:0 DeviceMinor:118 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-431 DeviceMajor:0 DeviceMinor:431 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ebcf83d7998d4cc60b59e0a4ee1b7d80a2c88668db04583317334cbd65922154/userdata/shm DeviceMajor:0 DeviceMinor:226 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/kube-api-access-lr65l DeviceMajor:0 DeviceMinor:317 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-889 DeviceMajor:0 DeviceMinor:889 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-145 DeviceMajor:0 DeviceMinor:145 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-613 DeviceMajor:0 DeviceMinor:613 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/55c4f1e1-1b78-45ec-915d-8055ab3e2786/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:1051 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-219 DeviceMajor:0 DeviceMinor:219 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8322a14628afc897780b45b61f17a00da7b18029f93a7a74b52c8e380031ff4f/userdata/shm DeviceMajor:0 DeviceMinor:1673 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:300 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:621 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1273 DeviceMajor:0 DeviceMinor:1273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-995 DeviceMajor:0 DeviceMinor:995 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1675 DeviceMajor:0 DeviceMinor:1675 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4fb8b7eb82d3c4d48b5f1cf2512cb01480c4c7ee72d7b27dc0bc0ec05cc4c756/userdata/shm DeviceMajor:0 DeviceMinor:455 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4e849604a662b099203e2c576ae634e61f44af1ebad609cae720f6f60b5023c0/userdata/shm DeviceMajor:0 DeviceMinor:1427 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-903 DeviceMajor:0 DeviceMinor:903 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-422 DeviceMajor:0 DeviceMinor:422 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/4f7c6491dc22a01b1d98b832f138731bbe081d987cf8c5e14ed87abcbbcb568a/userdata/shm DeviceMajor:0 DeviceMinor:89 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-567 DeviceMajor:0 DeviceMinor:567 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:562 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ae107ad4-104c-4264-9844-afb3af28b19e/volumes/kubernetes.io~projected/kube-api-access-9gj4j DeviceMajor:0 DeviceMinor:1127 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b/volumes/kubernetes.io~projected/kube-api-access-g2ghk DeviceMajor:0 DeviceMinor:971 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-999 DeviceMajor:0 DeviceMinor:999 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0a726f44-a509-46b3-a6d5-70afe3b55e9f/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1479 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-413 DeviceMajor:0 DeviceMinor:413 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/331b7dcdf8e2d87b8a376fb7581bf5553e5209af627a95ddb5944fe5be12cb6d/userdata/shm DeviceMajor:0 DeviceMinor:1020 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/10dda04dd9f1ca247b35249d5e7333d86ebc7e3902573b98ac28839fb9bcb514/userdata/shm DeviceMajor:0 DeviceMinor:1404 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:287 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c/volumes/kubernetes.io~projected/kube-api-access-w2ndk DeviceMajor:0 DeviceMinor:304 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b86ff0e8-2c72-4dc6-ac55-3c21940d044f/volumes/kubernetes.io~projected/kube-api-access-2vpxd DeviceMajor:0 DeviceMinor:927 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/74197c50-9a41-40e8-9289-c7e6afbd3737/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:46 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d20bd8f7df5ba066210630b0736a496814188135f419c1b214059e7501c8fbf9/userdata/shm DeviceMajor:0 DeviceMinor:187 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:1014 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:1122 Capacity:49335554048 Type:vfs Inodes:6166278 
HasInodes:true} {Device:/var/lib/kubelet/pods/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb/volumes/kubernetes.io~projected/kube-api-access-jc47q DeviceMajor:0 DeviceMinor:1129 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/857d5516010228f43de819bef01594930c5dd807f2fd4da2fb1509f797fa1774/userdata/shm DeviceMajor:0 DeviceMinor:1490 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a/userdata/shm DeviceMajor:0 DeviceMinor:327 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-714 DeviceMajor:0 DeviceMinor:714 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:140 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1467 DeviceMajor:0 DeviceMinor:1467 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-191 DeviceMajor:0 DeviceMinor:191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f19df2e06dca5a2f80ab8037e49629477ed1cac1328bfc7445b4bdab076568fc/userdata/shm DeviceMajor:0 DeviceMinor:993 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1139 DeviceMajor:0 DeviceMinor:1139 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-399 DeviceMajor:0 DeviceMinor:399 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0173b8a7-07b4-407a-80b6-d86754072fd8/volumes/kubernetes.io~projected/kube-api-access-z4cdh DeviceMajor:0 DeviceMinor:475 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-663 DeviceMajor:0 DeviceMinor:663 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1203 DeviceMajor:0 DeviceMinor:1203 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:295 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/7f091088-2166-4026-9fa6-62bd83407edb/volumes/kubernetes.io~projected/kube-api-access-s5t2f DeviceMajor:0 DeviceMinor:310 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-359 DeviceMajor:0 DeviceMinor:359 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae/volumes/kubernetes.io~projected/kube-api-access-mwx5k DeviceMajor:0 DeviceMinor:1120 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-488 DeviceMajor:0 DeviceMinor:488 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~projected/kube-api-access-dvrr5 DeviceMajor:0 DeviceMinor:311 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aab8d5d7d7caaf80016fb84803d68f187962ac87f50be8e340ea0edecd46547b/userdata/shm DeviceMajor:0 
DeviceMinor:1024 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/773c5e8477795f70534d191a1a57bd8d7b1ab26d3d0db825f97bcaae1e3dd144/userdata/shm DeviceMajor:0 DeviceMinor:1402 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1505 DeviceMajor:0 DeviceMinor:1505 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:158 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:557 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1429 DeviceMajor:0 DeviceMinor:1429 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:1003 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-120 DeviceMajor:0 DeviceMinor:120 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:332 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~projected/kube-api-access-7fmp4 DeviceMajor:0 DeviceMinor:627 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~projected/kube-api-access-c7d9j DeviceMajor:0 DeviceMinor:1030 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1186 DeviceMajor:0 DeviceMinor:1186 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-151 DeviceMajor:0 DeviceMinor:151 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1478 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0/userdata/shm DeviceMajor:0 DeviceMinor:341 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f1534e25-7add-46a1-8f4e-0065c232aa4e/volumes/kubernetes.io~projected/kube-api-access-4d7pj DeviceMajor:0 DeviceMinor:923 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-716 DeviceMajor:0 DeviceMinor:716 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1310 DeviceMajor:0 DeviceMinor:1310 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-218 DeviceMajor:0 DeviceMinor:218 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878/userdata/shm DeviceMajor:0 DeviceMinor:323 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-65 DeviceMajor:0 DeviceMinor:65 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-1513 DeviceMajor:0 DeviceMinor:1513 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1601 DeviceMajor:0 DeviceMinor:1601 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-397 DeviceMajor:0 DeviceMinor:397 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cf7bfab376e6cd33db16a88818ab6aaf3d6cee23a9706c34952502952f7ad2f6/userdata/shm DeviceMajor:0 DeviceMinor:799 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1078 DeviceMajor:0 DeviceMinor:1078 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1125 DeviceMajor:0 DeviceMinor:1125 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1288 DeviceMajor:0 DeviceMinor:1288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-778 DeviceMajor:0 DeviceMinor:778 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1392 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-544 DeviceMajor:0 DeviceMinor:544 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-363 DeviceMajor:0 DeviceMinor:363 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-708 DeviceMajor:0 DeviceMinor:708 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a5c2d3b8-41c0-4531-b770-57b7c567fe30/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:762 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/60aac9ad737c32ff74467da7a19ec918b06f0d3f5c0137f4d12c177366392be7/userdata/shm DeviceMajor:0 DeviceMinor:373 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-417 DeviceMajor:0 DeviceMinor:417 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe/userdata/shm DeviceMajor:0 DeviceMinor:64 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~projected/kube-api-access-smtnh DeviceMajor:0 DeviceMinor:1482 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/76fd9f44-4365-4271-8772-025655c50334/volumes/kubernetes.io~projected/kube-api-access-9j8fr DeviceMajor:0 DeviceMinor:142 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b/userdata/shm DeviceMajor:0 DeviceMinor:161 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ce6b5a46-172b-4575-ba22-ff3c6ea4207f/volumes/kubernetes.io~projected/kube-api-access-2cfhv DeviceMajor:0 DeviceMinor:626 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-866 DeviceMajor:0 DeviceMinor:866 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1494 DeviceMajor:0 DeviceMinor:1494 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/fc5560b4f5b75417c091bf8b734a41efa2795ce2d8cceb8a89a66960f1ba3320/userdata/shm DeviceMajor:0 DeviceMinor:596 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d740441407a329d552a22d88957f884c55899842a2703505cfb149663a1e6ff/userdata/shm DeviceMajor:0 DeviceMinor:972 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/volumes/kubernetes.io~projected/kube-api-access-4g7n9 DeviceMajor:0 DeviceMinor:1098 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1086 DeviceMajor:0 DeviceMinor:1086 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1264 DeviceMajor:0 DeviceMinor:1264 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-103 DeviceMajor:0 DeviceMinor:103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~projected/kube-api-access-g5nkh DeviceMajor:0 DeviceMinor:306 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:532 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5598683a-cd32-486d-8839-205829d55cc2/volumes/kubernetes.io~projected/kube-api-access-s2lwr DeviceMajor:0 DeviceMinor:990 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/2d142201-6e77-4828-b86b-05d4144a2f08/volumes/kubernetes.io~projected/kube-api-access-6cblk DeviceMajor:0 DeviceMinor:1029 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-457 DeviceMajor:0 DeviceMinor:457 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 DeviceMinor:1564 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1436 DeviceMajor:0 DeviceMinor:1436 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-722 DeviceMajor:0 DeviceMinor:722 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1375 DeviceMajor:0 DeviceMinor:1375 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-197 DeviceMajor:0 DeviceMinor:197 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~projected/kube-api-access-8w592 DeviceMajor:0 DeviceMinor:294 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/56f25fad-089d-4df6-abb1-10d4c76750f1/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:312 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0/userdata/shm DeviceMajor:0 DeviceMinor:339 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-864 DeviceMajor:0 DeviceMinor:864 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1302 DeviceMajor:0 
DeviceMinor:1302 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:1565 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-268 DeviceMajor:0 DeviceMinor:268 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-541 DeviceMajor:0 DeviceMinor:541 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/addddaac-a31a-4dbf-b78f-87225b11b463/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:588 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/800f436c-145d-4281-8d4d-644ba2cb0ebb/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:966 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1251 DeviceMajor:0 DeviceMinor:1251 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-464 DeviceMajor:0 DeviceMinor:464 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-442 DeviceMajor:0 DeviceMinor:442 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-71 DeviceMajor:0 DeviceMinor:71 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-357 DeviceMajor:0 DeviceMinor:357 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c6a5d14d-0409-4024-b0a8-200fa2594185/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:587 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:592 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-493 DeviceMajor:0 DeviceMinor:493 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~projected/kube-api-access-c6b4p DeviceMajor:0 DeviceMinor:578 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1182 DeviceMajor:0 DeviceMinor:1182 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-378 DeviceMajor:0 DeviceMinor:378 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ce6b5a46-172b-4575-ba22-ff3c6ea4207f/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:560 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1452 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:overlay_0-710 DeviceMajor:0 DeviceMinor:710 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1476 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~projected/kube-api-access-b9z4k DeviceMajor:0 DeviceMinor:1483 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1519 DeviceMajor:0 DeviceMinor:1519 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/28f1828f69f1c4d0d02aa7e938a400a27e1212d35ae3b194af92227cb1a24b54/userdata/shm DeviceMajor:0 DeviceMinor:388 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9ff695d5f754bc1ab4c8a5e3ad28cb942f185e2cf70cdd2d8a2eeb6d3f679b39/userdata/shm DeviceMajor:0 DeviceMinor:1400 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-639 DeviceMajor:0 DeviceMinor:639 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1039 DeviceMajor:0 DeviceMinor:1039 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1486 DeviceMajor:0 DeviceMinor:1486 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/caf36cfc8384a756669c5effc9f040f914b8e0fafbb77841a2ef74350bfc51bf/userdata/shm DeviceMajor:0 DeviceMinor:491 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:848 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1417 DeviceMajor:0 DeviceMinor:1417 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/kube-api-access-m4cct DeviceMajor:0 DeviceMinor:321 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d02c679c1b193ea195c44f77ce5059c11b500930cda814d106399c1a88668f1/userdata/shm DeviceMajor:0 DeviceMinor:538 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-907 DeviceMajor:0 DeviceMinor:907 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1384 DeviceMajor:0 DeviceMinor:1384 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1706 DeviceMajor:0 DeviceMinor:1706 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b5aeee4aff8bc78bf6e36ba938b9609781a2d34417a21d03fdc2c6a101065131/userdata/shm DeviceMajor:0 DeviceMinor:593 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1035 DeviceMajor:0 DeviceMinor:1035 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1540 DeviceMajor:0 DeviceMinor:1540 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1191 DeviceMajor:0 DeviceMinor:1191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1639 DeviceMajor:0 DeviceMinor:1639 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-1630 DeviceMajor:0 DeviceMinor:1630 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b/userdata/shm DeviceMajor:0 DeviceMinor:319 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa/volumes/kubernetes.io~projected/kube-api-access-xdbpk DeviceMajor:0 DeviceMinor:153 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-803 DeviceMajor:0 DeviceMinor:803 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1ed0b431491d7769d0a806c2775a07d37b29bbfb434d8d9d3536f46e64b03c26/userdata/shm DeviceMajor:0 DeviceMinor:1042 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-531 DeviceMajor:0 DeviceMinor:531 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~projected/kube-api-access-qkg8s DeviceMajor:0 DeviceMinor:1395 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1492 DeviceMajor:0 DeviceMinor:1492 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-382 DeviceMajor:0 DeviceMinor:382 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-411 DeviceMajor:0 DeviceMinor:411 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1636 DeviceMajor:0 DeviceMinor:1636 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a8636bd7-fa9e-44b9-82df-9d37b398736d/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:795 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-452 DeviceMajor:0 DeviceMinor:452 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1275 DeviceMajor:0 DeviceMinor:1275 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1406 DeviceMajor:0 DeviceMinor:1406 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-641 DeviceMajor:0 DeviceMinor:641 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:299 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:308 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-361 DeviceMajor:0 DeviceMinor:361 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-846 DeviceMajor:0 DeviceMinor:846 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-374 DeviceMajor:0 DeviceMinor:374 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1335 DeviceMajor:0 DeviceMinor:1335 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5dac8e25-0f51-4c04-929c-060479689a9d/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:73 Capacity:49335554048 Type:vfs 
Inodes:6166278 HasInodes:true} {Device:overlay_0-611 DeviceMajor:0 DeviceMinor:611 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-130 DeviceMajor:0 DeviceMinor:130 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-513 DeviceMajor:0 DeviceMinor:513 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-609 DeviceMajor:0 DeviceMinor:609 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-125 DeviceMajor:0 DeviceMinor:125 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-741 DeviceMajor:0 DeviceMinor:741 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-843 DeviceMajor:0 DeviceMinor:843 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223/userdata/shm DeviceMajor:0 DeviceMinor:337 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7e789ca56ff169f6e94ee218684222ae21d64421f7ece2b51fa33799d9ab1ccc/userdata/shm DeviceMajor:0 DeviceMinor:1070 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2/userdata/shm DeviceMajor:0 DeviceMinor:322 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-504 DeviceMajor:0 DeviceMinor:504 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9c65ccceea882cd7d898803d924bbc08d4f3c8fef9c388b0db802fee0be2d9fc/userdata/shm DeviceMajor:0 DeviceMinor:831 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes/kubernetes.io~projected/kube-api-access-w82st DeviceMajor:0 DeviceMinor:1567 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e065179e-634a-4cbe-bb59-5b01c514e4de/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:301 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:1027 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1294 DeviceMajor:0 DeviceMinor:1294 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-367 DeviceMajor:0 DeviceMinor:367 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/99da9d5b3d27d57501f5191969d7c3ca653c3d4bf3252f476bdc359e5ff9e271/userdata/shm DeviceMajor:0 DeviceMinor:476 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/21f00834ca375d484385e21696e2d8aa4483b916d5393969f0246cfd9dc0471a/userdata/shm DeviceMajor:0 DeviceMinor:761 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/810c363b-a4c7-428d-a2fb-285adc29f477/volumes/kubernetes.io~projected/kube-api-access-2nrj9 DeviceMajor:0 DeviceMinor:1023 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/74197c50-9a41-40e8-9289-c7e6afbd3737/volumes/kubernetes.io~projected/kube-api-access-hq44d DeviceMajor:0 DeviceMinor:47 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/651c0fad-1577-4a7f-8718-ec2fd2f06c3e/volumes/kubernetes.io~projected/kube-api-access-n8gh2 DeviceMajor:0 DeviceMinor:387 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/34cad3de-8f3f-48cd-bd39-8745fad19e65/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:1671 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:298 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/volumes/kubernetes.io~projected/kube-api-access-vd6d8 DeviceMajor:0 DeviceMinor:480 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-511 DeviceMajor:0 DeviceMinor:511 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1684 DeviceMajor:0 DeviceMinor:1684 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376/userdata/shm DeviceMajor:0 DeviceMinor:333 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/bda1cb0d-26cf-4b94-b359-432492112888/volumes/kubernetes.io~projected/kube-api-access-8r2fn DeviceMajor:0 DeviceMinor:1399 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe/userdata/shm DeviceMajor:0 DeviceMinor:1568 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~projected/kube-api-access-4mttq DeviceMajor:0 DeviceMinor:293 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-615 DeviceMajor:0 DeviceMinor:615 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-712 DeviceMajor:0 DeviceMinor:712 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/db48b3f4e1a29b857cceceb534352169660eed12f652161e8b97983c91525c06/userdata/shm DeviceMajor:0 DeviceMinor:1043 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1161 DeviceMajor:0 DeviceMinor:1161 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e/userdata/shm DeviceMajor:0 DeviceMinor:316 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:576 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1142 DeviceMajor:0 DeviceMinor:1142 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-745 DeviceMajor:0 DeviceMinor:745 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b86ff0e8-2c72-4dc6-ac55-3c21940d044f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:926 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3a3d3c261f26f43a69f444c9052f030961843ae0bbf89248b5cd01b597da7064/userdata/shm DeviceMajor:0 DeviceMinor:369 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1517 DeviceMajor:0 DeviceMinor:1517 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-776 DeviceMajor:0 DeviceMinor:776 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cf87cc00ba78c6e3cc8680200b1afa8d433e342bc7744db35e1f64b4a3e5a078/userdata/shm DeviceMajor:0 DeviceMinor:540 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-420 DeviceMajor:0 DeviceMinor:420 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:185 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df11c4a8f3347747aecb87e080a5126c781276285c92708ad28d5159ae4229dc/userdata/shm DeviceMajor:0 DeviceMinor:481 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-934 DeviceMajor:0 DeviceMinor:934 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1110 DeviceMajor:0 DeviceMinor:1110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-518 DeviceMajor:0 DeviceMinor:518 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-345 DeviceMajor:0 DeviceMinor:345 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-607 DeviceMajor:0 DeviceMinor:607 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/813f3ee7-35b5-4ee8-b453-00d16d910eae/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:638 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1306 DeviceMajor:0 DeviceMinor:1306 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~projected/kube-api-access-pctsn DeviceMajor:0 DeviceMinor:1398 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-416 DeviceMajor:0 DeviceMinor:416 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1653 DeviceMajor:0 DeviceMinor:1653 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~projected/kube-api-access-pt2jq DeviceMajor:0 DeviceMinor:1005 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1333 DeviceMajor:0 DeviceMinor:1333 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-53 DeviceMajor:0 DeviceMinor:53 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0beb871c-3bf1-471c-a028-746a650267bf/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:534 Capacity:49335554048 
Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/55c4f1e1-1b78-45ec-915d-8055ab3e2786/volumes/kubernetes.io~projected/kube-api-access-b8g99 DeviceMajor:0 DeviceMinor:1052 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-402 DeviceMajor:0 DeviceMinor:402 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/634c1df6-de4d-4e26-8c71-d39311cae0ce/volumes/kubernetes.io~projected/kube-api-access-xgt75 DeviceMajor:0 DeviceMinor:186 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-546 DeviceMajor:0 DeviceMinor:546 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4d378b74b84c73d0247bb3e1b1ed1084c6d9b778481316b2ed8d047f73fdee53/userdata/shm DeviceMajor:0 DeviceMinor:1041 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-395 DeviceMajor:0 DeviceMinor:395 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a5c2d3b8-41c0-4531-b770-57b7c567fe30/volumes/kubernetes.io~projected/kube-api-access-5vpbl DeviceMajor:0 DeviceMinor:796 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-404 DeviceMajor:0 DeviceMinor:404 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-450 DeviceMajor:0 DeviceMinor:450 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-526 DeviceMajor:0 DeviceMinor:526 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1434 DeviceMajor:0 DeviceMinor:1434 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1496 DeviceMajor:0 DeviceMinor:1496 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1408 DeviceMajor:0 DeviceMinor:1408 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:302 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-636 DeviceMajor:0 DeviceMinor:636 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e7a7f632-2442-4837-b068-c22b03c71fb0/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:1026 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f1534e25-7add-46a1-8f4e-0065c232aa4e/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:922 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ce6002bb-4948-45ab-bb1d-ed65e86b6466/volumes/kubernetes.io~projected/kube-api-access-87gv4 DeviceMajor:0 DeviceMinor:1130 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-635 DeviceMajor:0 DeviceMinor:635 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/479597d06e399852cde3f4983981240e3b9a935772d2dd22d716d20e734ab158/userdata/shm DeviceMajor:0 DeviceMinor:601 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1304 DeviceMajor:0 DeviceMinor:1304 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-964 DeviceMajor:0 DeviceMinor:964 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-1082 DeviceMajor:0 DeviceMinor:1082 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-353 DeviceMajor:0 DeviceMinor:353 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1f07b0c4938e582ee1cf91b0ddf3d74df5a116aa0098efb24e115a5e8116176b/userdata/shm DeviceMajor:0 DeviceMinor:968 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:1097 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1331 DeviceMajor:0 DeviceMinor:1331 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1201 DeviceMajor:0 DeviceMinor:1201 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/eb4d8477-c3b5-4e88-aaa9-222ad56d974c/volumes/kubernetes.io~projected/kube-api-access-2w8vs DeviceMajor:0 DeviceMinor:1453 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-955 DeviceMajor:0 DeviceMinor:955 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1329 DeviceMajor:0 DeviceMinor:1329 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b966c210-5415-4fa5-88ab-c85aba979b28/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1394 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-623 DeviceMajor:0 DeviceMinor:623 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1154 DeviceMajor:0 DeviceMinor:1154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-425 DeviceMajor:0 DeviceMinor:425 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a043ea49-97f9-4ae6-83b9-733f12754d94/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:1033 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1279 DeviceMajor:0 DeviceMinor:1279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1281 DeviceMajor:0 DeviceMinor:1281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-392 DeviceMajor:0 DeviceMinor:392 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4d68dcb1-efe4-425f-9b28-1e5575548a32/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:489 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-982 DeviceMajor:0 DeviceMinor:982 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-170 DeviceMajor:0 DeviceMinor:170 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-244 DeviceMajor:0 DeviceMinor:244 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde/userdata/shm DeviceMajor:0 DeviceMinor:335 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/4d68dcb1-efe4-425f-9b28-1e5575548a32/volumes/kubernetes.io~projected/kube-api-access-r6vjb DeviceMajor:0 DeviceMinor:490 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-833 DeviceMajor:0 DeviceMinor:833 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1268 DeviceMajor:0 DeviceMinor:1268 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/59d3d0d8-1a2a-4d14-8312-d33818acba88/volumes/kubernetes.io~projected/kube-api-access-d4rft DeviceMajor:0 DeviceMinor:159 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-644 DeviceMajor:0 DeviceMinor:644 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-945 DeviceMajor:0 DeviceMinor:945 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1542 DeviceMajor:0 DeviceMinor:1542 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3aa682501427a1a26306dd6e6ffbe29276935fb92a5916c957736c383157a162/userdata/shm DeviceMajor:0 DeviceMinor:132 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a544105a-5bec-456a-aef6-c160943c1f67/volumes/kubernetes.io~projected/kube-api-access-scht6 DeviceMajor:0 DeviceMinor:309 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-870 DeviceMajor:0 DeviceMinor:870 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:1013 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c178afcf-b713-4c74-b22b-6169ba3123f5/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:1396 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-409 DeviceMajor:0 DeviceMinor:409 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-105 DeviceMajor:0 DeviceMinor:105 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-801 DeviceMajor:0 DeviceMinor:801 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6684358b-d7a6-4396-9b4f-ea67d85e4517/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:141 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-992 DeviceMajor:0 DeviceMinor:992 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141/volumes/kubernetes.io~projected/kube-api-access-bfcv9 DeviceMajor:0 DeviceMinor:564 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:561 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/29828f55-427b-4fe3-8713-03bcd6ac9dec/volumes/kubernetes.io~projected/kube-api-access-t9rxt DeviceMajor:0 DeviceMinor:1124 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-406 DeviceMajor:0 DeviceMinor:406 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1290 DeviceMajor:0 DeviceMinor:1290 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1480 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/871cb002-67f4-43aa-a41d-7a5b2f340059/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:124 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:288 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:296 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~projected/kube-api-access-kcw8f DeviceMajor:0 DeviceMinor:305 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fb0274dc-fac1-41f9-b3e5-77253d851fdf/volumes/kubernetes.io~projected/kube-api-access-r4czl DeviceMajor:0 DeviceMinor:625 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/56273487d972eb4bce7bb2a2d532d0ecc4790cf320f72d236deb00d6ee12d734/userdata/shm DeviceMajor:0 DeviceMinor:901 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-949 DeviceMajor:0 DeviceMinor:949 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-912 DeviceMajor:0 DeviceMinor:912 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:156 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-260 DeviceMajor:0 DeviceMinor:260 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2d1e4c21a00903c83707b4497502b9b73283be3d30ce9a5aaab7e54bf8f72dba/userdata/shm DeviceMajor:0 DeviceMinor:928 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5598683a-cd32-486d-8839-205829d55cc2/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:989 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f09554581756c0f7969c0e40141de26184513807dc2c20e4d5041730923d9d0c/userdata/shm DeviceMajor:0 DeviceMinor:747 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-548 DeviceMajor:0 DeviceMinor:548 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1515 DeviceMajor:0 DeviceMinor:1515 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1570 DeviceMajor:0 DeviceMinor:1570 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4f2b4d655e0bf0ff49528fd7206e3f4bbadf9c2ae53c24cb531027c7c4811ac5/userdata/shm DeviceMajor:0 DeviceMinor:108 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-236 DeviceMajor:0 DeviceMinor:236 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:556 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c9c2e653f3b9114eb61aa0f9377a8f57b59a4f433c04f9168d0a7788bc429f4a/userdata/shm DeviceMajor:0 DeviceMinor:67 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1271 DeviceMajor:0 DeviceMinor:1271 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-380 DeviceMajor:0 DeviceMinor:380 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-205 DeviceMajor:0 DeviceMinor:205 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-632 DeviceMajor:0 DeviceMinor:632 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-733 DeviceMajor:0 DeviceMinor:733 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1066 DeviceMajor:0 DeviceMinor:1066 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1101 DeviceMajor:0 DeviceMinor:1101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/17912746-74eb-4c78-8c1b-2f66e7ce4299/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1477 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1688 DeviceMajor:0 DeviceMinor:1688 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fbb8e73f-7e50-451b-b400-e88a86b51e09/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:577 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-788 DeviceMajor:0 DeviceMinor:788 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-827 DeviceMajor:0 DeviceMinor:827 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/09c1c58555576a7a5ad0c40263cb1b24f2532f6f0895e3889a790f41c5622cf5/userdata/shm DeviceMajor:0 DeviceMinor:943 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/73496f020ec19048256b7ee616b5604b8f6faef21ddc2795a2639ad6cafa0a2c/userdata/shm DeviceMajor:0 DeviceMinor:1488 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-252 DeviceMajor:0 DeviceMinor:252 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c83e316239457de6d2cf065ee11c69192c6233457017b9e9bdae1e03d84ad9fc/userdata/shm DeviceMajor:0 DeviceMinor:536 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1047 DeviceMajor:0 DeviceMinor:1047 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1152 DeviceMajor:0 DeviceMinor:1152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/651c0fad-1577-4a7f-8718-ec2fd2f06c3e/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:386 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1588 DeviceMajor:0 DeviceMinor:1588 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1677 DeviceMajor:0 DeviceMinor:1677 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c3863c74-8f22-4c67-bef5-2d0d39df4abd/volumes/kubernetes.io~projected/kube-api-access-pc4z5 
DeviceMajor:0 DeviceMinor:732 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1270 DeviceMajor:0 DeviceMinor:1270 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1556 DeviceMajor:0 DeviceMinor:1556 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1593 DeviceMajor:0 DeviceMinor:1593 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161/volumes/kubernetes.io~projected/kube-api-access-9wh6b DeviceMajor:0 DeviceMinor:157 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-183 DeviceMajor:0 DeviceMinor:183 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b9ac4ee53782e9fd4b340ed2b43fd3025db3cb82bd0881252f116248836951ce/userdata/shm DeviceMajor:0 DeviceMinor:600 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8f9c23aa8f546cd0849cf585c0cd6540010999bcb6db0df49677dfec81935af9/userdata/shm DeviceMajor:0 DeviceMinor:1237 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1431 DeviceMajor:0 DeviceMinor:1431 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5dac8e25-0f51-4c04-929c-060479689a9d/volumes/kubernetes.io~projected/kube-api-access-ch6s4 DeviceMajor:0 DeviceMinor:146 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-947 DeviceMajor:0 DeviceMinor:947 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5af5cfe128eaa351f012440567883f2b0f5ad3e1b0e50ea2b67166561450dd28/userdata/shm DeviceMajor:0 DeviceMinor:554 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/967bf4ac-f025-4296-8ed9-183a345f6b7c/volumes/kubernetes.io~projected/kube-api-access-hsk29 DeviceMajor:0 DeviceMinor:1015 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-728 DeviceMajor:0 DeviceMinor:728 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1420 DeviceMajor:0 DeviceMinor:1420 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-520 DeviceMajor:0 DeviceMinor:520 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-483 DeviceMajor:0 DeviceMinor:483 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/989a73ce-3898-4f65-a437-2c7061f9375f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:559 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b/userdata/shm DeviceMajor:0 DeviceMinor:330 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-384 DeviceMajor:0 DeviceMinor:384 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c382efd8856765d7e3a7c1f5148c4c397e023bc0d7fa9282b9bc8277f0af2687/userdata/shm DeviceMajor:0 DeviceMinor:160 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-223 DeviceMajor:0 DeviceMinor:223 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-473 DeviceMajor:0 DeviceMinor:473 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:590 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1084 DeviceMajor:0 DeviceMinor:1084 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-365 DeviceMajor:0 DeviceMinor:365 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/264531cb97973b0deb400a67899ce39a8e7e6bd105e2fd0acd10b7958dc4add3/userdata/shm DeviceMajor:0 DeviceMinor:924 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a043ea49-97f9-4ae6-83b9-733f12754d94/volumes/kubernetes.io~projected/kube-api-access-6jlvp DeviceMajor:0 DeviceMinor:1034 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1412 DeviceMajor:0 DeviceMinor:1412 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/800f436c-145d-4281-8d4d-644ba2cb0ebb/volumes/kubernetes.io~projected/kube-api-access-ngkqz DeviceMajor:0 DeviceMinor:967 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-646 DeviceMajor:0 DeviceMinor:646 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1292 DeviceMajor:0 DeviceMinor:1292 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/650ca7f20c2d1cb1f57ba5643ad53b21f17eea7d93316d18d3c9ccbd27770c35/userdata/shm DeviceMajor:0 DeviceMinor:598 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a3899a38-39b8-4b48-81e5-4d8854ecc8ab/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:1004 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c33c6e351b6426a43cd389bbd81cef5f132f38999fc440de5ea48da556537499/userdata/shm DeviceMajor:0 DeviceMinor:1111 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/de08a4b22951aecb57c029d3a74e638dc3b7212560569cf21e54820113aad20f/userdata/shm DeviceMajor:0 DeviceMinor:1397 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-168 DeviceMajor:0 DeviceMinor:168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ceb419e4-d804-4111-b8d8-8436cc2ee617/volumes/kubernetes.io~projected/kube-api-access-c7l9n DeviceMajor:0 DeviceMinor:307 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4fc051e954a566d97cf4dcb3626713517bc5479301f571be1eec860a1f2d884c/userdata/shm DeviceMajor:0 DeviceMinor:435 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-849 DeviceMajor:0 DeviceMinor:849 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1313 DeviceMajor:0 DeviceMinor:1313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/465637a4-42be-4a65-a859-7af699960138/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:286 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-774 DeviceMajor:0 DeviceMinor:774 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f893663c-7c1e-4eda-9839-99c1c0440304/volumes/kubernetes.io~projected/kube-api-access-g8d54 DeviceMajor:0 
DeviceMinor:290 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-351 DeviceMajor:0 DeviceMinor:351 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-462 DeviceMajor:0 DeviceMinor:462 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c3863c74-8f22-4c67-bef5-2d0d39df4abd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:506 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:1121 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0c849ebda1ef05c2e7568afd8bbf5411d8e51e42f17fd972708d247af11d0983/userdata/shm DeviceMajor:0 DeviceMinor:1484 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/46229484-5fa1-4595-94a0-44477abae90e/volumes/kubernetes.io~projected/kube-api-access-jwk6f DeviceMajor:0 DeviceMinor:291 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1099 DeviceMajor:0 DeviceMinor:1099 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-624 DeviceMajor:0 DeviceMinor:624 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e976d29655bd6a4804e44aefca912f97ea12da492eb732dd0d56f5c8ee61e225/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-355 DeviceMajor:0 DeviceMinor:355 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8ccadfcf02bee77f4b3f98d491b1ee8f4b7c03cb21fbe9104543e0f3a4c0e10a/userdata/shm DeviceMajor:0 DeviceMinor:1132 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1277 DeviceMajor:0 DeviceMinor:1277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-112 DeviceMajor:0 DeviceMinor:112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-163 DeviceMajor:0 DeviceMinor:163 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4c52307b147fc1f96631f9272147cbdbb3ffe8d871369692fc386dc96586c86f/userdata/shm DeviceMajor:0 DeviceMinor:677 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-868 DeviceMajor:0 DeviceMinor:868 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee/volumes/kubernetes.io~projected/kube-api-access-xkrvr DeviceMajor:0 DeviceMinor:1236 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f/userdata/shm DeviceMajor:0 DeviceMinor:314 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-951 DeviceMajor:0 DeviceMinor:951 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-225 DeviceMajor:0 DeviceMinor:225 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-495 DeviceMajor:0 DeviceMinor:495 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-930 DeviceMajor:0 DeviceMinor:930 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-953 DeviceMajor:0 DeviceMinor:953 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1308 DeviceMajor:0 DeviceMinor:1308 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-723 DeviceMajor:0 DeviceMinor:723 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-829 DeviceMajor:0 DeviceMinor:829 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/34cad3de-8f3f-48cd-bd39-8745fad19e65/volumes/kubernetes.io~projected/kube-api-access-tq55v DeviceMajor:0 DeviceMinor:1672 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-172 DeviceMajor:0 DeviceMinor:172 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-478 DeviceMajor:0 DeviceMinor:478 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-675 DeviceMajor:0 DeviceMinor:675 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2d142201-6e77-4828-b86b-05d4144a2f08/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:1028 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-579 DeviceMajor:0 DeviceMinor:579 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a8636bd7-fa9e-44b9-82df-9d37b398736d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:792 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1410 DeviceMajor:0 DeviceMinor:1410 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-424 DeviceMajor:0 DeviceMinor:424 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/690b447a-19c0-4925-bc9d-d0c86a83a377/volumes/kubernetes.io~projected/kube-api-access-wsxkk DeviceMajor:0 DeviceMinor:325 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/24648a41-875f-4e98-8b21-3bdd38dffa32/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:329 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c2279404-fa75-4de2-a302-d7b15ead5232/volumes/kubernetes.io~projected/kube-api-access-dd5zx DeviceMajor:0 DeviceMinor:760 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1037 DeviceMajor:0 DeviceMinor:1037 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-725 DeviceMajor:0 DeviceMinor:725 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1598 DeviceMajor:0 DeviceMinor:1598 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-92 DeviceMajor:0 DeviceMinor:92 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1088 DeviceMajor:0 DeviceMinor:1088 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-454 DeviceMajor:0 DeviceMinor:454 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-88 DeviceMajor:0 
DeviceMinor:88 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:586 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/35821f48-b000-4915-847f-a739b6efc5ee/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:589 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-720 DeviceMajor:0 DeviceMinor:720 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1572 DeviceMajor:0 DeviceMinor:1572 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-149 DeviceMajor:0 DeviceMinor:149 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0a8ac4004225e98 MacAddress:56:63:12:c5:cb:ce Speed:10000 Mtu:8900} {Name:1ed0b431491d776 MacAddress:f6:53:8b:dd:5b:66 Speed:10000 Mtu:8900} {Name:1f07b0c4938e582 MacAddress:9e:47:26:ca:11:65 Speed:10000 Mtu:8900} {Name:1f66bf17e6a9a3d MacAddress:76:82:fe:b0:35:38 Speed:10000 Mtu:8900} {Name:264531cb97973b0 MacAddress:62:1e:66:d6:e9:ee Speed:10000 Mtu:8900} {Name:28f1828f69f1c4d MacAddress:12:29:0c:78:45:9e Speed:10000 Mtu:8900} {Name:2d1e4c21a00903c MacAddress:de:68:24:ce:38:8a Speed:10000 Mtu:8900} {Name:331b7dcdf8e2d87 MacAddress:42:59:24:91:80:ae Speed:10000 Mtu:8900} {Name:3a3d3c261f26f43 MacAddress:e6:04:e9:a7:4f:3e Speed:10000 Mtu:8900} {Name:479597d06e39985 MacAddress:7e:15:78:23:33:9d Speed:10000 Mtu:8900} {Name:4c52307b147fc1f MacAddress:d2:8e:47:61:af:5a Speed:10000 Mtu:8900} {Name:4d378b74b84c73d MacAddress:e2:e8:80:fb:43:7b Speed:10000 Mtu:8900} {Name:4fc051e954a566d MacAddress:0a:24:da:ef:ef:8b Speed:10000 Mtu:8900} {Name:51d6ac9cf90d7c8 MacAddress:26:5c:e7:61:a9:42 Speed:10000 Mtu:8900} {Name:53415730f490fe2 MacAddress:e2:f4:f7:19:81:55 Speed:10000 Mtu:8900} {Name:56273487d972eb4 MacAddress:ca:c2:81:4d:83:d5 Speed:10000 Mtu:8900} {Name:58c253cb55a2596 MacAddress:22:61:bb:31:8a:a0 Speed:10000 Mtu:8900} {Name:5af5cfe128eaa35 MacAddress:f2:2c:4f:7d:2f:d6 Speed:10000 Mtu:8900} {Name:60aac9ad737c32f MacAddress:5a:d5:0e:9a:cd:40 Speed:10000 Mtu:8900} {Name:650ca7f20c2d1cb MacAddress:6a:25:59:ce:49:47 Speed:10000 Mtu:8900} {Name:66619c69e4b8475 MacAddress:5e:d1:6c:51:a3:fb Speed:10000 Mtu:8900} {Name:6bf343216707410 MacAddress:46:e7:ec:ee:39:79 Speed:10000 Mtu:8900} {Name:73496f020ec1904 MacAddress:aa:45:95:72:bb:a5 Speed:10000 Mtu:8900} {Name:773c5e8477795f7 MacAddress:aa:ef:f2:76:6e:b5 Speed:10000 Mtu:8900} {Name:7d02c679c1b193e MacAddress:f6:1e:7d:10:3d:c9 Speed:10000 Mtu:8900} {Name:7d740441407a329 MacAddress:9a:47:72:9b:0a:a8 Speed:10000 Mtu:8900} {Name:7e789ca56ff169f MacAddress:ee:e2:da:a3:96:02 Speed:10000 Mtu:8900} {Name:8322a14628afc89 MacAddress:f2:7a:f1:4a:52:45 Speed:10000 Mtu:8900} {Name:857d5516010228f MacAddress:06:16:fb:fc:89:70 Speed:10000 Mtu:8900} {Name:8688dbdb4594010 MacAddress:3a:c1:d5:1f:38:4a Speed:10000 Mtu:8900} {Name:8f9c23aa8f546cd MacAddress:56:c8:6c:91:eb:7f Speed:10000 Mtu:8900} {Name:969ff9f89143902 MacAddress:56:fd:2b:03:22:7e 
Speed:10000 Mtu:8900} {Name:99da9d5b3d27d57 MacAddress:72:bf:fd:c3:16:95 Speed:10000 Mtu:8900} {Name:9c65ccceea882cd MacAddress:2e:73:61:1e:e5:eb Speed:10000 Mtu:8900} {Name:9ff695d5f754bc1 MacAddress:82:aa:c8:92:86:44 Speed:10000 Mtu:8900} {Name:aa68f9d56263db1 MacAddress:fe:de:75:bb:75:ec Speed:10000 Mtu:8900} {Name:aab8d5d7d7caaf8 MacAddress:de:05:81:59:ce:54 Speed:10000 Mtu:8900} {Name:ab0050370c98df5 MacAddress:ee:f1:e4:ef:79:b8 Speed:10000 Mtu:8900} {Name:b5aeee4aff8bc78 MacAddress:82:6c:84:cf:81:22 Speed:10000 Mtu:8900} {Name:b9ac4ee53782e9f MacAddress:9a:64:e2:4c:61:1e Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:12:f7:6c:8f:54:65 Speed:0 Mtu:8900} {Name:c33c6e351b6426a MacAddress:96:43:e5:0c:95:1f Speed:10000 Mtu:8900} {Name:c83e316239457de MacAddress:a2:6f:7c:9f:b9:ff Speed:10000 Mtu:8900} {Name:c9c2e653f3b9114 MacAddress:aa:dd:1c:0f:a4:cd Speed:10000 Mtu:8900} {Name:caf36cfc8384a75 MacAddress:da:e6:02:44:7d:9e Speed:10000 Mtu:8900} {Name:cf87cc00ba78c6e MacAddress:0e:f4:85:36:09:38 Speed:10000 Mtu:8900} {Name:db48b3f4e1a29b8 MacAddress:26:5d:7f:54:e5:da Speed:10000 Mtu:8900} {Name:de08a4b22951aec MacAddress:d6:7d:a8:b8:e3:63 Speed:10000 Mtu:8900} {Name:df11c4a8f334774 MacAddress:82:3e:b5:f0:64:e5 Speed:10000 Mtu:8900} {Name:e18ea7a7e8b99e9 MacAddress:86:ec:6f:fd:b3:7d Speed:10000 Mtu:8900} {Name:ea7c4bd82fb1342 MacAddress:6e:9c:b3:97:f2:28 Speed:10000 Mtu:8900} {Name:ebcf83d7998d4cc MacAddress:3a:5e:40:cc:25:c1 Speed:10000 Mtu:8900} {Name:ee18ee29a901424 MacAddress:06:ab:1a:57:6f:fd Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:65:18:02 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:40:5b:43 Speed:-1 Mtu:9000} {Name:f19df2e06dca5a2 MacAddress:3e:70:e8:66:cd:d8 Speed:10000 Mtu:8900} {Name:faca5825f225200 MacAddress:6e:35:7a:fe:5b:ea Speed:10000 Mtu:8900} {Name:fc5560b4f5b7541 MacAddress:12:66:2c:6c:ef:1e Speed:10000 Mtu:8900} {Name:fe41c35d4fc12b1 MacAddress:c2:8a:fe:5b:40:38 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:c6:7e:e9:15:bf:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data 
Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 04 22:18:56.437574 master-0 kubenswrapper[33572]: I1204 22:18:56.436929 33572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
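The MachineInfo dump above reports every Capacity, Size, and Memory field as a raw byte count, which makes the figures hard to eyeball. A minimal sanity-check sketch, assuming those fields are plain bytes as cAdvisor prints them (the numbers below are copied from the dump; nothing here is new data):

```python
# Convert the byte figures from the cAdvisor MachineInfo dump into GiB.
# Assumption: Capacity/Size/Memory are plain byte counts, as printed above.

GiB = 1 << 30

figures = {
    "vda (DiskMap Size)":        214_748_364_800,  # root disk
    "vdb-vde (DiskMap Size)":     21_474_836_480,  # secondary disks
    "overlay_* Capacity":        214_143_315_968,  # container overlay mounts
    "secret/projected Capacity":  49_335_554_048,  # tmpfs-backed volumes
    "NUMA node 0 Memory":         50_514_153_472,
}

for name, value in figures.items():
    print(f"{name:28s} {value / GiB:8.2f} GiB")

# vda (DiskMap Size)             200.00 GiB
# vdb-vde (DiskMap Size)          20.00 GiB
# overlay_* Capacity             199.44 GiB
# secret/projected Capacity       45.95 GiB
# NUMA node 0 Memory              47.04 GiB
```

The exactly round 200 GiB and 20 GiB results line up with the vda and vdb-vde disk entries, while the overlay mounts report the usable root-filesystem capacity (~199.4 GiB) and the tmpfs-backed secret/projected volumes report ~45.9 GiB.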
Dec 04 22:18:56.437574 master-0 kubenswrapper[33572]: I1204 22:18:56.437011 33572 manager.go:233] Version: {KernelVersion:5.14.0-427.100.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511170715-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 04 22:18:56.437574 master-0 kubenswrapper[33572]: I1204 22:18:56.437305 33572 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 04 22:18:56.438111 master-0 kubenswrapper[33572]: I1204 22:18:56.437596 33572 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 04 22:18:56.438111 master-0 kubenswrapper[33572]: I1204 22:18:56.437644 33572 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 04 22:18:56.438111 master-0 kubenswrapper[33572]: I1204 22:18:56.438004 33572 topology_manager.go:138] "Creating topology manager with none policy" Dec 04 22:18:56.438111 master-0 kubenswrapper[33572]: I1204 22:18:56.438017 33572 container_manager_linux.go:303] "Creating device plugin manager" Dec 04 22:18:56.438111 master-0 kubenswrapper[33572]: I1204 22:18:56.438028 33572 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 22:18:56.438111 master-0 kubenswrapper[33572]: I1204 22:18:56.438057 33572 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 22:18:56.438111 master-0 kubenswrapper[33572]: I1204 22:18:56.438103 33572 state_mem.go:36] "Initialized new in-memory state store" Dec 04 22:18:56.438364 master-0 kubenswrapper[33572]: I1204 22:18:56.438209 33572 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 04 22:18:56.438364 master-0 kubenswrapper[33572]: I1204 22:18:56.438299 33572 kubelet.go:418] "Attempting to sync node 
with API server" Dec 04 22:18:56.438364 master-0 kubenswrapper[33572]: I1204 22:18:56.438313 33572 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 04 22:18:56.438364 master-0 kubenswrapper[33572]: I1204 22:18:56.438331 33572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 04 22:18:56.438364 master-0 kubenswrapper[33572]: I1204 22:18:56.438347 33572 kubelet.go:324] "Adding apiserver pod source" Dec 04 22:18:56.438364 master-0 kubenswrapper[33572]: I1204 22:18:56.438367 33572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 04 22:18:56.440000 master-0 kubenswrapper[33572]: I1204 22:18:56.439957 33572 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 04 22:18:56.440185 master-0 kubenswrapper[33572]: I1204 22:18:56.440157 33572 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 04 22:18:56.440497 master-0 kubenswrapper[33572]: I1204 22:18:56.440472 33572 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 04 22:18:56.440705 master-0 kubenswrapper[33572]: I1204 22:18:56.440680 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 04 22:18:56.440754 master-0 kubenswrapper[33572]: I1204 22:18:56.440708 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 04 22:18:56.440754 master-0 kubenswrapper[33572]: I1204 22:18:56.440719 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 04 22:18:56.440754 master-0 kubenswrapper[33572]: I1204 22:18:56.440728 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 04 22:18:56.440754 master-0 kubenswrapper[33572]: I1204 22:18:56.440738 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 04 22:18:56.440754 master-0 kubenswrapper[33572]: I1204 22:18:56.440746 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 04 22:18:56.440754 master-0 kubenswrapper[33572]: I1204 22:18:56.440756 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 04 22:18:56.440966 master-0 kubenswrapper[33572]: I1204 22:18:56.440764 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 04 22:18:56.440966 master-0 kubenswrapper[33572]: I1204 22:18:56.440776 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 04 22:18:56.440966 master-0 kubenswrapper[33572]: I1204 22:18:56.440785 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 04 22:18:56.440966 master-0 kubenswrapper[33572]: I1204 22:18:56.440814 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 04 22:18:56.440966 master-0 kubenswrapper[33572]: I1204 22:18:56.440831 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 04 22:18:56.440966 master-0 kubenswrapper[33572]: I1204 22:18:56.440873 33572 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 04 22:18:56.441301 master-0 kubenswrapper[33572]: I1204 22:18:56.441284 33572 server.go:1280] "Started kubelet" Dec 04 22:18:56.442315 master-0 kubenswrapper[33572]: I1204 22:18:56.442048 33572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 04 22:18:56.442483 
master-0 systemd[1]: Started Kubernetes Kubelet. Dec 04 22:18:56.442708 master-0 kubenswrapper[33572]: I1204 22:18:56.442516 33572 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 04 22:18:56.443028 master-0 kubenswrapper[33572]: I1204 22:18:56.442829 33572 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 04 22:18:56.443211 master-0 kubenswrapper[33572]: I1204 22:18:56.443120 33572 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 04 22:18:56.461707 master-0 kubenswrapper[33572]: I1204 22:18:56.461637 33572 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 22:18:56.463809 master-0 kubenswrapper[33572]: I1204 22:18:56.463784 33572 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 22:18:56.464694 master-0 kubenswrapper[33572]: I1204 22:18:56.464654 33572 server.go:449] "Adding debug handlers to kubelet server" Dec 04 22:18:56.471797 master-0 kubenswrapper[33572]: E1204 22:18:56.470037 33572 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Dec 04 22:18:56.476174 master-0 kubenswrapper[33572]: I1204 22:18:56.476146 33572 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 04 22:18:56.477010 master-0 kubenswrapper[33572]: I1204 22:18:56.476968 33572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 04 22:18:56.477278 master-0 kubenswrapper[33572]: I1204 22:18:56.477225 33572 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-05 21:50:08 +0000 UTC, rotation deadline is 2025-12-05 18:57:01.649648816 +0000 UTC Dec 04 22:18:56.477372 master-0 kubenswrapper[33572]: I1204 22:18:56.477357 33572 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h38m5.172296265s for next certificate rotation Dec 04 22:18:56.477483 master-0 kubenswrapper[33572]: I1204 22:18:56.477449 33572 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 04 22:18:56.477483 master-0 kubenswrapper[33572]: I1204 22:18:56.477480 33572 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 04 22:18:56.477674 master-0 kubenswrapper[33572]: I1204 22:18:56.477652 33572 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 04 22:18:56.480474 master-0 kubenswrapper[33572]: I1204 22:18:56.480442 33572 factory.go:153] Registering CRI-O factory Dec 04 22:18:56.480556 master-0 kubenswrapper[33572]: I1204 22:18:56.480476 33572 factory.go:221] Registration of the crio container factory successfully Dec 04 22:18:56.480630 master-0 kubenswrapper[33572]: I1204 22:18:56.480608 33572 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 04 22:18:56.480694 master-0 kubenswrapper[33572]: I1204 22:18:56.480631 33572 factory.go:55] Registering systemd factory Dec 04 22:18:56.480694 master-0 kubenswrapper[33572]: I1204 22:18:56.480644 33572 factory.go:221] Registration of the systemd container factory successfully Dec 04 22:18:56.480694 master-0 kubenswrapper[33572]: I1204 22:18:56.480674 33572 factory.go:103] Registering Raw factory Dec 04 22:18:56.480799 master-0 
kubenswrapper[33572]: I1204 22:18:56.480699 33572 manager.go:1196] Started watching for new ooms in manager Dec 04 22:18:56.481584 master-0 kubenswrapper[33572]: I1204 22:18:56.481560 33572 manager.go:319] Starting recovery of all containers Dec 04 22:18:56.488120 master-0 kubenswrapper[33572]: I1204 22:18:56.487942 33572 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 22:18:56.494872 master-0 kubenswrapper[33572]: I1204 22:18:56.494770 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d142201-6e77-4828-b86b-05d4144a2f08" volumeName="kubernetes.io/projected/2d142201-6e77-4828-b86b-05d4144a2f08-kube-api-access-6cblk" seLinuxMountContext="" Dec 04 22:18:56.494988 master-0 kubenswrapper[33572]: I1204 22:18:56.494872 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" volumeName="kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.494988 master-0 kubenswrapper[33572]: I1204 22:18:56.494892 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae107ad4-104c-4264-9844-afb3af28b19e" volumeName="kubernetes.io/projected/ae107ad4-104c-4264-9844-afb3af28b19e-kube-api-access-9gj4j" seLinuxMountContext="" Dec 04 22:18:56.494988 master-0 kubenswrapper[33572]: I1204 22:18:56.494912 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c6a5d14d-0409-4024-b0a8-200fa2594185" volumeName="kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca" seLinuxMountContext="" Dec 04 22:18:56.494988 master-0 kubenswrapper[33572]: I1204 22:18:56.494930 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c6a5d14d-0409-4024-b0a8-200fa2594185" volumeName="kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics" seLinuxMountContext="" Dec 04 22:18:56.494988 master-0 kubenswrapper[33572]: I1204 22:18:56.494947 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.494988 master-0 kubenswrapper[33572]: I1204 22:18:56.494962 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34cad3de-8f3f-48cd-bd39-8745fad19e65" volumeName="kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs" seLinuxMountContext="" Dec 04 22:18:56.494988 master-0 kubenswrapper[33572]: I1204 22:18:56.494978 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" volumeName="kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log" seLinuxMountContext="" Dec 04 22:18:56.495179 master-0 kubenswrapper[33572]: I1204 22:18:56.494997 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a544105a-5bec-456a-aef6-c160943c1f67" volumeName="kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config" seLinuxMountContext="" Dec 04 22:18:56.495179 master-0 kubenswrapper[33572]: I1204 22:18:56.495013 33572 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="17912746-74eb-4c78-8c1b-2f66e7ce4299" volumeName="kubernetes.io/projected/17912746-74eb-4c78-8c1b-2f66e7ce4299-kube-api-access-smtnh" seLinuxMountContext="" Dec 04 22:18:56.495179 master-0 kubenswrapper[33572]: I1204 22:18:56.495031 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56f25fad-089d-4df6-abb1-10d4c76750f1" volumeName="kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access" seLinuxMountContext="" Dec 04 22:18:56.495179 master-0 kubenswrapper[33572]: I1204 22:18:56.495078 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5dac8e25-0f51-4c04-929c-060479689a9d" volumeName="kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls" seLinuxMountContext="" Dec 04 22:18:56.495179 master-0 kubenswrapper[33572]: I1204 22:18:56.495097 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c178afcf-b713-4c74-b22b-6169ba3123f5" volumeName="kubernetes.io/projected/c178afcf-b713-4c74-b22b-6169ba3123f5-kube-api-access-pctsn" seLinuxMountContext="" Dec 04 22:18:56.495179 master-0 kubenswrapper[33572]: I1204 22:18:56.495116 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e065179e-634a-4cbe-bb59-5b01c514e4de" volumeName="kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config" seLinuxMountContext="" Dec 04 22:18:56.495179 master-0 kubenswrapper[33572]: I1204 22:18:56.495132 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="512ba6af-11ad-4217-a1ce-a2ab3ef67ec5" volumeName="kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh" seLinuxMountContext="" Dec 04 22:18:56.495179 master-0 kubenswrapper[33572]: I1204 22:18:56.495146 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56f25fad-089d-4df6-abb1-10d4c76750f1" volumeName="kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495163 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5dac8e25-0f51-4c04-929c-060479689a9d" volumeName="kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495210 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5dac8e25-0f51-4c04-929c-060479689a9d" volumeName="kubernetes.io/projected/5dac8e25-0f51-4c04-929c-060479689a9d-kube-api-access-ch6s4" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495230 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="addddaac-a31a-4dbf-b78f-87225b11b463" volumeName="kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495245 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" volumeName="kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert" 
seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495263 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0beb871c-3bf1-471c-a028-746a650267bf" volumeName="kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495281 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24648a41-875f-4e98-8b21-3bdd38dffa32" volumeName="kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495298 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f091088-2166-4026-9fa6-62bd83407edb" volumeName="kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495316 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e" volumeName="kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495331 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" volumeName="kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495346 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" volumeName="kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495367 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" volumeName="kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config" seLinuxMountContext="" Dec 04 22:18:56.495405 master-0 kubenswrapper[33572]: I1204 22:18:56.495385 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a8636bd7-fa9e-44b9-82df-9d37b398736d" volumeName="kubernetes.io/configmap/a8636bd7-fa9e-44b9-82df-9d37b398736d-service-ca" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495430 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74197c50-9a41-40e8-9289-c7e6afbd3737" volumeName="kubernetes.io/projected/74197c50-9a41-40e8-9289-c7e6afbd3737-kube-api-access-hq44d" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495446 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" volumeName="kubernetes.io/empty-dir/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-tmpfs" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495464 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="465637a4-42be-4a65-a859-7af699960138" volumeName="kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495483 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5598683a-cd32-486d-8839-205829d55cc2" volumeName="kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495506 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" volumeName="kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495526 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495711 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0274dc-fac1-41f9-b3e5-77253d851fdf" volumeName="kubernetes.io/empty-dir/fb0274dc-fac1-41f9-b3e5-77253d851fdf-cache" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495729 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9" volumeName="kubernetes.io/projected/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-kube-api-access-4g7n9" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495745 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce6002bb-4948-45ab-bb1d-ed65e86b6466" volumeName="kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-utilities" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495761 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1534e25-7add-46a1-8f4e-0065c232aa4e" volumeName="kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495779 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d142201-6e77-4828-b86b-05d4144a2f08" volumeName="kubernetes.io/empty-dir/2d142201-6e77-4828-b86b-05d4144a2f08-snapshots" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495796 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" volumeName="kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 04 22:18:56.495823 master-0 kubenswrapper[33572]: I1204 22:18:56.495812 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c6a5d14d-0409-4024-b0a8-200fa2594185" volumeName="kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 
kubenswrapper[33572]: I1204 22:18:56.495828 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d84a7d3-46d1-48e3-83f3-f6b32f16cc76" volumeName="kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495845 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a544105a-5bec-456a-aef6-c160943c1f67" volumeName="kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495862 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a8636bd7-fa9e-44b9-82df-9d37b398736d" volumeName="kubernetes.io/secret/a8636bd7-fa9e-44b9-82df-9d37b398736d-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495878 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c178afcf-b713-4c74-b22b-6169ba3123f5" volumeName="kubernetes.io/configmap/c178afcf-b713-4c74-b22b-6169ba3123f5-service-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495895 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3863c74-8f22-4c67-bef5-2d0d39df4abd" volumeName="kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495911 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="465637a4-42be-4a65-a859-7af699960138" volumeName="kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495928 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d68dcb1-efe4-425f-9b28-1e5575548a32" volumeName="kubernetes.io/projected/4d68dcb1-efe4-425f-9b28-1e5575548a32-kube-api-access-r6vjb" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495945 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="690b447a-19c0-4925-bc9d-d0c86a83a377" volumeName="kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495964 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495981 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7a7f632-2442-4837-b068-c22b03c71fb0" volumeName="kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.495999 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" 
volumeName="kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-trusted-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.496022 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eb4d8477-c3b5-4e88-aaa9-222ad56d974c" volumeName="kubernetes.io/projected/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-kube-api-access-2w8vs" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.496041 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce6002bb-4948-45ab-bb1d-ed65e86b6466" volumeName="kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-catalog-content" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.496060 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e065179e-634a-4cbe-bb59-5b01c514e4de" volumeName="kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.496079 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34cad3de-8f3f-48cd-bd39-8745fad19e65" volumeName="kubernetes.io/projected/34cad3de-8f3f-48cd-bd39-8745fad19e65-kube-api-access-tq55v" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.496097 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="651c0fad-1577-4a7f-8718-ec2fd2f06c3e" volumeName="kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.496113 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d84a7d3-46d1-48e3-83f3-f6b32f16cc76" volumeName="kubernetes.io/projected/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-api-access-b9z4k" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.496136 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" volumeName="kubernetes.io/projected/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-kube-api-access-jc47q" seLinuxMountContext="" Dec 04 22:18:56.496161 master-0 kubenswrapper[33572]: I1204 22:18:56.496153 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bda1cb0d-26cf-4b94-b359-432492112888" volumeName="kubernetes.io/projected/bda1cb0d-26cf-4b94-b359-432492112888-kube-api-access-8r2fn" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496170 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7a7f632-2442-4837-b068-c22b03c71fb0" volumeName="kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496190 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46229484-5fa1-4595-94a0-44477abae90e" volumeName="kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496206 33572 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="651c0fad-1577-4a7f-8718-ec2fd2f06c3e" volumeName="kubernetes.io/projected/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-kube-api-access-n8gh2" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496253 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6684358b-d7a6-4396-9b4f-ea67d85e4517" volumeName="kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496272 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7b2270b-2afc-4bf5-ae1a-5ccf9814657b" volumeName="kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496292 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d142201-6e77-4828-b86b-05d4144a2f08" volumeName="kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496306 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="35821f48-b000-4915-847f-a739b6efc5ee" volumeName="kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496324 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" volumeName="kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496340 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0274dc-fac1-41f9-b3e5-77253d851fdf" volumeName="kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-ca-certs" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496356 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17912746-74eb-4c78-8c1b-2f66e7ce4299" volumeName="kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496373 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" volumeName="kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-client" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496388 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6684358b-d7a6-4396-9b4f-ea67d85e4517" volumeName="kubernetes.io/projected/6684358b-d7a6-4396-9b4f-ea67d85e4517-kube-api-access-qkg8s" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496403 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce6b5a46-172b-4575-ba22-ff3c6ea4207f" 
volumeName="kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-kube-api-access-2cfhv" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496418 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="967bf4ac-f025-4296-8ed9-183a345f6b7c" volumeName="kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496434 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5c2d3b8-41c0-4531-b770-57b7c567fe30" volumeName="kubernetes.io/projected/a5c2d3b8-41c0-4531-b770-57b7c567fe30-kube-api-access-5vpbl" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496452 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a8636bd7-fa9e-44b9-82df-9d37b398736d" volumeName="kubernetes.io/projected/a8636bd7-fa9e-44b9-82df-9d37b398736d-kube-api-access" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496503 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24648a41-875f-4e98-8b21-3bdd38dffa32" volumeName="kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496521 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce6b5a46-172b-4575-ba22-ff3c6ea4207f" volumeName="kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-ca-certs" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496556 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7a7f632-2442-4837-b068-c22b03c71fb0" volumeName="kubernetes.io/projected/e7a7f632-2442-4837-b068-c22b03c71fb0-kube-api-access-c7d9j" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496577 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="465637a4-42be-4a65-a859-7af699960138" volumeName="kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496594 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" volumeName="kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496612 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" volumeName="kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496627 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" volumeName="kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496674 33572 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0beb871c-3bf1-471c-a028-746a650267bf" volumeName="kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496690 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496707 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496722 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="871cb002-67f4-43aa-a41d-7a5b2f340059" volumeName="kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496738 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" volumeName="kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496753 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae107ad4-104c-4264-9844-afb3af28b19e" volumeName="kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-catalog-content" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496769 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0a726f44-a509-46b3-a6d5-70afe3b55e9f" volumeName="kubernetes.io/empty-dir/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-textfile" seLinuxMountContext="" Dec 04 22:18:56.496785 master-0 kubenswrapper[33572]: I1204 22:18:56.496788 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29828f55-427b-4fe3-8713-03bcd6ac9dec" volumeName="kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496806 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" volumeName="kubernetes.io/projected/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-kube-api-access-mwx5k" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496824 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9" volumeName="kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496840 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c178afcf-b713-4c74-b22b-6169ba3123f5" 
volumeName="kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-stats-auth" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496856 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55c4f1e1-1b78-45ec-915d-8055ab3e2786" volumeName="kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496872 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5dac8e25-0f51-4c04-929c-060479689a9d" volumeName="kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496889 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="690b447a-19c0-4925-bc9d-d0c86a83a377" volumeName="kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496906 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3863c74-8f22-4c67-bef5-2d0d39df4abd" volumeName="kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496924 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d" volumeName="kubernetes.io/projected/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-kube-api-access-gpksd" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496940 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46229484-5fa1-4595-94a0-44477abae90e" volumeName="kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496957 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76fd9f44-4365-4271-8772-025655c50334" volumeName="kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496972 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" volumeName="kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.496988 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24648a41-875f-4e98-8b21-3bdd38dffa32" volumeName="kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497002 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d68dcb1-efe4-425f-9b28-1e5575548a32" volumeName="kubernetes.io/secret/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-key" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497078 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="800f436c-145d-4281-8d4d-644ba2cb0ebb" volumeName="kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497100 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" volumeName="kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497119 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74197c50-9a41-40e8-9289-c7e6afbd3737" volumeName="kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497139 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74197c50-9a41-40e8-9289-c7e6afbd3737" volumeName="kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497159 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" volumeName="kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497176 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7b2270b-2afc-4bf5-ae1a-5ccf9814657b" volumeName="kubernetes.io/projected/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-kube-api-access-g2ghk" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497192 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" volumeName="kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-utilities" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497211 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5598683a-cd32-486d-8839-205829d55cc2" volumeName="kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497230 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497249 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c178afcf-b713-4c74-b22b-6169ba3123f5" volumeName="kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-default-certificate" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497266 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0a726f44-a509-46b3-a6d5-70afe3b55e9f" volumeName="kubernetes.io/projected/0a726f44-a509-46b3-a6d5-70afe3b55e9f-kube-api-access-k8jmv" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 
22:18:56.497282 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6684358b-d7a6-4396-9b4f-ea67d85e4517" volumeName="kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497302 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5c2d3b8-41c0-4531-b770-57b7c567fe30" volumeName="kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497319 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d84a7d3-46d1-48e3-83f3-f6b32f16cc76" volumeName="kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497335 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" volumeName="kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497351 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497367 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbb8e73f-7e50-451b-b400-e88a86b51e09" volumeName="kubernetes.io/projected/fbb8e73f-7e50-451b-b400-e88a86b51e09-kube-api-access-c6b4p" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497383 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55c4f1e1-1b78-45ec-915d-8055ab3e2786" volumeName="kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497399 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="810c363b-a4c7-428d-a2fb-285adc29f477" volumeName="kubernetes.io/projected/810c363b-a4c7-428d-a2fb-285adc29f477-kube-api-access-2nrj9" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497415 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="addddaac-a31a-4dbf-b78f-87225b11b463" volumeName="kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497434 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" volumeName="kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-catalog-content" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497451 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="35821f48-b000-4915-847f-a739b6efc5ee" 
volumeName="kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497466 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497485 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa" volumeName="kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497500 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f1534e25-7add-46a1-8f4e-0065c232aa4e" volumeName="kubernetes.io/projected/f1534e25-7add-46a1-8f4e-0065c232aa4e-kube-api-access-4d7pj" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497523 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a043ea49-97f9-4ae6-83b9-733f12754d94" volumeName="kubernetes.io/projected/a043ea49-97f9-4ae6-83b9-733f12754d94-kube-api-access-6jlvp" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497563 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b966c210-5415-4fa5-88ab-c85aba979b28" volumeName="kubernetes.io/secret/b966c210-5415-4fa5-88ab-c85aba979b28-tls-certificates" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497582 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cedb0b3e-674e-40b9-a10d-45a9f0c5c59c" volumeName="kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497598 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="512ba6af-11ad-4217-a1ce-a2ab3ef67ec5" volumeName="kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497612 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55c4f1e1-1b78-45ec-915d-8055ab3e2786" volumeName="kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497632 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55c4f1e1-1b78-45ec-915d-8055ab3e2786" volumeName="kubernetes.io/projected/55c4f1e1-1b78-45ec-915d-8055ab3e2786-kube-api-access-b8g99" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497648 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f091088-2166-4026-9fa6-62bd83407edb" volumeName="kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497664 
33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0274dc-fac1-41f9-b3e5-77253d851fdf" volumeName="kubernetes.io/secret/fb0274dc-fac1-41f9-b3e5-77253d851fdf-catalogserver-certs" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497679 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="800f436c-145d-4281-8d4d-644ba2cb0ebb" volumeName="kubernetes.io/projected/800f436c-145d-4281-8d4d-644ba2cb0ebb-kube-api-access-ngkqz" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497695 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="813f3ee7-35b5-4ee8-b453-00d16d910eae" volumeName="kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497712 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eb4d8477-c3b5-4e88-aaa9-222ad56d974c" volumeName="kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497728 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17912746-74eb-4c78-8c1b-2f66e7ce4299" volumeName="kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497745 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d142201-6e77-4828-b86b-05d4144a2f08" volumeName="kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497763 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" volumeName="kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-encryption-config" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497778 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497794 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="871cb002-67f4-43aa-a41d-7a5b2f340059" volumeName="kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497810 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3863c74-8f22-4c67-bef5-2d0d39df4abd" volumeName="kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497827 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="addddaac-a31a-4dbf-b78f-87225b11b463" 
volumeName="kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497844 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" volumeName="kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497860 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.497837 master-0 kubenswrapper[33572]: I1204 22:18:56.497875 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" volumeName="kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.497891 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" volumeName="kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.497907 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a544105a-5bec-456a-aef6-c160943c1f67" volumeName="kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.497922 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a043ea49-97f9-4ae6-83b9-733f12754d94" volumeName="kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.497939 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" volumeName="kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.497955 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="510a595a-21bf-48fc-85cd-707bc8f5536f" volumeName="kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.497970 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="810c363b-a4c7-428d-a2fb-285adc29f477" volumeName="kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.497985 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76fd9f44-4365-4271-8772-025655c50334" volumeName="kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498000 33572 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76fd9f44-4365-4271-8772-025655c50334" volumeName="kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498015 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="989a73ce-3898-4f65-a437-2c7061f9375f" volumeName="kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-client" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498039 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" volumeName="kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498056 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3863c74-8f22-4c67-bef5-2d0d39df4abd" volumeName="kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498075 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17912746-74eb-4c78-8c1b-2f66e7ce4299" volumeName="kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498092 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="59d3d0d8-1a2a-4d14-8312-d33818acba88" volumeName="kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498107 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76fd9f44-4365-4271-8772-025655c50334" volumeName="kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498123 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="addddaac-a31a-4dbf-b78f-87225b11b463" volumeName="kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498143 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498160 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" volumeName="kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498179 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c1df6-de4d-4e26-8c71-d39311cae0ce" volumeName="kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm" 
seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498195 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" volumeName="kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498209 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb0274dc-fac1-41f9-b3e5-77253d851fdf" volumeName="kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-kube-api-access-r4czl" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498226 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce6002bb-4948-45ab-bb1d-ed65e86b6466" volumeName="kubernetes.io/projected/ce6002bb-4948-45ab-bb1d-ed65e86b6466-kube-api-access-87gv4" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498245 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce6b5a46-172b-4575-ba22-ff3c6ea4207f" volumeName="kubernetes.io/empty-dir/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-cache" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498261 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498279 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9" volumeName="kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498296 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="967bf4ac-f025-4296-8ed9-183a345f6b7c" volumeName="kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498311 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="35821f48-b000-4915-847f-a739b6efc5ee" volumeName="kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498326 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" volumeName="kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-image-import-ca" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498344 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" volumeName="kubernetes.io/projected/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-kube-api-access-bfcv9" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498363 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0beb871c-3bf1-471c-a028-746a650267bf" 
volumeName="kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498378 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c3863c74-8f22-4c67-bef5-2d0d39df4abd" volumeName="kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498392 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="989a73ce-3898-4f65-a437-2c7061f9375f" volumeName="kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-serving-ca" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498408 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" volumeName="kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498424 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29828f55-427b-4fe3-8713-03bcd6ac9dec" volumeName="kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498440 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9" volumeName="kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498458 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f091088-2166-4026-9fa6-62bd83407edb" volumeName="kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498473 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="813f3ee7-35b5-4ee8-b453-00d16d910eae" volumeName="kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498489 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d84a7d3-46d1-48e3-83f3-f6b32f16cc76" volumeName="kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498513 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="989a73ce-3898-4f65-a437-2c7061f9375f" volumeName="kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-trusted-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498529 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5598683a-cd32-486d-8839-205829d55cc2" volumeName="kubernetes.io/projected/5598683a-cd32-486d-8839-205829d55cc2-kube-api-access-s2lwr" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498566 33572 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="ceb419e4-d804-4111-b8d8-8436cc2ee617" volumeName="kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498582 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e" volumeName="kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498596 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebfbb13d-c3f2-476d-bd89-cb8a13d2acee" volumeName="kubernetes.io/projected/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-kube-api-access-xkrvr" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498612 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0beb871c-3bf1-471c-a028-746a650267bf" volumeName="kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498630 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="810c363b-a4c7-428d-a2fb-285adc29f477" volumeName="kubernetes.io/empty-dir/810c363b-a4c7-428d-a2fb-285adc29f477-available-featuregates" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498645 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="989a73ce-3898-4f65-a437-2c7061f9375f" volumeName="kubernetes.io/projected/989a73ce-3898-4f65-a437-2c7061f9375f-kube-api-access-7fmp4" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498660 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0173b8a7-07b4-407a-80b6-d86754072fd8" volumeName="kubernetes.io/projected/0173b8a7-07b4-407a-80b6-d86754072fd8-kube-api-access-z4cdh" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498674 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498689 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d84a7d3-46d1-48e3-83f3-f6b32f16cc76" volumeName="kubernetes.io/empty-dir/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-volume-directive-shadow" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498706 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="989a73ce-3898-4f65-a437-2c7061f9375f" volumeName="kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-audit-policies" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498721 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0a726f44-a509-46b3-a6d5-70afe3b55e9f" 
volumeName="kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498736 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="56f25fad-089d-4df6-abb1-10d4c76750f1" volumeName="kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498752 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d84a7d3-46d1-48e3-83f3-f6b32f16cc76" volumeName="kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498767 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6684358b-d7a6-4396-9b4f-ea67d85e4517" volumeName="kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498785 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0a726f44-a509-46b3-a6d5-70afe3b55e9f" volumeName="kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498800 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="35821f48-b000-4915-847f-a739b6efc5ee" volumeName="kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498814 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46229484-5fa1-4595-94a0-44477abae90e" volumeName="kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498829 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f893663c-7c1e-4eda-9839-99c1c0440304" volumeName="kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.498843 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="690b447a-19c0-4925-bc9d-d0c86a83a377" volumeName="kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499001 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="967bf4ac-f025-4296-8ed9-183a345f6b7c" volumeName="kubernetes.io/projected/967bf4ac-f025-4296-8ed9-183a345f6b7c-kube-api-access-hsk29" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499030 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" volumeName="kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499048 33572 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c8c45e0-2342-499b-aa6b-339b6a722a87" volumeName="kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499066 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0a726f44-a509-46b3-a6d5-70afe3b55e9f" volumeName="kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499082 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c1df6-de4d-4e26-8c71-d39311cae0ce" volumeName="kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499098 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c1df6-de4d-4e26-8c71-d39311cae0ce" volumeName="kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499118 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74197c50-9a41-40e8-9289-c7e6afbd3737" volumeName="kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499138 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eb4d8477-c3b5-4e88-aaa9-222ad56d974c" volumeName="kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499156 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebfbb13d-c3f2-476d-bd89-cb8a13d2acee" volumeName="kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499173 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29828f55-427b-4fe3-8713-03bcd6ac9dec" volumeName="kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499190 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" volumeName="kubernetes.io/projected/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026-kube-api-access-vd6d8" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499209 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" volumeName="kubernetes.io/projected/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-kube-api-access-pt2jq" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499225 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="989a73ce-3898-4f65-a437-2c7061f9375f" 
volumeName="kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499241 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebfbb13d-c3f2-476d-bd89-cb8a13d2acee" volumeName="kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499256 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbb8e73f-7e50-451b-b400-e88a86b51e09" volumeName="kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-tuned" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499276 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d68dcb1-efe4-425f-9b28-1e5575548a32" volumeName="kubernetes.io/configmap/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-cabundle" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499295 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" volumeName="kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499318 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cedb0b3e-674e-40b9-a10d-45a9f0c5c59c" volumeName="kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499334 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="512ba6af-11ad-4217-a1ce-a2ab3ef67ec5" volumeName="kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499351 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c8c45e0-2342-499b-aa6b-339b6a722a87" volumeName="kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499368 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="800f436c-145d-4281-8d4d-644ba2cb0ebb" volumeName="kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499385 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="989a73ce-3898-4f65-a437-2c7061f9375f" volumeName="kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-encryption-config" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499401 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d142201-6e77-4828-b86b-05d4144a2f08" volumeName="kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499415 33572 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" volumeName="kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-serving-ca" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499430 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="634c1df6-de4d-4e26-8c71-d39311cae0ce" volumeName="kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499446 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5c2d3b8-41c0-4531-b770-57b7c567fe30" volumeName="kubernetes.io/configmap/a5c2d3b8-41c0-4531-b770-57b7c567fe30-config-volume" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499461 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c178afcf-b713-4c74-b22b-6169ba3123f5" volumeName="kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-metrics-certs" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499478 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbb8e73f-7e50-451b-b400-e88a86b51e09" volumeName="kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-tmp" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499493 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e065179e-634a-4cbe-bb59-5b01c514e4de" volumeName="kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499515 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e37d318a-5bf8-46ed-b6de-494102738da7" volumeName="kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499531 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c8c45e0-2342-499b-aa6b-339b6a722a87" volumeName="kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499585 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae107ad4-104c-4264-9844-afb3af28b19e" volumeName="kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-utilities" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499602 33572 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c2279404-fa75-4de2-a302-d7b15ead5232" volumeName="kubernetes.io/projected/c2279404-fa75-4de2-a302-d7b15ead5232-kube-api-access-dd5zx" seLinuxMountContext="" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499618 33572 reconstruct.go:97] "Volume reconstruction finished" Dec 04 22:18:56.499578 master-0 kubenswrapper[33572]: I1204 22:18:56.499629 33572 reconciler.go:26] "Reconciler: start to sync state" Dec 04 22:18:56.504274 master-0 
kubenswrapper[33572]: I1204 22:18:56.504240 33572 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 04 22:18:56.519200 master-0 kubenswrapper[33572]: I1204 22:18:56.519002 33572 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 04 22:18:56.523610 master-0 kubenswrapper[33572]: I1204 22:18:56.523522 33572 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 04 22:18:56.523681 master-0 kubenswrapper[33572]: I1204 22:18:56.523648 33572 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 04 22:18:56.523713 master-0 kubenswrapper[33572]: I1204 22:18:56.523690 33572 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 04 22:18:56.524196 master-0 kubenswrapper[33572]: E1204 22:18:56.523805 33572 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 04 22:18:56.526392 master-0 kubenswrapper[33572]: I1204 22:18:56.526236 33572 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 04 22:18:56.537814 master-0 kubenswrapper[33572]: I1204 22:18:56.537729 33572 generic.go:334] "Generic (PLEG): container finished" podID="7f091088-2166-4026-9fa6-62bd83407edb" containerID="437ce0db468372672eda2ac00bd5f2a8af4827f3a8e23b48967061bc95032bfb" exitCode=0
Dec 04 22:18:56.542076 master-0 kubenswrapper[33572]: I1204 22:18:56.541955 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log"
Dec 04 22:18:56.542639 master-0 kubenswrapper[33572]: I1204 22:18:56.542501 33572 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d" exitCode=1
Dec 04 22:18:56.542639 master-0 kubenswrapper[33572]: I1204 22:18:56.542555 33572 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="7ab8b346978ad6f1cf331a2cd2e464eb81897737f06c530111b331aae07ed9d5" exitCode=0
Dec 04 22:18:56.545273 master-0 kubenswrapper[33572]: I1204 22:18:56.545227 33572 generic.go:334] "Generic (PLEG): container finished" podID="c6a5d14d-0409-4024-b0a8-200fa2594185" containerID="5553ac89bf95798ab2decbb87aaa4e3e8d835fcf542a4ada2023ac05d60471d4" exitCode=0
Dec 04 22:18:56.556511 master-0 kubenswrapper[33572]: I1204 22:18:56.556465 33572 generic.go:334] "Generic (PLEG): container finished" podID="59d3d0d8-1a2a-4d14-8312-d33818acba88" containerID="d14cbc85e41a76d9831e3cb322a42ef6928588924655708cdbc5b0d0983944d9" exitCode=0
Dec 04 22:18:56.561876 master-0 kubenswrapper[33572]: I1204 22:18:56.561834 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-88d48b57d-pp4fd_74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/machine-api-operator/0.log"
Dec 04 22:18:56.562461 master-0 kubenswrapper[33572]: I1204 22:18:56.562397 33572 generic.go:334] "Generic (PLEG): container finished" podID="74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9" containerID="6d31ad2a1f5237b4355ed2f39e4e13656076c4a85f80a08d5d712a1a6ab75238" exitCode=255
Dec 04 22:18:56.565938 master-0 kubenswrapper[33572]: I1204 22:18:56.565917 33572 generic.go:334] "Generic (PLEG): container finished" podID="a043ea49-97f9-4ae6-83b9-733f12754d94" containerID="3e05aae4893276e8afe69a36c63278b1771095172b32eb2660a3d4bf0f266404"
exitCode=0 Dec 04 22:18:56.568739 master-0 kubenswrapper[33572]: I1204 22:18:56.568723 33572 generic.go:334] "Generic (PLEG): container finished" podID="f893663c-7c1e-4eda-9839-99c1c0440304" containerID="4325f17544835c85c6a52e1b1f681caef960ed59819852b48ea3b6353d61e1b5" exitCode=0 Dec 04 22:18:56.580283 master-0 kubenswrapper[33572]: I1204 22:18:56.580197 33572 generic.go:334] "Generic (PLEG): container finished" podID="4d68dcb1-efe4-425f-9b28-1e5575548a32" containerID="2f8f422694aa4bc57d4ecc64211f7f799287be11acc20da48bbd2da04f761575" exitCode=0 Dec 04 22:18:56.585329 master-0 kubenswrapper[33572]: I1204 22:18:56.585285 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-nk92d_634c1df6-de4d-4e26-8c71-d39311cae0ce/approver/1.log" Dec 04 22:18:56.585907 master-0 kubenswrapper[33572]: I1204 22:18:56.585838 33572 generic.go:334] "Generic (PLEG): container finished" podID="634c1df6-de4d-4e26-8c71-d39311cae0ce" containerID="a1cf3561aceffd368c0ca4d9cb40d000ada9182cc1eeba5246056ae555e8f11e" exitCode=1 Dec 04 22:18:56.593891 master-0 kubenswrapper[33572]: I1204 22:18:56.593768 33572 generic.go:334] "Generic (PLEG): container finished" podID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerID="1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8" exitCode=0 Dec 04 22:18:56.597657 master-0 kubenswrapper[33572]: I1204 22:18:56.597600 33572 generic.go:334] "Generic (PLEG): container finished" podID="3f6d05b8-b7b4-4b2d-ace0-d1f59035d161" containerID="7d1cd57ab4d6b616f66a493d64b829158d3913e17906249bebc8057db4a21035" exitCode=0 Dec 04 22:18:56.600638 master-0 kubenswrapper[33572]: I1204 22:18:56.600593 33572 generic.go:334] "Generic (PLEG): container finished" podID="35821f48-b000-4915-847f-a739b6efc5ee" containerID="50388f7f0e80879ab901f5d4bca3a6dc9ea1a39437e9578943365b4294d8b25d" exitCode=0 Dec 04 22:18:56.610353 master-0 kubenswrapper[33572]: I1204 22:18:56.610311 33572 generic.go:334] "Generic (PLEG): container finished" podID="46229484-5fa1-4595-94a0-44477abae90e" containerID="43708b3c1f0fc23d49f5b68e72b2abf6c84e81e5b8fe673a0fccaff92e14b81d" exitCode=0 Dec 04 22:18:56.614313 master-0 kubenswrapper[33572]: I1204 22:18:56.614295 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-74d9cbffbc-nzqgx_5dac8e25-0f51-4c04-929c-060479689a9d/machine-approver-controller/0.log" Dec 04 22:18:56.619157 master-0 kubenswrapper[33572]: I1204 22:18:56.619120 33572 generic.go:334] "Generic (PLEG): container finished" podID="5dac8e25-0f51-4c04-929c-060479689a9d" containerID="028e28ae583843cf573a020510b23f29bb89888d81cbcae42e7af0446b2d3e61" exitCode=255 Dec 04 22:18:56.624515 master-0 kubenswrapper[33572]: E1204 22:18:56.624301 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 04 22:18:56.625735 master-0 kubenswrapper[33572]: I1204 22:18:56.625681 33572 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="758bcdf683109d822a1017f454c5645fc9f981b1015625c2d5ef493072ef4678" exitCode=0 Dec 04 22:18:56.625735 master-0 kubenswrapper[33572]: I1204 22:18:56.625720 33572 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="3574d633e7308db3b6dd662bd037451e5d0ed5c34c61a73c66397c77d3caf66e" exitCode=0 Dec 04 22:18:56.625816 master-0 kubenswrapper[33572]: I1204 22:18:56.625728 33572 generic.go:334] "Generic 
(PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="189119b91f6e6ef0f62e51f0cc69d03fbbc0144ce142853e62f56609d2029b1d" exitCode=0 Dec 04 22:18:56.625972 master-0 kubenswrapper[33572]: I1204 22:18:56.625863 33572 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="eade6c63cfbfd85793c4e11745edd4d5a786bcef37074f29af89908e936863d7" exitCode=0 Dec 04 22:18:56.626028 master-0 kubenswrapper[33572]: I1204 22:18:56.625994 33572 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="3903951768e93b52af44e2ee6090549f67bc30f2eeffd34acda2b5e56323b0df" exitCode=0 Dec 04 22:18:56.626107 master-0 kubenswrapper[33572]: I1204 22:18:56.626075 33572 generic.go:334] "Generic (PLEG): container finished" podID="76fd9f44-4365-4271-8772-025655c50334" containerID="20fdbd8f60e4052a44e37c80c735da9d3ff66c7350cb568fd169c055622f648f" exitCode=0 Dec 04 22:18:56.638421 master-0 kubenswrapper[33572]: I1204 22:18:56.638367 33572 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="d9582fc250da782a33466d4d52e589af275376f87d9bdf03fa1cb11c7d23524e" exitCode=0 Dec 04 22:18:56.638421 master-0 kubenswrapper[33572]: I1204 22:18:56.638409 33572 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="ff5921da732d05f72d82c0539e6f4661b6512a68939a31fe1f83fa7bbd8cf1d4" exitCode=0 Dec 04 22:18:56.638421 master-0 kubenswrapper[33572]: I1204 22:18:56.638421 33572 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="553d3584b8fff905a7e34ad91d98d0f31e54579b68090cae0d50c0891bc22dd5" exitCode=0 Dec 04 22:18:56.641135 master-0 kubenswrapper[33572]: I1204 22:18:56.641074 33572 generic.go:334] "Generic (PLEG): container finished" podID="800f436c-145d-4281-8d4d-644ba2cb0ebb" containerID="66fa513342b7f47d4f807e0f29b5398451337b70d0aea1cac07ea70f754f3d14" exitCode=0 Dec 04 22:18:56.643819 master-0 kubenswrapper[33572]: I1204 22:18:56.643785 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-8lq7w_871cb002-67f4-43aa-a41d-7a5b2f340059/network-operator/0.log" Dec 04 22:18:56.643918 master-0 kubenswrapper[33572]: I1204 22:18:56.643830 33572 generic.go:334] "Generic (PLEG): container finished" podID="871cb002-67f4-43aa-a41d-7a5b2f340059" containerID="9d7fd4b64c7f9d10b43359385a6360e49aa71c5085c781ef53642cd82a85d004" exitCode=255 Dec 04 22:18:56.647082 master-0 kubenswrapper[33572]: I1204 22:18:56.647022 33572 generic.go:334] "Generic (PLEG): container finished" podID="2cb8c983acca0c27a191b3f720d4b1e0" containerID="509e3ce53fc945130075276f6099e96d73baf21a6fcaddff5d395b3b94de9c58" exitCode=0 Dec 04 22:18:56.657885 master-0 kubenswrapper[33572]: I1204 22:18:56.657730 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-nxbjw_ce6b5a46-172b-4575-ba22-ff3c6ea4207f/manager/1.log" Dec 04 22:18:56.658739 master-0 kubenswrapper[33572]: I1204 22:18:56.658668 33572 generic.go:334] "Generic (PLEG): container finished" podID="ce6b5a46-172b-4575-ba22-ff3c6ea4207f" containerID="fe4d171382cafc8367d04ba38562e53607a288ace82dafd0c07ed366d2a1ef56" exitCode=1 Dec 04 22:18:56.660497 master-0 kubenswrapper[33572]: I1204 22:18:56.660427 33572 generic.go:334] "Generic (PLEG): container finished" 
podID="3099f7b5-f904-4d15-aedb-f4e558b813e4" containerID="f9af7ae05881c66c990776ea5e9ecae6917372ad2e83deed7c505b583fa9da46" exitCode=0 Dec 04 22:18:56.667580 master-0 kubenswrapper[33572]: I1204 22:18:56.667540 33572 generic.go:334] "Generic (PLEG): container finished" podID="8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb" containerID="446458854f272c65918d3eef29e63c52aea4a45ba36f434208f291e2e3410da7" exitCode=0 Dec 04 22:18:56.670479 master-0 kubenswrapper[33572]: I1204 22:18:56.670402 33572 generic.go:334] "Generic (PLEG): container finished" podID="690b447a-19c0-4925-bc9d-d0c86a83a377" containerID="a2b4230e9757af974dea8f15ae2efeb4c125ff55b0c9cdc6e359f1aae71c8941" exitCode=0 Dec 04 22:18:56.674195 master-0 kubenswrapper[33572]: I1204 22:18:56.674159 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4b9fbd90-66d5-4637-9821-22242aa6f6d7/installer/0.log" Dec 04 22:18:56.674307 master-0 kubenswrapper[33572]: I1204 22:18:56.674201 33572 generic.go:334] "Generic (PLEG): container finished" podID="4b9fbd90-66d5-4637-9821-22242aa6f6d7" containerID="05ebda65d53028c7345257866dac633a27c8894eb475430d761e1c0a053ea020" exitCode=1 Dec 04 22:18:56.687823 master-0 kubenswrapper[33572]: I1204 22:18:56.687745 33572 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f" exitCode=0 Dec 04 22:18:56.693831 master-0 kubenswrapper[33572]: I1204 22:18:56.693782 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-5f49d774cd-5m4l9_5598683a-cd32-486d-8839-205829d55cc2/cluster-autoscaler-operator/0.log" Dec 04 22:18:56.694358 master-0 kubenswrapper[33572]: I1204 22:18:56.694310 33572 generic.go:334] "Generic (PLEG): container finished" podID="5598683a-cd32-486d-8839-205829d55cc2" containerID="922ec5ad22c0f758ca0c6af6881b85724b616bc9bf1514cfd7b12d47fc0ff553" exitCode=255 Dec 04 22:18:56.696493 master-0 kubenswrapper[33572]: I1204 22:18:56.696458 33572 generic.go:334] "Generic (PLEG): container finished" podID="986a4de7-3a54-48dc-9599-49cf19ba0ad5" containerID="42c592fcd97dd09de62f2c07d511eb1f7fedb875ba01c50a76d8a639e15849ae" exitCode=0 Dec 04 22:18:56.701092 master-0 kubenswrapper[33572]: I1204 22:18:56.701045 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-85cff47f46-4dv2b_0beb871c-3bf1-471c-a028-746a650267bf/cluster-node-tuning-operator/0.log" Dec 04 22:18:56.701248 master-0 kubenswrapper[33572]: I1204 22:18:56.701116 33572 generic.go:334] "Generic (PLEG): container finished" podID="0beb871c-3bf1-471c-a028-746a650267bf" containerID="3319d82050d7163f2a7f96d02081f2908fac76018c64e251ebfd0f4e73ccabfd" exitCode=1 Dec 04 22:18:56.703676 master-0 kubenswrapper[33572]: I1204 22:18:56.703638 33572 generic.go:334] "Generic (PLEG): container finished" podID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerID="1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3" exitCode=0 Dec 04 22:18:56.706100 master-0 kubenswrapper[33572]: I1204 22:18:56.706047 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/4.log" Dec 04 22:18:56.706185 master-0 kubenswrapper[33572]: I1204 22:18:56.706140 33572 generic.go:334] "Generic (PLEG): container finished" 
podID="4f22eee4-a42d-4d2b-bffa-6c3f29f1f026" containerID="27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780" exitCode=1 Dec 04 22:18:56.708167 master-0 kubenswrapper[33572]: I1204 22:18:56.708126 33572 generic.go:334] "Generic (PLEG): container finished" podID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" containerID="f70a0cabfa84fd6dac7eab4d978f050d5e781f995d7f4f93a12a51cc9706d0d9" exitCode=0 Dec 04 22:18:56.710252 master-0 kubenswrapper[33572]: I1204 22:18:56.710203 33572 generic.go:334] "Generic (PLEG): container finished" podID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" containerID="00713a1c06d69e4187d092bf84b0d17670a9eda7c3ce1307b7efa35d4e53871c" exitCode=0 Dec 04 22:18:56.712082 master-0 kubenswrapper[33572]: I1204 22:18:56.712047 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba/installer/0.log" Dec 04 22:18:56.712160 master-0 kubenswrapper[33572]: I1204 22:18:56.712095 33572 generic.go:334] "Generic (PLEG): container finished" podID="3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" containerID="7f95f72da52c53d3c8d88cdae7b632b1e707bccffe42c9e45b84331a1108d0c6" exitCode=1 Dec 04 22:18:56.714377 master-0 kubenswrapper[33572]: I1204 22:18:56.714339 33572 generic.go:334] "Generic (PLEG): container finished" podID="989a73ce-3898-4f65-a437-2c7061f9375f" containerID="e1ab85fa23f372e6c12039f42a8215b4ecb7099a306302bdcd4c1624786fb3f7" exitCode=0 Dec 04 22:18:56.717805 master-0 kubenswrapper[33572]: I1204 22:18:56.717774 33572 generic.go:334] "Generic (PLEG): container finished" podID="5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141" containerID="2e48ce38bedd0ef286f4eb2d0319a994f1a61d767e4cb03996c6448334d88c07" exitCode=0 Dec 04 22:18:56.720057 master-0 kubenswrapper[33572]: I1204 22:18:56.720013 33572 generic.go:334] "Generic (PLEG): container finished" podID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerID="577801a549fb8e8b80c730f6cb1e1c0076264ab71677f9e7afd6abe1e4f77036" exitCode=0 Dec 04 22:18:56.724366 master-0 kubenswrapper[33572]: I1204 22:18:56.724342 33572 generic.go:334] "Generic (PLEG): container finished" podID="810c363b-a4c7-428d-a2fb-285adc29f477" containerID="cc16e00745629c42a3771375fb21884cb52774d57fe0324c10a8680dd9a3742b" exitCode=0 Dec 04 22:18:56.724366 master-0 kubenswrapper[33572]: I1204 22:18:56.724359 33572 generic.go:334] "Generic (PLEG): container finished" podID="810c363b-a4c7-428d-a2fb-285adc29f477" containerID="b8a9c51a67f38c6ea4afbc1a4b2e8c17d0b815c4b55531281069c19c0fd8cfa9" exitCode=0 Dec 04 22:18:56.728814 master-0 kubenswrapper[33572]: I1204 22:18:56.728758 33572 generic.go:334] "Generic (PLEG): container finished" podID="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" containerID="ebceb6eb636a1f740136f2a1db4a9178448d55ff6db47b35ebd00354ae58e8f7" exitCode=0 Dec 04 22:18:56.728910 master-0 kubenswrapper[33572]: I1204 22:18:56.728815 33572 generic.go:334] "Generic (PLEG): container finished" podID="2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae" containerID="95541d3029d5588838c47cb8939ee7fe2e3c3f04da641f8f8e31b33c2e5cfb73" exitCode=0 Dec 04 22:18:56.731493 master-0 kubenswrapper[33572]: I1204 22:18:56.731456 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-67477646d4-bslb5_813f3ee7-35b5-4ee8-b453-00d16d910eae/package-server-manager/0.log" Dec 04 22:18:56.732100 master-0 kubenswrapper[33572]: I1204 22:18:56.732066 33572 generic.go:334] "Generic (PLEG): container finished" podID="813f3ee7-35b5-4ee8-b453-00d16d910eae" 
containerID="f11072c38e40de60dafeffc2c5ef9e1780820ca0ce672700aba155fa414fe72c" exitCode=1 Dec 04 22:18:56.740623 master-0 kubenswrapper[33572]: I1204 22:18:56.740552 33572 generic.go:334] "Generic (PLEG): container finished" podID="ce6002bb-4948-45ab-bb1d-ed65e86b6466" containerID="c2ad6d2719e3800fef2a35a9686c68acbf17ddb950d85a4469689ef746cce44d" exitCode=0 Dec 04 22:18:56.740623 master-0 kubenswrapper[33572]: I1204 22:18:56.740596 33572 generic.go:334] "Generic (PLEG): container finished" podID="ce6002bb-4948-45ab-bb1d-ed65e86b6466" containerID="0323783c48e18783d0f18adc0e52bb623413c80d32bdfc761472fc94945f10bc" exitCode=0 Dec 04 22:18:56.743947 master-0 kubenswrapper[33572]: I1204 22:18:56.743824 33572 generic.go:334] "Generic (PLEG): container finished" podID="55c4f1e1-1b78-45ec-915d-8055ab3e2786" containerID="1326348e3e2d8bc1da0a2a933c011acb7d00a92e48a2890745437b6f15960271" exitCode=0 Dec 04 22:18:56.746685 master-0 kubenswrapper[33572]: I1204 22:18:56.746650 33572 generic.go:334] "Generic (PLEG): container finished" podID="ceb419e4-d804-4111-b8d8-8436cc2ee617" containerID="255af5caa519126130f0822a951a744700ae3fbbe4597a788a35633eb402cf2a" exitCode=0 Dec 04 22:18:56.749160 master-0 kubenswrapper[33572]: I1204 22:18:56.749133 33572 generic.go:334] "Generic (PLEG): container finished" podID="e37d318a-5bf8-46ed-b6de-494102738da7" containerID="79e1b635bb095edbd66388094922c6134f767f1a2efc7b3eca6e45abd8f571c6" exitCode=0 Dec 04 22:18:56.752880 master-0 kubenswrapper[33572]: I1204 22:18:56.752853 33572 generic.go:334] "Generic (PLEG): container finished" podID="a8636bd7-fa9e-44b9-82df-9d37b398736d" containerID="74597696ddcec56d10e68ca1d29cef5d1bfa40762646f7e9dc8729a8c66636fd" exitCode=0 Dec 04 22:18:56.755169 master-0 kubenswrapper[33572]: I1204 22:18:56.755145 33572 generic.go:334] "Generic (PLEG): container finished" podID="6e011b0a-89e2-47e3-9112-d46a828416b1" containerID="81f5bc53e7bd37d1c3167c411f68ef8d2e2f1eae21a167bd8c740d425e144c3a" exitCode=0 Dec 04 22:18:56.762561 master-0 kubenswrapper[33572]: I1204 22:18:56.762190 33572 generic.go:334] "Generic (PLEG): container finished" podID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerID="5f1c65cf31bac9c169d2527d0952f6b2fb651f148801aa43c79ceb4a8adb4da6" exitCode=0 Dec 04 22:18:56.762561 master-0 kubenswrapper[33572]: I1204 22:18:56.762250 33572 generic.go:334] "Generic (PLEG): container finished" podID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerID="5b29db78fe5a1942ea20ecc7d711d841b8eb39751995722550ca54e6750f1a0c" exitCode=0 Dec 04 22:18:56.768144 master-0 kubenswrapper[33572]: I1204 22:18:56.768099 33572 generic.go:334] "Generic (PLEG): container finished" podID="dbe54b09-0399-4fbe-9f84-dd9dede0ab96" containerID="5a4d99a6b7149fd4133c1e3efcfd35582ffcb1582acaa62e903eb008119e1624" exitCode=0 Dec 04 22:18:56.776453 master-0 kubenswrapper[33572]: I1204 22:18:56.775551 33572 generic.go:334] "Generic (PLEG): container finished" podID="56f25fad-089d-4df6-abb1-10d4c76750f1" containerID="d8a2de466dc95e948ba536210f040992057ba7bc222a8102fb88249ab34f040a" exitCode=0 Dec 04 22:18:56.781377 master-0 kubenswrapper[33572]: I1204 22:18:56.781338 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/1.log" Dec 04 22:18:56.781477 master-0 kubenswrapper[33572]: I1204 22:18:56.781388 33572 generic.go:334] "Generic (PLEG): container finished" podID="f1534e25-7add-46a1-8f4e-0065c232aa4e" 
containerID="893b1713daa62af184411a3e5a2cfc2bd5735ca25c31751314839bd533678913" exitCode=1 Dec 04 22:18:56.783332 master-0 kubenswrapper[33572]: I1204 22:18:56.783281 33572 generic.go:334] "Generic (PLEG): container finished" podID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerID="719d3f66cbdb2170aefa60d42b234f7eb81fd7d5f45e585cd2b86f0e36930c80" exitCode=0 Dec 04 22:18:56.785502 master-0 kubenswrapper[33572]: I1204 22:18:56.785466 33572 generic.go:334] "Generic (PLEG): container finished" podID="0a726f44-a509-46b3-a6d5-70afe3b55e9f" containerID="b1d3f0ea9fb633db12f795b3c197259244e72196814e421d282a1fe412cb79f2" exitCode=0 Dec 04 22:18:56.788552 master-0 kubenswrapper[33572]: I1204 22:18:56.788527 33572 generic.go:334] "Generic (PLEG): container finished" podID="465637a4-42be-4a65-a859-7af699960138" containerID="206992e5a976be25c0ca246941e52ef047963087cb7fb3d7fae48784f22c1968" exitCode=0 Dec 04 22:18:56.788552 master-0 kubenswrapper[33572]: I1204 22:18:56.788547 33572 generic.go:334] "Generic (PLEG): container finished" podID="465637a4-42be-4a65-a859-7af699960138" containerID="f11190eeabf32ca439cc6dbf2e5f945ac6892b6b5bf3d933639699117a6a4cbd" exitCode=0 Dec 04 22:18:56.788680 master-0 kubenswrapper[33572]: I1204 22:18:56.788555 33572 generic.go:334] "Generic (PLEG): container finished" podID="465637a4-42be-4a65-a859-7af699960138" containerID="8bd59644ccf9cb7c047ca7a95b61cb37f033530818fb51a36548a6089157cac2" exitCode=0 Dec 04 22:18:56.791046 master-0 kubenswrapper[33572]: I1204 22:18:56.791012 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/1.log" Dec 04 22:18:56.792075 master-0 kubenswrapper[33572]: I1204 22:18:56.792052 33572 generic.go:334] "Generic (PLEG): container finished" podID="fb0274dc-fac1-41f9-b3e5-77253d851fdf" containerID="818c6aaeff094c3713e384dec4d55a28c1f228a4e98ba130afd94743d45d288f" exitCode=1 Dec 04 22:18:56.797909 master-0 kubenswrapper[33572]: I1204 22:18:56.797880 33572 generic.go:334] "Generic (PLEG): container finished" podID="0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" containerID="4c4fa6995a939a53e102917b86fbd0f10791e85887df9e375f44a27329f6b171" exitCode=0 Dec 04 22:18:56.813225 master-0 kubenswrapper[33572]: I1204 22:18:56.813142 33572 generic.go:334] "Generic (PLEG): container finished" podID="da6da420-9631-4bce-b238-96ab361e23e9" containerID="d281c21cd4e6a5bdaa904d1cec96c25970563004aa9943074804eafd85bd5f1e" exitCode=0 Dec 04 22:18:56.818851 master-0 kubenswrapper[33572]: I1204 22:18:56.818812 33572 generic.go:334] "Generic (PLEG): container finished" podID="ae107ad4-104c-4264-9844-afb3af28b19e" containerID="e77c322db09ee028391834636928860ad589dd50d5763a9eb98bf7d157a2104d" exitCode=0 Dec 04 22:18:56.818937 master-0 kubenswrapper[33572]: I1204 22:18:56.818855 33572 generic.go:334] "Generic (PLEG): container finished" podID="ae107ad4-104c-4264-9844-afb3af28b19e" containerID="54a32de727a29737d3f9e1ca99dbe42daef248c481ccfc250f9a1754750f20c0" exitCode=0 Dec 04 22:18:56.822348 master-0 kubenswrapper[33572]: I1204 22:18:56.822310 33572 generic.go:334] "Generic (PLEG): container finished" podID="e065179e-634a-4cbe-bb59-5b01c514e4de" containerID="aec30a53010adc6ee6176e40e860c2639cbdf974b27b2d24e1d71f75f8a5c427" exitCode=0 Dec 04 22:18:56.824671 master-0 kubenswrapper[33572]: E1204 22:18:56.824628 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 04 22:18:56.825339 master-0 
kubenswrapper[33572]: I1204 22:18:56.825304 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/2.log" Dec 04 22:18:56.826051 master-0 kubenswrapper[33572]: I1204 22:18:56.825997 33572 generic.go:334] "Generic (PLEG): container finished" podID="a3899a38-39b8-4b48-81e5-4d8854ecc8ab" containerID="45f93308614301d84cb0176bf1dabc3de4bbaffff580b91a3cb5db6707e27be7" exitCode=1 Dec 04 22:18:56.828773 master-0 kubenswrapper[33572]: I1204 22:18:56.828730 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/5.log" Dec 04 22:18:56.831836 master-0 kubenswrapper[33572]: I1204 22:18:56.831775 33572 generic.go:334] "Generic (PLEG): container finished" podID="addddaac-a31a-4dbf-b78f-87225b11b463" containerID="35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e" exitCode=1 Dec 04 22:18:56.836243 master-0 kubenswrapper[33572]: I1204 22:18:56.836185 33572 generic.go:334] "Generic (PLEG): container finished" podID="ebfbb13d-c3f2-476d-bd89-cb8a13d2acee" containerID="989d170e7a52318dc012c9f9d9615ad932d427b1e6a54fccb0b6d83f4ebb7d45" exitCode=0 Dec 04 22:18:56.838513 master-0 kubenswrapper[33572]: I1204 22:18:56.838475 33572 generic.go:334] "Generic (PLEG): container finished" podID="2d142201-6e77-4828-b86b-05d4144a2f08" containerID="1129d1c5176ef3c828bda41dc553996cb75881e0c9229783b32fa908eaa25ec0" exitCode=0 Dec 04 22:18:56.843948 master-0 kubenswrapper[33572]: I1204 22:18:56.843761 33572 generic.go:334] "Generic (PLEG): container finished" podID="24648a41-875f-4e98-8b21-3bdd38dffa32" containerID="0e28b0cb43f14ec5571ccac1eddb3ba4fa32d2d2a461323c61e5fc655cd04d29" exitCode=0 Dec 04 22:18:56.856240 master-0 kubenswrapper[33572]: I1204 22:18:56.856186 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/config-sync-controllers/0.log" Dec 04 22:18:56.856898 master-0 kubenswrapper[33572]: I1204 22:18:56.856861 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/cluster-cloud-controller-manager/0.log" Dec 04 22:18:56.856952 master-0 kubenswrapper[33572]: I1204 22:18:56.856918 33572 generic.go:334] "Generic (PLEG): container finished" podID="74197c50-9a41-40e8-9289-c7e6afbd3737" containerID="d4cd669f8e4bd3008a9642035eec139a13bd3a586fb003b3adb4d948956c28f6" exitCode=1 Dec 04 22:18:56.856952 master-0 kubenswrapper[33572]: I1204 22:18:56.856943 33572 generic.go:334] "Generic (PLEG): container finished" podID="74197c50-9a41-40e8-9289-c7e6afbd3737" containerID="d7411bec11d115b15a691f3d3010646ecd1f289830d7539f76d0467f6cd83226" exitCode=1 Dec 04 22:18:56.859131 master-0 kubenswrapper[33572]: I1204 22:18:56.859100 33572 generic.go:334] "Generic (PLEG): container finished" podID="a544105a-5bec-456a-aef6-c160943c1f67" containerID="419aa165bec1fc028d5393e03a6724568d6a2d80fb3a00accfe0a6d847f186e7" exitCode=0 Dec 04 22:18:57.225701 master-0 kubenswrapper[33572]: E1204 22:18:57.225568 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 04 22:18:57.439088 master-0 kubenswrapper[33572]: 
I1204 22:18:57.438997 33572 apiserver.go:52] "Watching apiserver"
Dec 04 22:18:57.465080 master-0 kubenswrapper[33572]: I1204 22:18:57.464983 33572 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 04 22:18:58.026682 master-0 kubenswrapper[33572]: E1204 22:18:58.026602 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:18:59.627685 master-0 kubenswrapper[33572]: E1204 22:18:59.627596 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:19:00.896680 master-0 kubenswrapper[33572]: I1204 22:19:00.896557 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-check-endpoints/0.log"
Dec 04 22:19:00.898522 master-0 kubenswrapper[33572]: I1204 22:19:00.898442 33572 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="5fcdcfec6584d4b81237f292d9431a4c2eccc32a97cb24f1ad54b8dd23d476bb" exitCode=255
Dec 04 22:19:02.827920 master-0 kubenswrapper[33572]: E1204 22:19:02.827811 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:19:07.828346 master-0 kubenswrapper[33572]: E1204 22:19:07.828250 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:19:12.829635 master-0 kubenswrapper[33572]: E1204 22:19:12.829466 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:19:17.830358 master-0 kubenswrapper[33572]: E1204 22:19:17.830261 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:19:22.831489 master-0 kubenswrapper[33572]: E1204 22:19:22.831388 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:19:27.832634 master-0 kubenswrapper[33572]: E1204 22:19:27.832541 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:19:32.832858 master-0 kubenswrapper[33572]: E1204 22:19:32.832740 33572 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 22:19:34.223787 master-0 kubenswrapper[33572]: I1204 22:19:34.223744 33572 generic.go:334] "Generic (PLEG): container finished" podID="c178afcf-b713-4c74-b22b-6169ba3123f5" containerID="0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0" exitCode=0
Dec 04 22:19:34.288979 master-0 kubenswrapper[33572]: I1204 22:19:34.287955 33572 manager.go:324] Recovery completed
Dec 04 22:19:34.376061 master-0 kubenswrapper[33572]: I1204 22:19:34.375427 33572 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 04 22:19:34.376061 master-0 kubenswrapper[33572]: I1204 22:19:34.375476 33572 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 04 22:19:34.376061 master-0 kubenswrapper[33572]: I1204 22:19:34.375543 33572 state_mem.go:36] "Initialized new in-memory state store"
Dec 04 22:19:34.376061 master-0 kubenswrapper[33572]: I1204 22:19:34.375845 33572 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Dec 04 22:19:34.376061 master-0 kubenswrapper[33572]: I1204 22:19:34.375859 33572 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Dec 04 22:19:34.376061 master-0 kubenswrapper[33572]: I1204 22:19:34.375895 33572 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Dec 04 22:19:34.376061 master-0 kubenswrapper[33572]: I1204 22:19:34.375902 33572 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Dec 04 22:19:34.376061 master-0 kubenswrapper[33572]: I1204 22:19:34.375909 33572 policy_none.go:49] "None policy: Start"
Dec 04 22:19:34.384544 master-0 kubenswrapper[33572]: I1204 22:19:34.384476 33572 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 04 22:19:34.384616 master-0 kubenswrapper[33572]: I1204 22:19:34.384584 33572 state_mem.go:35] "Initializing new in-memory state store"
Dec 04 22:19:34.384971 master-0 kubenswrapper[33572]: I1204 22:19:34.384943 33572 state_mem.go:75] "Updated machine memory state"
Dec 04 22:19:34.384971 master-0 kubenswrapper[33572]: I1204 22:19:34.384965 33572 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Dec 04 22:19:34.404249 master-0 kubenswrapper[33572]: I1204 22:19:34.404199 33572 manager.go:334] "Starting Device Plugin manager"
Dec 04 22:19:34.404320 master-0 kubenswrapper[33572]: I1204 22:19:34.404279 33572 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 04 22:19:34.404320 master-0 kubenswrapper[33572]: I1204 22:19:34.404295 33572 server.go:79] "Starting device plugin registration server"
Dec 04 22:19:34.404958 master-0 kubenswrapper[33572]: I1204 22:19:34.404773 33572 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 04 22:19:34.404958 master-0 kubenswrapper[33572]: I1204 22:19:34.404792 33572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 04 22:19:34.404958 master-0 kubenswrapper[33572]: I1204 22:19:34.404920 33572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 04 22:19:34.407915 master-0 kubenswrapper[33572]: I1204 22:19:34.405009 33572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 04 22:19:34.407915 master-0 kubenswrapper[33572]: I1204 22:19:34.405017 33572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 04 22:19:34.506020 master-0 kubenswrapper[33572]: I1204 22:19:34.505955 33572 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 22:19:34.508893 master-0 kubenswrapper[33572]: I1204 22:19:34.508844 33572 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Dec 04 22:19:34.508963 master-0 kubenswrapper[33572]: I1204 22:19:34.508916 33572 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Dec 04 22:19:34.508963 master-0 kubenswrapper[33572]: I1204 22:19:34.508935 33572 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Dec 04 22:19:34.509119 master-0 kubenswrapper[33572]: I1204 22:19:34.509093 33572 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Dec 04 22:19:34.523238 master-0 kubenswrapper[33572]: I1204 22:19:34.523177 33572 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Dec 04 22:19:34.523457 master-0 kubenswrapper[33572]: I1204 22:19:34.523316 33572 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Dec 04 22:19:37.833947 master-0 kubenswrapper[33572]: I1204 22:19:37.833796 33572 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 04 22:19:37.834796 master-0 kubenswrapper[33572]: I1204 22:19:37.834721 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:37.835696 master-0 kubenswrapper[33572]: I1204 22:19:37.835131 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx","openshift-ingress/router-default-5465c8b4db-8vm66","openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg","openshift-multus/multus-admission-controller-8dbbb5754-c9fx2","openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4","openshift-network-operator/iptables-alerter-c747h","openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b","openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p","openshift-etcd/installer-1-master-0","openshift-insights/insights-operator-55965856b6-7vlpp","openshift-kube-apiserver/installer-1-master-0","openshift-kube-apiserver/installer-3-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn","openshift-machine-api/machine-api-operator-88d48b57d-pp4fd","openshift-cluster-node-tuning-operator/tuned-jn88h","openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5","openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb","openshift-etcd/installer-2-master-0","openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq","openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp","openshift-kube-scheduler/installer-5-master-0","openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9","openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn","openshift-controller-manager/controller-manager-86785576d9-t7jrz","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-marketplace/community-operators-vvkjf","openshift-network-node-identity/network-node-identity-nk92d","openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk","openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-marketplace/redhat-operators-zt44t","openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt","openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk","openshift-kube-scheduler/installer-4-master-0","openshift-marketplace/redhat-marketplace-sdrkm","openshift-monitoring/kube-state-metrics-5857974f64-qqxk9","openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz","openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn","openshift-dns/node-resolver-6mgn6","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-controller-m
anager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf","openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc","openshift-kube-controller-manager/installer-2-master-0","openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5","openshift-machine-config-operator/machine-config-server-wmm89","openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz","openshift-monitoring/node-exporter-p5qlk","openshift-network-diagnostics/network-check-target-6jkkl","openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt","openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg","openshift-marketplace/marketplace-operator-f797b99b6-m9m4h","openshift-ovn-kubernetes/ovnkube-node-8nxc5","openshift-dns/dns-default-vvs9c","openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68","openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh","openshift-multus/network-metrics-daemon-9pfhj","openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx","openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq","openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d","openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx","openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx","openshift-dns-operator/dns-operator-7c56cf9b74-sshsd","openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj","openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn","openshift-monitoring/metrics-server-55c77559c8-g74sm","assisted-installer/assisted-installer-controller-mxfnl","openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw","openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw","openshift-etcd/etcd-master-0","openshift-ingress-operator/ingress-operator-8649c48786-qlkgh","openshift-kube-controller-manager/installer-3-master-0","openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs","openshift-apiserver/apiserver-8db7f8d79-rlqbz","openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8","openshift-service-ca/service-ca-77c99c46b8-fpnwr","openshift-network-operator/network-operator-79767b7ff9-8lq7w","openshift-ingress-canary/ingress-canary-7cr8g","openshift-machine-config-operator/machine-config-daemon-ppnv8","openshift-marketplace/certified-operators-sw6sx","openshift-multus/multus-additional-cni-plugins-5tpnf","openshift-multus/multus-dgpw9","openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x","openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr","openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx"] Dec 04 22:19:37.835696 master-0 kubenswrapper[33572]: I1204 22:19:37.835536 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-mxfnl" Dec 04 22:19:37.844437 master-0 kubenswrapper[33572]: I1204 22:19:37.844325 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 22:19:37.845148 master-0 kubenswrapper[33572]: I1204 22:19:37.845089 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 22:19:37.845388 master-0 kubenswrapper[33572]: I1204 22:19:37.845345 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 22:19:37.845634 master-0 kubenswrapper[33572]: I1204 22:19:37.845584 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.846364 master-0 kubenswrapper[33572]: I1204 22:19:37.846302 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 22:19:37.846650 master-0 kubenswrapper[33572]: I1204 22:19:37.846594 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 22:19:37.846940 master-0 kubenswrapper[33572]: I1204 22:19:37.846880 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 04 22:19:37.847136 master-0 kubenswrapper[33572]: I1204 22:19:37.847091 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 22:19:37.847388 master-0 kubenswrapper[33572]: I1204 22:19:37.847343 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 04 22:19:37.849565 master-0 kubenswrapper[33572]: I1204 22:19:37.849528 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.849994 master-0 kubenswrapper[33572]: I1204 22:19:37.849942 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 22:19:37.850216 master-0 kubenswrapper[33572]: I1204 22:19:37.850172 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 22:19:37.850476 master-0 kubenswrapper[33572]: I1204 22:19:37.850429 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 04 22:19:37.850824 master-0 kubenswrapper[33572]: I1204 22:19:37.850780 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 04 22:19:37.851098 master-0 kubenswrapper[33572]: I1204 22:19:37.851056 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 04 22:19:37.853626 master-0 kubenswrapper[33572]: I1204 22:19:37.852143 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 22:19:37.853626 master-0 kubenswrapper[33572]: I1204 22:19:37.852406 33572 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 22:19:37.853626 master-0 kubenswrapper[33572]: I1204 22:19:37.853168 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 22:19:37.853626 master-0 kubenswrapper[33572]: I1204 22:19:37.853389 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.858128 master-0 kubenswrapper[33572]: I1204 22:19:37.854190 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.858128 master-0 kubenswrapper[33572]: I1204 22:19:37.854809 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 04 22:19:37.858128 master-0 kubenswrapper[33572]: I1204 22:19:37.855023 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 22:19:37.858128 master-0 kubenswrapper[33572]: I1204 22:19:37.855276 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 22:19:37.858128 master-0 kubenswrapper[33572]: I1204 22:19:37.855613 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.858128 master-0 kubenswrapper[33572]: I1204 22:19:37.855608 33572 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="2dd4c24f-2a12-4c0a-8040-f17042299847" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.859124 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.859385 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.859494 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.859132 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.859421 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.859897 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.860375 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.860456 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.861586 33572 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 04 22:19:37.864677 master-0 kubenswrapper[33572]: I1204 22:19:37.863085 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" Dec 04 22:19:37.878358 master-0 kubenswrapper[33572]: I1204 22:19:37.866848 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 22:19:37.878358 master-0 kubenswrapper[33572]: I1204 22:19:37.867187 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 22:19:37.878358 master-0 kubenswrapper[33572]: I1204 22:19:37.867449 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 22:19:37.878358 master-0 kubenswrapper[33572]: I1204 22:19:37.867849 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 22:19:37.878358 master-0 kubenswrapper[33572]: I1204 22:19:37.870435 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 22:19:37.885354 master-0 kubenswrapper[33572]: I1204 22:19:37.882790 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.885354 master-0 kubenswrapper[33572]: I1204 22:19:37.883083 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.885354 master-0 kubenswrapper[33572]: I1204 22:19:37.883798 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 22:19:37.885354 master-0 kubenswrapper[33572]: I1204 22:19:37.884619 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 22:19:37.885888 master-0 kubenswrapper[33572]: I1204 22:19:37.885608 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.888603 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.888703 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.889309 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.889405 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.889770 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.890633 33572 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.891042 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.891572 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.891767 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.893056 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.893740 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.893972 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.895324 master-0 kubenswrapper[33572]: I1204 22:19:37.895149 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 22:19:37.897888 master-0 kubenswrapper[33572]: I1204 22:19:37.896378 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 22:19:37.897888 master-0 kubenswrapper[33572]: E1204 22:19:37.896750 33572 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:19:37.898733 master-0 kubenswrapper[33572]: E1204 22:19:37.898260 33572 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:37.902137 master-0 kubenswrapper[33572]: I1204 22:19:37.901953 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 22:19:37.902517 master-0 kubenswrapper[33572]: I1204 22:19:37.902409 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 22:19:37.904642 master-0 kubenswrapper[33572]: E1204 22:19:37.902995 33572 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Dec 04 22:19:37.905597 master-0 kubenswrapper[33572]: I1204 22:19:37.905553 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 04 22:19:37.907718 master-0 kubenswrapper[33572]: I1204 22:19:37.907663 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 04 22:19:37.908797 master-0 kubenswrapper[33572]: I1204 22:19:37.908738 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 04 22:19:37.908875 master-0 kubenswrapper[33572]: I1204 22:19:37.908831 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 22:19:37.909280 master-0 kubenswrapper[33572]: I1204 22:19:37.909234 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 22:19:37.909382 master-0 kubenswrapper[33572]: I1204 22:19:37.909332 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 22:19:37.910708 master-0 kubenswrapper[33572]: I1204 22:19:37.910628 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 22:19:37.910948 master-0 kubenswrapper[33572]: I1204 22:19:37.910872 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 22:19:37.912630 master-0 kubenswrapper[33572]: I1204 22:19:37.912495 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 22:19:37.915418 master-0 kubenswrapper[33572]: I1204 22:19:37.915363 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 22:19:37.917940 master-0 kubenswrapper[33572]: I1204 22:19:37.917047 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 22:19:37.917940 master-0 kubenswrapper[33572]: I1204 22:19:37.917347 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 04 22:19:37.917940 master-0 kubenswrapper[33572]: I1204 22:19:37.917356 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 22:19:37.917940 master-0 kubenswrapper[33572]: I1204 22:19:37.917441 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 22:19:37.923122 master-0 kubenswrapper[33572]: I1204 22:19:37.918075 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 22:19:37.923122 master-0 kubenswrapper[33572]: I1204 22:19:37.918158 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 22:19:37.923122 master-0 kubenswrapper[33572]: I1204 22:19:37.918253 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 22:19:37.923122 master-0 kubenswrapper[33572]: I1204 22:19:37.919484 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 22:19:37.923122 master-0 kubenswrapper[33572]: I1204 22:19:37.919963 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 04 22:19:37.923122 master-0 kubenswrapper[33572]: I1204 22:19:37.920294 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.923809 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.923919 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.923955 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.923999 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924043 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924052 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924181 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924257 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924343 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924417 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924474 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924908 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.924963 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.925086 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 22:19:37.925623 master-0 kubenswrapper[33572]: I1204 22:19:37.925312 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 22:19:37.927725 master-0 kubenswrapper[33572]: I1204 22:19:37.927667 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 04 22:19:37.929017 master-0 kubenswrapper[33572]: I1204 22:19:37.928952 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 22:19:37.929081 master-0 kubenswrapper[33572]: I1204 22:19:37.929046 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 22:19:37.929246 master-0 kubenswrapper[33572]: I1204 22:19:37.929174 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 22:19:37.929854 master-0 kubenswrapper[33572]: I1204 22:19:37.929699 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 22:19:37.929854 master-0 kubenswrapper[33572]: I1204 22:19:37.929660 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerStarted","Data":"3879333ec3106dc5e5897a72b9f25043008a8c4c0e423bb23d06f11ee99e9552"} Dec 04 22:19:37.929854 master-0 kubenswrapper[33572]: I1204 22:19:37.929821 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" 
event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerDied","Data":"437ce0db468372672eda2ac00bd5f2a8af4827f3a8e23b48967061bc95032bfb"} Dec 04 22:19:37.929994 master-0 kubenswrapper[33572]: I1204 22:19:37.929866 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 04 22:19:37.929994 master-0 kubenswrapper[33572]: I1204 22:19:37.929925 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" event={"ID":"7f091088-2166-4026-9fa6-62bd83407edb","Type":"ContainerStarted","Data":"6bf34321670741046368ce4bbb20bccce653f978f24b21e6e4db413ab4cd0c8b"} Dec 04 22:19:37.929994 master-0 kubenswrapper[33572]: I1204 22:19:37.929987 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"587901d613877303166a73aefe83b729a828ee57d294468839ecb48ee62967aa"} Dec 04 22:19:37.930089 master-0 kubenswrapper[33572]: I1204 22:19:37.930017 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"b6f9e5d170c5c01abcc938a01683f90fda3ae8ba34521ff9d208045fb85cbe9d"} Dec 04 22:19:37.930089 master-0 kubenswrapper[33572]: I1204 22:19:37.930063 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 04 22:19:37.930148 master-0 kubenswrapper[33572]: I1204 22:19:37.930111 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"7ab8b346978ad6f1cf331a2cd2e464eb81897737f06c530111b331aae07ed9d5"} Dec 04 22:19:37.930148 master-0 kubenswrapper[33572]: I1204 22:19:37.930140 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"ef9e33374e7feeece814a917871fabee9e5fb56a0fa6d544c75f1256bc6c0f94"} Dec 04 22:19:37.930210 master-0 kubenswrapper[33572]: I1204 22:19:37.930166 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerStarted","Data":"4e5f4d666e715187131125caa7b8db325dd82e37d31be42d4f697d2f2db4f71e"} Dec 04 22:19:37.930210 master-0 kubenswrapper[33572]: I1204 22:19:37.930196 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerDied","Data":"5553ac89bf95798ab2decbb87aaa4e3e8d835fcf542a4ada2023ac05d60471d4"} Dec 04 22:19:37.930274 master-0 kubenswrapper[33572]: I1204 22:19:37.930225 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" event={"ID":"c6a5d14d-0409-4024-b0a8-200fa2594185","Type":"ContainerStarted","Data":"1f66bf17e6a9a3d1673b91cfae48275a45a440300946a8bf5061bac66c63db97"} Dec 04 22:19:37.930311 master-0 kubenswrapper[33572]: I1204 22:19:37.930268 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" 
event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"ff0f980e1e849c21f6412c540c1a8c9abeff149bce406310f67bdb69c4eae768"} Dec 04 22:19:37.930352 master-0 kubenswrapper[33572]: I1204 22:19:37.930300 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"4164cab9c50a981b577e39cbb489f1522a739da479bf036162f662ad7cf84d9e"} Dec 04 22:19:37.930352 master-0 kubenswrapper[33572]: I1204 22:19:37.930332 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"12c4aa15e4e79d5f90b97077c400d618cdd6a7f09f25df0096cef1db7225b99d"} Dec 04 22:19:37.930417 master-0 kubenswrapper[33572]: I1204 22:19:37.930358 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"7290093cc147531531f377286d1c48e3803031a7cc41744c297aa00505901855"} Dec 04 22:19:37.930417 master-0 kubenswrapper[33572]: I1204 22:19:37.930385 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"072d9d8a34bba009b433a8865da7ea50c856bf5a8fcc704a213e14db6134cc03"} Dec 04 22:19:37.930417 master-0 kubenswrapper[33572]: I1204 22:19:37.930408 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"55e5df41db2945971119fe0a034f6c7c7f38f1e44c695ddf59539c8fa0491a30"} Dec 04 22:19:37.930522 master-0 kubenswrapper[33572]: I1204 22:19:37.930442 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"5e65b113fd0cc6bd898a4b738d10907e4d7312801f7a18d6e95d69cd06443a6c"} Dec 04 22:19:37.930522 master-0 kubenswrapper[33572]: I1204 22:19:37.930471 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"d74bdf11b81a168a1eb50f57289a14094033b1e6fe3938a39885cff3f029fbfd"} Dec 04 22:19:37.930683 master-0 kubenswrapper[33572]: I1204 22:19:37.930496 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerDied","Data":"d14cbc85e41a76d9831e3cb322a42ef6928588924655708cdbc5b0d0983944d9"} Dec 04 22:19:37.930683 master-0 kubenswrapper[33572]: I1204 22:19:37.930486 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:37.930683 master-0 kubenswrapper[33572]: I1204 22:19:37.930623 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" event={"ID":"59d3d0d8-1a2a-4d14-8312-d33818acba88","Type":"ContainerStarted","Data":"d2150947f6b280de7d29efbeffc7ac274ffd5788d0ca351df6e324f3c07fb86b"} Dec 04 22:19:37.930774 master-0 kubenswrapper[33572]: I1204 22:19:37.930705 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerStarted","Data":"1bc5ab124190b2c59d84b58250cc263ddcafcb9537dd2db02384165b00676c7f"} Dec 04 22:19:37.930807 master-0 kubenswrapper[33572]: I1204 22:19:37.930785 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerDied","Data":"6d31ad2a1f5237b4355ed2f39e4e13656076c4a85f80a08d5d712a1a6ab75238"} Dec 04 22:19:37.930850 master-0 kubenswrapper[33572]: I1204 22:19:37.930827 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerStarted","Data":"7031d386f42300ef917c16f433aec3d9b72a6769b546f2943379602f68aa4683"} Dec 04 22:19:37.930885 master-0 kubenswrapper[33572]: I1204 22:19:37.930852 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" event={"ID":"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9","Type":"ContainerStarted","Data":"c33c6e351b6426a43cd389bbd81cef5f132f38999fc440de5ea48da556537499"} Dec 04 22:19:37.930918 master-0 kubenswrapper[33572]: I1204 22:19:37.930880 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerStarted","Data":"5f1510dd754c8015c3efa13b31a418b33c7dc7f1e77672856392caad09ab716a"} Dec 04 22:19:37.930966 master-0 kubenswrapper[33572]: I1204 22:19:37.930911 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerDied","Data":"3e05aae4893276e8afe69a36c63278b1771095172b32eb2660a3d4bf0f266404"} Dec 04 22:19:37.930966 master-0 kubenswrapper[33572]: I1204 22:19:37.930944 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" event={"ID":"a043ea49-97f9-4ae6-83b9-733f12754d94","Type":"ContainerStarted","Data":"db48b3f4e1a29b857cceceb534352169660eed12f652161e8b97983c91525c06"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.930972 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerStarted","Data":"b15cd21f3f4e9fee4d49615520e3cb875b8d92374b9511d0ad4dc25bdd542ba5"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931000 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" 
event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerDied","Data":"4325f17544835c85c6a52e1b1f681caef960ed59819852b48ea3b6353d61e1b5"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931041 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" event={"ID":"f893663c-7c1e-4eda-9839-99c1c0440304","Type":"ContainerStarted","Data":"58c253cb55a2596a7104eeae0b0779984afb0bde33ab17e052e97f3e58be779f"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931068 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" event={"ID":"4d68dcb1-efe4-425f-9b28-1e5575548a32","Type":"ContainerStarted","Data":"db27458e6b27bc7ef79661747271bca2ab81c5f5d722426e70bfbf3ba534f396"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931094 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" event={"ID":"4d68dcb1-efe4-425f-9b28-1e5575548a32","Type":"ContainerDied","Data":"2f8f422694aa4bc57d4ecc64211f7f799287be11acc20da48bbd2da04f761575"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931120 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" event={"ID":"4d68dcb1-efe4-425f-9b28-1e5575548a32","Type":"ContainerStarted","Data":"caf36cfc8384a756669c5effc9f040f914b8e0fafbb77841a2ef74350bfc51bf"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931151 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerStarted","Data":"2b90cccc4060f63e3151c04577b704a9e40c2c1995c15db065507afb9359b261"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931184 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerDied","Data":"a1cf3561aceffd368c0ca4d9cb40d000ada9182cc1eeba5246056ae555e8f11e"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931227 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerStarted","Data":"b255dc2ba6c02f78e7fa3f3206067dc5c657701a0d9a3acc7e7566b70c0f286c"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931255 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-nk92d" event={"ID":"634c1df6-de4d-4e26-8c71-d39311cae0ce","Type":"ContainerStarted","Data":"d20bd8f7df5ba066210630b0736a496814188135f419c1b214059e7501c8fbf9"} Dec 04 22:19:37.931289 master-0 kubenswrapper[33572]: I1204 22:19:37.931283 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vvs9c" event={"ID":"a5c2d3b8-41c0-4531-b770-57b7c567fe30","Type":"ContainerStarted","Data":"493fc52c68401fa5964bb3eeaf4d67a35d8bc2236f565e9f9553b7ddae6747d8"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931314 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vvs9c" 
event={"ID":"a5c2d3b8-41c0-4531-b770-57b7c567fe30","Type":"ContainerStarted","Data":"45450357cf840130f50887c8a0378cc1abdb04813e3dcc85b0c07540beaa459f"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931341 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vvs9c" event={"ID":"a5c2d3b8-41c0-4531-b770-57b7c567fe30","Type":"ContainerStarted","Data":"9c65ccceea882cd7d898803d924bbc08d4f3c8fef9c388b0db802fee0be2d9fc"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931373 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerStarted","Data":"4c72e9186536692b281fe4e714a6c8b9d1a2250b3c26e8d8330699b4c2ec401d"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931414 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerDied","Data":"1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931446 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerStarted","Data":"56273487d972eb4bce7bb2a2d532d0ecc4790cf320f72d236deb00d6ee12d734"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931474 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerStarted","Data":"4d9524d2f6c8db6ea1b8d01f9923d2a3d6c267b7a2c9858906c3336929be1a8b"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931603 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerDied","Data":"7d1cd57ab4d6b616f66a493d64b829158d3913e17906249bebc8057db4a21035"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931664 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerStarted","Data":"9b118c3eb1526e32a59593fb41286a1e5da44aab9049917049f670cf866c2e43"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931736 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" event={"ID":"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161","Type":"ContainerStarted","Data":"c382efd8856765d7e3a7c1f5148c4c397e023bc0d7fa9282b9bc8277f0af2687"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.931971 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" event={"ID":"35821f48-b000-4915-847f-a739b6efc5ee","Type":"ContainerStarted","Data":"158753a86c5c01314d89c2674122c45b776d4868ad7bb53382d3dcedd2977cf8"} Dec 04 22:19:37.932018 master-0 kubenswrapper[33572]: I1204 22:19:37.932022 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" 
event={"ID":"35821f48-b000-4915-847f-a739b6efc5ee","Type":"ContainerDied","Data":"50388f7f0e80879ab901f5d4bca3a6dc9ea1a39437e9578943365b4294d8b25d"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932053 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" event={"ID":"35821f48-b000-4915-847f-a739b6efc5ee","Type":"ContainerStarted","Data":"b5aeee4aff8bc78bf6e36ba938b9609781a2d34417a21d03fdc2c6a101065131"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932084 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9pfhj" event={"ID":"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa","Type":"ContainerStarted","Data":"b75befb683524bef4216c78e58648138696c5c0ab8c9682dcb3f075c7c87b206"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932112 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9pfhj" event={"ID":"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa","Type":"ContainerStarted","Data":"5c975667384a07097695d15e2e30aab1bd6d4d9f872c5d99e129896563421798"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932139 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9pfhj" event={"ID":"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa","Type":"ContainerStarted","Data":"fc5560b4f5b75417c091bf8b734a41efa2795ce2d8cceb8a89a66960f1ba3320"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932167 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" event={"ID":"b966c210-5415-4fa5-88ab-c85aba979b28","Type":"ContainerStarted","Data":"e64f5f283df42fbfd3b016ddfaa5b8ed71386c26b5f0eb7a21d4b6a37b395d52"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932211 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" event={"ID":"b966c210-5415-4fa5-88ab-c85aba979b28","Type":"ContainerStarted","Data":"9ff695d5f754bc1ab4c8a5e3ad28cb942f185e2cf70cdd2d8a2eeb6d3f679b39"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932238 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerStarted","Data":"c83311c437c02b54931377c7c49f736c6deca7ea65c74397bdf6ab810158ea6e"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932268 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerDied","Data":"43708b3c1f0fc23d49f5b68e72b2abf6c84e81e5b8fe673a0fccaff92e14b81d"} Dec 04 22:19:37.932349 master-0 kubenswrapper[33572]: I1204 22:19:37.932296 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" event={"ID":"46229484-5fa1-4595-94a0-44477abae90e","Type":"ContainerStarted","Data":"51d6ac9cf90d7c8eb2861e16058d7ad6c28fadb7bb22861653b05ced3e77a61e"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.933346 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 22:19:37.934848 master-0 
kubenswrapper[33572]: I1204 22:19:37.933954 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wmm89" event={"ID":"eb4d8477-c3b5-4e88-aaa9-222ad56d974c","Type":"ContainerStarted","Data":"3250f93800f14f1984b89093aa1038684a73aea8a159904e7ccc7f265450fb5b"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934014 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wmm89" event={"ID":"eb4d8477-c3b5-4e88-aaa9-222ad56d974c","Type":"ContainerStarted","Data":"4e849604a662b099203e2c576ae634e61f44af1ebad609cae720f6f60b5023c0"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934122 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerStarted","Data":"5d2fefeec2561a7c75cacf5399d1c0370782912b0bc1f1c8faf916fa302e41f1"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934161 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerDied","Data":"028e28ae583843cf573a020510b23f29bb89888d81cbcae42e7af0446b2d3e61"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934191 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerStarted","Data":"82f29b76b48da3d841d1256de9fef86cdb6553d971418660ee7c3b3bf00fff6f"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934217 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" event={"ID":"5dac8e25-0f51-4c04-929c-060479689a9d","Type":"ContainerStarted","Data":"09c1c58555576a7a5ad0c40263cb1b24f2532f6f0895e3889a790f41c5622cf5"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934241 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerStarted","Data":"bcaaf6a96d954f901cc05fe39c7b7764e445e886db16581ddfb04f2c4ced3d82"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934270 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"758bcdf683109d822a1017f454c5645fc9f981b1015625c2d5ef493072ef4678"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934357 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"3574d633e7308db3b6dd662bd037451e5d0ed5c34c61a73c66397c77d3caf66e"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934404 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"189119b91f6e6ef0f62e51f0cc69d03fbbc0144ce142853e62f56609d2029b1d"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934430 33572 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"eade6c63cfbfd85793c4e11745edd4d5a786bcef37074f29af89908e936863d7"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934456 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"3903951768e93b52af44e2ee6090549f67bc30f2eeffd34acda2b5e56323b0df"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934487 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerDied","Data":"20fdbd8f60e4052a44e37c80c735da9d3ff66c7350cb568fd169c055622f648f"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934544 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5tpnf" event={"ID":"76fd9f44-4365-4271-8772-025655c50334","Type":"ContainerStarted","Data":"d4a10292f308562f1eb811fcc21e768edeacf6e3dfd89600c7297e0e58e34ffc"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934577 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c747h" event={"ID":"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c","Type":"ContainerStarted","Data":"445f62d39aa04dcf1c8ebad8cd7e2899244dd127c8c97b181ddff4af36c8b535"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934615 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c747h" event={"ID":"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c","Type":"ContainerStarted","Data":"32713ac531267a8df9a5155d2161f2837fe39d413f156b0faad6e6aed3651be0"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934642 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" event={"ID":"34cad3de-8f3f-48cd-bd39-8745fad19e65","Type":"ContainerStarted","Data":"522ebc4422c9f26169d8e98928a3c5499603a3d90a45136b87f723bed13e8748"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934672 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" event={"ID":"34cad3de-8f3f-48cd-bd39-8745fad19e65","Type":"ContainerStarted","Data":"bbf9fb5a77c001a00a8ba9089cd2dbff84e9018cac8414c0fa2ee4f2f5ac52a2"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934750 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" event={"ID":"34cad3de-8f3f-48cd-bd39-8745fad19e65","Type":"ContainerStarted","Data":"8322a14628afc897780b45b61f17a00da7b18029f93a7a74b52c8e380031ff4f"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934785 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"fb4afb592b5c30cfd21e213860a9cae209891a86353f6f65689e3455958a2f39"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934813 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"efa3762feffc3da59b6f9bcafd79da1dfd2e009c93cfb906986a1a37a50f7d8d"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934850 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"dcca3d65f2eb9a58cbe582258c5a8066f1e6748f3a54afa3247bc56a9f4f23d0"} Dec 04 22:19:37.934848 master-0 kubenswrapper[33572]: I1204 22:19:37.934876 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"8ee64a9d18e9766817acb72d7fa9c4b992b2a148db0509af5fadc5499a6f837a"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.934901 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"ff3db8bd41312f8815f56e2a1d1c76c27943763c50e5bb1dafc09d4915bc599d"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.934926 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"d9582fc250da782a33466d4d52e589af275376f87d9bdf03fa1cb11c7d23524e"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.934955 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"ff5921da732d05f72d82c0539e6f4661b6512a68939a31fe1f83fa7bbd8cf1d4"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.934983 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"553d3584b8fff905a7e34ad91d98d0f31e54579b68090cae0d50c0891bc22dd5"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935009 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"f09554581756c0f7969c0e40141de26184513807dc2c20e4d5041730923d9d0c"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935045 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerStarted","Data":"0fa9349d6f2854385acfa0a5510d95ae0764adbe71a4b80f503e71769e643a05"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935075 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerDied","Data":"66fa513342b7f47d4f807e0f29b5398451337b70d0aea1cac07ea70f754f3d14"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935101 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerStarted","Data":"b55d508726ae8df0222e55687319ab2bf975a4e6e2983f8b547ae30ae19307c0"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935128 33572 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" event={"ID":"800f436c-145d-4281-8d4d-644ba2cb0ebb","Type":"ContainerStarted","Data":"1f07b0c4938e582ee1cf91b0ddf3d74df5a116aa0098efb24e115a5e8116176b"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935157 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" event={"ID":"871cb002-67f4-43aa-a41d-7a5b2f340059","Type":"ContainerStarted","Data":"59c9203f641c765d2eee366e0bf083a83f8954539e6ae9b99846d431ed362e41"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935186 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" event={"ID":"871cb002-67f4-43aa-a41d-7a5b2f340059","Type":"ContainerDied","Data":"9d7fd4b64c7f9d10b43359385a6360e49aa71c5085c781ef53642cd82a85d004"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935251 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" event={"ID":"871cb002-67f4-43aa-a41d-7a5b2f340059","Type":"ContainerStarted","Data":"10dc2002c2d044f5ea69805bbc631643d30665f6c77465c4461544acec4eebbc"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935282 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"5d13043a38ae0fb09211cd5a587bc3304b77f315cf7a3d95f4c81e25cbe2aabc"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935355 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"b9c2a59d2b2d384d9b6ed01768b63f9f489ffd4ed0753bd5fb34a22342dcc2b9"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935386 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"d0d775c28298e632a37100becb71b0843ebb15158d232ea527d7de5420ce8047"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935458 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerDied","Data":"509e3ce53fc945130075276f6099e96d73baf21a6fcaddff5d395b3b94de9c58"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935551 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"e976d29655bd6a4804e44aefca912f97ea12da492eb732dd0d56f5c8ee61e225"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935597 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" event={"ID":"17912746-74eb-4c78-8c1b-2f66e7ce4299","Type":"ContainerStarted","Data":"81f6c807c7b5b7b7589f341741012990ba8bc408248c52e232edf6a36c144642"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935629 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" event={"ID":"17912746-74eb-4c78-8c1b-2f66e7ce4299","Type":"ContainerStarted","Data":"62bb8fd14dd9c2077e5122579d140ab222e13a81de3357a0f0b9c3f9b8580e24"} Dec 04 22:19:37.935699 master-0 kubenswrapper[33572]: I1204 22:19:37.935660 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" event={"ID":"17912746-74eb-4c78-8c1b-2f66e7ce4299","Type":"ContainerStarted","Data":"ab56dbe7257c4c7482b150a1ba0d82ac0c93f28c32d4b4b263e8fd93ae1aee0c"} Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.935690 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" event={"ID":"17912746-74eb-4c78-8c1b-2f66e7ce4299","Type":"ContainerStarted","Data":"73496f020ec19048256b7ee616b5604b8f6faef21ddc2795a2639ad6cafa0a2c"} Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.935766 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dgpw9" event={"ID":"6c8c45e0-2342-499b-aa6b-339b6a722a87","Type":"ContainerStarted","Data":"412b36a625c3b7b5d3033bd3f5f3ec14a8a2f1b82af2acf7233fc8da02c22531"} Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.935986 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dgpw9" event={"ID":"6c8c45e0-2342-499b-aa6b-339b6a722a87","Type":"ContainerStarted","Data":"3aa682501427a1a26306dd6e6ffbe29276935fb92a5916c957736c383157a162"} Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.935863 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.936102 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerStarted","Data":"9bd36bdfa3dbe917fb415f401e0843138d225eb2cdd038a07c1fc4862acaf2a9"} Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.936161 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerDied","Data":"fe4d171382cafc8367d04ba38562e53607a288ace82dafd0c07ed366d2a1ef56"} Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.936197 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerStarted","Data":"66b2e3479fe940e234e57684065e0fce7888af45d6710422bc86d256ccfc2307"} Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.936224 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" event={"ID":"ce6b5a46-172b-4575-ba22-ff3c6ea4207f","Type":"ContainerStarted","Data":"fe41c35d4fc12b10f7c0380ded0175f838a7cb9e3aad0aa5a08446be17e65126"} Dec 04 22:19:37.936255 master-0 kubenswrapper[33572]: I1204 22:19:37.936258 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" 
event={"ID":"3099f7b5-f904-4d15-aedb-f4e558b813e4","Type":"ContainerDied","Data":"f9af7ae05881c66c990776ea5e9ecae6917372ad2e83deed7c505b583fa9da46"} Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936287 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"3099f7b5-f904-4d15-aedb-f4e558b813e4","Type":"ContainerDied","Data":"bc39063be03324c773b296bb527536f85c03a71f7444ce95b585b37a77beb76b"} Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936315 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc39063be03324c773b296bb527536f85c03a71f7444ce95b585b37a77beb76b" Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936342 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6jkkl" event={"ID":"510a595a-21bf-48fc-85cd-707bc8f5536f","Type":"ContainerStarted","Data":"07470ecb67c001251340fae3151b0ef12e1a2a108ad2fba4324431951a35b097"} Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936381 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6jkkl" event={"ID":"510a595a-21bf-48fc-85cd-707bc8f5536f","Type":"ContainerStarted","Data":"4fc051e954a566d97cf4dcb3626713517bc5479301f571be1eec860a1f2d884c"} Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936408 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88063b8731e190e3ef35fcb2f8650f0d31e7321d57a43954195df2634f632310" Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936430 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerStarted","Data":"1a4d2b917d0f536f861e86755d3bf6744689e0554629cdb4b05a8419c9269007"} Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936465 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerStarted","Data":"ba166c0c83b63968d9c53772f494598b095ed5d017e4a288c9e60bcf13979dcd"} Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936491 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerDied","Data":"446458854f272c65918d3eef29e63c52aea4a45ba36f434208f291e2e3410da7"} Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936552 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" event={"ID":"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb","Type":"ContainerStarted","Data":"8ccadfcf02bee77f4b3f98d491b1ee8f4b7c03cb21fbe9104543e0f3a4c0e10a"} Dec 04 22:19:37.936613 master-0 kubenswrapper[33572]: I1204 22:19:37.936581 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerStarted","Data":"d5f941a3f84766c224cb550f03c9798dd97a48eb4d1b0dc82d2ca740885ed464"} Dec 04 22:19:37.936893 master-0 kubenswrapper[33572]: I1204 22:19:37.936623 33572 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerDied","Data":"a2b4230e9757af974dea8f15ae2efeb4c125ff55b0c9cdc6e359f1aae71c8941"} Dec 04 22:19:37.936893 master-0 kubenswrapper[33572]: I1204 22:19:37.936652 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" event={"ID":"690b447a-19c0-4925-bc9d-d0c86a83a377","Type":"ContainerStarted","Data":"e18ea7a7e8b99e9b5c5fa288ef3f3657d52b7fcf4eb2859562b3331202004223"} Dec 04 22:19:37.936893 master-0 kubenswrapper[33572]: I1204 22:19:37.936684 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" event={"ID":"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d","Type":"ContainerStarted","Data":"50d9f03783c661accee22d1e4308b7f9da15faf71fda445f1589dfc2e32aea11"} Dec 04 22:19:37.936893 master-0 kubenswrapper[33572]: I1204 22:19:37.936714 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" event={"ID":"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d","Type":"ContainerStarted","Data":"ebcf83d7998d4cc60b59e0a4ee1b7d80a2c88668db04583317334cbd65922154"} Dec 04 22:19:37.936893 master-0 kubenswrapper[33572]: I1204 22:19:37.936741 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4b9fbd90-66d5-4637-9821-22242aa6f6d7","Type":"ContainerDied","Data":"05ebda65d53028c7345257866dac633a27c8894eb475430d761e1c0a053ea020"} Dec 04 22:19:37.936893 master-0 kubenswrapper[33572]: I1204 22:19:37.936773 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4b9fbd90-66d5-4637-9821-22242aa6f6d7","Type":"ContainerDied","Data":"5aea2f5066e056e4a369b8871e8461c5b3fa8918d7cce8402f38ffd4c90c32d6"} Dec 04 22:19:37.937061 master-0 kubenswrapper[33572]: I1204 22:19:37.936932 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aea2f5066e056e4a369b8871e8461c5b3fa8918d7cce8402f38ffd4c90c32d6" Dec 04 22:19:37.937061 master-0 kubenswrapper[33572]: I1204 22:19:37.936963 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"5fcdcfec6584d4b81237f292d9431a4c2eccc32a97cb24f1ad54b8dd23d476bb"} Dec 04 22:19:37.937061 master-0 kubenswrapper[33572]: I1204 22:19:37.936993 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65"} Dec 04 22:19:37.937154 master-0 kubenswrapper[33572]: I1204 22:19:37.937094 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d"} Dec 04 22:19:37.937154 master-0 kubenswrapper[33572]: I1204 22:19:37.937127 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321"} Dec 04 22:19:37.937218 master-0 kubenswrapper[33572]: I1204 22:19:37.937154 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c"} Dec 04 22:19:37.937218 master-0 kubenswrapper[33572]: I1204 22:19:37.937184 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerDied","Data":"84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f"} Dec 04 22:19:37.937285 master-0 kubenswrapper[33572]: I1204 22:19:37.937236 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"4f7c6491dc22a01b1d98b832f138731bbe081d987cf8c5e14ed87abcbbcb568a"} Dec 04 22:19:37.937285 master-0 kubenswrapper[33572]: I1204 22:19:37.937268 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerStarted","Data":"92528b95e13ec8f264bec256aad90296e21be58deea24ba77dc8ff80b36c0304"} Dec 04 22:19:37.937355 master-0 kubenswrapper[33572]: I1204 22:19:37.937296 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerDied","Data":"922ec5ad22c0f758ca0c6af6881b85724b616bc9bf1514cfd7b12d47fc0ff553"} Dec 04 22:19:37.937355 master-0 kubenswrapper[33572]: I1204 22:19:37.937325 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerStarted","Data":"9cd533ead3cae5dfe671d9ed582ce0c4fe3846e3703c3c3ebeec9d68db23459a"} Dec 04 22:19:37.937419 master-0 kubenswrapper[33572]: I1204 22:19:37.937349 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" event={"ID":"5598683a-cd32-486d-8839-205829d55cc2","Type":"ContainerStarted","Data":"f19df2e06dca5a2f80ab8037e49629477ed1cac1328bfc7445b4bdab076568fc"} Dec 04 22:19:37.937419 master-0 kubenswrapper[33572]: I1204 22:19:37.937379 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"986a4de7-3a54-48dc-9599-49cf19ba0ad5","Type":"ContainerDied","Data":"42c592fcd97dd09de62f2c07d511eb1f7fedb875ba01c50a76d8a639e15849ae"} Dec 04 22:19:37.937475 master-0 kubenswrapper[33572]: I1204 22:19:37.937420 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"986a4de7-3a54-48dc-9599-49cf19ba0ad5","Type":"ContainerDied","Data":"1fee06d19aff5ef21eb8427a31bd857aa51bbbcb2fe5924a93729689e0a74832"} Dec 04 22:19:37.937475 master-0 kubenswrapper[33572]: I1204 22:19:37.937443 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fee06d19aff5ef21eb8427a31bd857aa51bbbcb2fe5924a93729689e0a74832" Dec 04 22:19:37.937585 master-0 kubenswrapper[33572]: I1204 
22:19:37.937469 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" event={"ID":"967bf4ac-f025-4296-8ed9-183a345f6b7c","Type":"ContainerStarted","Data":"83b6f31f110e8f986286c8858fee161cd2cfbc203132898174a8f254b84462a0"} Dec 04 22:19:37.937585 master-0 kubenswrapper[33572]: I1204 22:19:37.937529 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" event={"ID":"967bf4ac-f025-4296-8ed9-183a345f6b7c","Type":"ContainerStarted","Data":"aab8d5d7d7caaf80016fb84803d68f187962ac87f50be8e340ea0edecd46547b"} Dec 04 22:19:37.937585 master-0 kubenswrapper[33572]: I1204 22:19:37.937559 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" event={"ID":"0beb871c-3bf1-471c-a028-746a650267bf","Type":"ContainerStarted","Data":"ca5799c309b09795ff95214f2ec9158f268801b85d2051e30751956963a75745"} Dec 04 22:19:37.937673 master-0 kubenswrapper[33572]: I1204 22:19:37.937585 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" event={"ID":"0beb871c-3bf1-471c-a028-746a650267bf","Type":"ContainerDied","Data":"3319d82050d7163f2a7f96d02081f2908fac76018c64e251ebfd0f4e73ccabfd"} Dec 04 22:19:37.937673 master-0 kubenswrapper[33572]: I1204 22:19:37.937616 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" event={"ID":"0beb871c-3bf1-471c-a028-746a650267bf","Type":"ContainerStarted","Data":"c83e316239457de6d2cf065ee11c69192c6233457017b9e9bdae1e03d84ad9fc"} Dec 04 22:19:37.937673 master-0 kubenswrapper[33572]: I1204 22:19:37.937654 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0"} Dec 04 22:19:37.937764 master-0 kubenswrapper[33572]: I1204 22:19:37.937188 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.937681 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerDied","Data":"1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938028 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"10dda04dd9f1ca247b35249d5e7333d86ebc7e3902573b98ac28839fb9bcb514"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938086 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"c21b1f28f2ecd3a6f853caa962c1e919058a4a8d42a7386884dd5b88c192ad87"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938117 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerDied","Data":"27f9dcb8a2a3043d50959f9bd25f4a8719c19a0127b6e4e0dd3d41cad4c9e780"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938153 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" event={"ID":"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026","Type":"ContainerStarted","Data":"df11c4a8f3347747aecb87e080a5126c781276285c92708ad28d5159ae4229dc"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938203 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0791dc66-67d9-42bd-b7c3-d45dc5513c3b","Type":"ContainerDied","Data":"f70a0cabfa84fd6dac7eab4d978f050d5e781f995d7f4f93a12a51cc9706d0d9"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938298 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0791dc66-67d9-42bd-b7c3-d45dc5513c3b","Type":"ContainerDied","Data":"87f5905070547f303913fb037b80fa8791f83c49f484c86a7ba733227eaa9eb2"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938390 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87f5905070547f303913fb037b80fa8791f83c49f484c86a7ba733227eaa9eb2" Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938429 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" event={"ID":"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f","Type":"ContainerDied","Data":"00713a1c06d69e4187d092bf84b0d17670a9eda7c3ce1307b7efa35d4e53871c"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938587 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr" event={"ID":"85a7edee-7a4c-4f4f-b537-d1ce3a9f812f","Type":"ContainerDied","Data":"69107f4cb3ef5b3b0251fc55638d465f044f2dd2f76a36beea2e418eac9fab2d"} Dec 04 22:19:37.938721 master-0 kubenswrapper[33572]: I1204 22:19:37.938673 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69107f4cb3ef5b3b0251fc55638d465f044f2dd2f76a36beea2e418eac9fab2d" Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.938761 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba","Type":"ContainerDied","Data":"7f95f72da52c53d3c8d88cdae7b632b1e707bccffe42c9e45b84331a1108d0c6"} Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.938809 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba","Type":"ContainerDied","Data":"ec44f98a134fed3f7d27e7c218ca88ef4cd2ac21b667420e0029267e424b27bd"} Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.938909 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec44f98a134fed3f7d27e7c218ca88ef4cd2ac21b667420e0029267e424b27bd" Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.938940 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" 
event={"ID":"989a73ce-3898-4f65-a437-2c7061f9375f","Type":"ContainerStarted","Data":"28fef5f99f6f6e677e593fa649674c67bdc15138dbeae6953397e00648b6d669"} Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.938980 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" event={"ID":"989a73ce-3898-4f65-a437-2c7061f9375f","Type":"ContainerDied","Data":"e1ab85fa23f372e6c12039f42a8215b4ecb7099a306302bdcd4c1624786fb3f7"} Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.939012 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" event={"ID":"989a73ce-3898-4f65-a437-2c7061f9375f","Type":"ContainerStarted","Data":"cf87cc00ba78c6e3cc8680200b1afa8d433e342bc7744db35e1f64b4a3e5a078"} Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.939043 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" event={"ID":"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141","Type":"ContainerStarted","Data":"420541efa65ce9474ef160ccf8e59df368e797415eddfcf4b7828985afa52ca7"} Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.939072 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" event={"ID":"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141","Type":"ContainerStarted","Data":"0a1af3e7058502a8428a136432bedb480886bf3096e6470d4e79d520cc6f00b5"} Dec 04 22:19:37.939132 master-0 kubenswrapper[33572]: I1204 22:19:37.939103 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" event={"ID":"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141","Type":"ContainerDied","Data":"2e48ce38bedd0ef286f4eb2d0319a994f1a61d767e4cb03996c6448334d88c07"} Dec 04 22:19:37.939395 master-0 kubenswrapper[33572]: I1204 22:19:37.939149 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" event={"ID":"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141","Type":"ContainerStarted","Data":"7d02c679c1b193ea195c44f77ce5059c11b500930cda814d106399c1a88668f1"} Dec 04 22:19:37.939395 master-0 kubenswrapper[33572]: I1204 22:19:37.939161 33572 scope.go:117] "RemoveContainer" containerID="1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3" Dec 04 22:19:37.939395 master-0 kubenswrapper[33572]: I1204 22:19:37.939231 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 22:19:37.939532 master-0 kubenswrapper[33572]: I1204 22:19:37.939187 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerStarted","Data":"f60afa3e3fc7300e00d2058f8d9e1e5f1ef0102a3ee02b211373a7e4b06ab2d2"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.939603 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerDied","Data":"577801a549fb8e8b80c730f6cb1e1c0076264ab71677f9e7afd6abe1e4f77036"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.939665 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" 
event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerStarted","Data":"2d1e4c21a00903c83707b4497502b9b73283be3d30ce9a5aaab7e54bf8f72dba"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.939775 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" event={"ID":"6684358b-d7a6-4396-9b4f-ea67d85e4517","Type":"ContainerStarted","Data":"f4a6e4c5c5359ab21e9030099147343225d4aeaef29fb463a8e6710e457570df"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.939814 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" event={"ID":"6684358b-d7a6-4396-9b4f-ea67d85e4517","Type":"ContainerStarted","Data":"d806d9899e8fa078727408a39e9114b2d7cbb567d62907d0beaaea2425600e9f"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.939883 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" event={"ID":"6684358b-d7a6-4396-9b4f-ea67d85e4517","Type":"ContainerStarted","Data":"de08a4b22951aecb57c029d3a74e638dc3b7212560569cf21e54820113aad20f"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940014 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"edea31e1fa600a4ae379a373cf8ee62b0384aee39c0285d544acdd5941a71cf8"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940046 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerDied","Data":"cc16e00745629c42a3771375fb21884cb52774d57fe0324c10a8680dd9a3742b"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940092 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerDied","Data":"b8a9c51a67f38c6ea4afbc1a4b2e8c17d0b815c4b55531281069c19c0fd8cfa9"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940125 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" event={"ID":"810c363b-a4c7-428d-a2fb-285adc29f477","Type":"ContainerStarted","Data":"0a8ac4004225e98679de5f00828ef4b72b059bfd913e3c0b107c1aef5ccb1667"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940155 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7cr8g" event={"ID":"651c0fad-1577-4a7f-8718-ec2fd2f06c3e","Type":"ContainerStarted","Data":"2faf8b075190630ed17989d83d75b4c309209b6e0d5c61ff8a357ef81ff71f02"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940187 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7cr8g" event={"ID":"651c0fad-1577-4a7f-8718-ec2fd2f06c3e","Type":"ContainerStarted","Data":"28f1828f69f1c4d0d02aa7e938a400a27e1212d35ae3b194af92227cb1a24b54"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940289 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" 
event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerStarted","Data":"8070413a1606f0293af50b080ba5194c2bb89b5ae8414595ea0e41476a830534"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940325 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerDied","Data":"ebceb6eb636a1f740136f2a1db4a9178448d55ff6db47b35ebd00354ae58e8f7"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940373 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerDied","Data":"95541d3029d5588838c47cb8939ee7fe2e3c3f04da641f8f8e31b33c2e5cfb73"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940399 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvkjf" event={"ID":"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae","Type":"ContainerStarted","Data":"3a3d3c261f26f43a69f444c9052f030961843ae0bbf89248b5cd01b597da7064"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940429 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerStarted","Data":"34e81e1de548a0f7a7581ef26a98220c98e277cec852a516b3aef35984f983d0"} Dec 04 22:19:37.940540 master-0 kubenswrapper[33572]: I1204 22:19:37.940458 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerDied","Data":"f11072c38e40de60dafeffc2c5ef9e1780820ca0ce672700aba155fa414fe72c"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940487 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerStarted","Data":"0744bc69885cb8b27025aac2761602ed1dd53a9e628a283b5e3ef1e171da58fa"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940663 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" event={"ID":"813f3ee7-35b5-4ee8-b453-00d16d910eae","Type":"ContainerStarted","Data":"4c52307b147fc1f96631f9272147cbdbb3ffe8d871369692fc386dc96586c86f"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940696 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerStarted","Data":"f4890763f3394b36e511904c8ed5db27be23eefd277f0bd8a125d2e665ac4c24"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940718 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerDied","Data":"c2ad6d2719e3800fef2a35a9686c68acbf17ddb950d85a4469689ef746cce44d"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940737 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" 
event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerDied","Data":"0323783c48e18783d0f18adc0e52bb623413c80d32bdfc761472fc94945f10bc"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940760 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zt44t" event={"ID":"ce6002bb-4948-45ab-bb1d-ed65e86b6466","Type":"ContainerStarted","Data":"60aac9ad737c32ff74467da7a19ec918b06f0d3f5c0137f4d12c177366392be7"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940778 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerStarted","Data":"63e7406a61d2d8793cce7442c337061b250815115e1952c3a631583a81662033"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940803 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerStarted","Data":"546a138e440107c12bdd5a4d067e8d3169d68e83def0740c6c5856f29b09acfd"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940816 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerDied","Data":"1326348e3e2d8bc1da0a2a933c011acb7d00a92e48a2890745437b6f15960271"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940831 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" event={"ID":"55c4f1e1-1b78-45ec-915d-8055ab3e2786","Type":"ContainerStarted","Data":"7e789ca56ff169f6e94ee218684222ae21d64421f7ece2b51fa33799d9ab1ccc"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940843 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerStarted","Data":"41a26220612508dc13a06873ec1bc278ee38a1762a54d12aed432fef2bd1f57f"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940855 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerDied","Data":"255af5caa519126130f0822a951a744700ae3fbbe4597a788a35633eb402cf2a"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940869 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" event={"ID":"ceb419e4-d804-4111-b8d8-8436cc2ee617","Type":"ContainerStarted","Data":"969ff9f891439021550438d6b301fbd4182d12700047c33403f3351c6773134a"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940887 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" event={"ID":"e37d318a-5bf8-46ed-b6de-494102738da7","Type":"ContainerStarted","Data":"b3dfadb55c93406611410f9ae78bf9ce21b1ad64df79b0b3e8022aaceefccc9f"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940901 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" event={"ID":"e37d318a-5bf8-46ed-b6de-494102738da7","Type":"ContainerDied","Data":"79e1b635bb095edbd66388094922c6134f767f1a2efc7b3eca6e45abd8f571c6"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940915 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" event={"ID":"e37d318a-5bf8-46ed-b6de-494102738da7","Type":"ContainerStarted","Data":"ea7c4bd82fb1342059c82a627bee548e2c08bf5d38caa7c1c50de763eb8e9db2"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940930 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" event={"ID":"a8636bd7-fa9e-44b9-82df-9d37b398736d","Type":"ContainerStarted","Data":"ecb995ffd687db0f9c53116cb470abf630b360b0cf85e7d21f6f2cc7513d1f11"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940943 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" event={"ID":"a8636bd7-fa9e-44b9-82df-9d37b398736d","Type":"ContainerDied","Data":"74597696ddcec56d10e68ca1d29cef5d1bfa40762646f7e9dc8729a8c66636fd"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940957 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" event={"ID":"a8636bd7-fa9e-44b9-82df-9d37b398736d","Type":"ContainerStarted","Data":"cf7bfab376e6cd33db16a88818ab6aaf3d6cee23a9706c34952502952f7ad2f6"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940972 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"6e011b0a-89e2-47e3-9112-d46a828416b1","Type":"ContainerDied","Data":"81f5bc53e7bd37d1c3167c411f68ef8d2e2f1eae21a167bd8c740d425e144c3a"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.940987 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"6e011b0a-89e2-47e3-9112-d46a828416b1","Type":"ContainerDied","Data":"fcbb33183ef82fa1ce0f5d881f45fbf26ffc5cbcffaeaf0d3d41a2fb848e78fa"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941000 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcbb33183ef82fa1ce0f5d881f45fbf26ffc5cbcffaeaf0d3d41a2fb848e78fa" Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941011 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"f07a7e680c95366ba4f1d333748a37d99d6fcd0c588c7658749adf9e44cb7229"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941025 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"aab39ce7c056462df6f1a5933a3a5e925b99a0bd484dd0b16b296ab5327006ba"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941036 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"51dfa6423a699c653fb4188616f00305edb215a14ee4fd1dcde5706013f4ee8d"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941047 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"669f49b80171e40aea73e838597bed75920e67751d5f839f6934dbce1fedc710"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941062 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"4f2b4d655e0bf0ff49528fd7206e3f4bbadf9c2ae53c24cb531027c7c4811ac5"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941080 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerStarted","Data":"6ac117d3f888173d5f0f8aae01fddab59b22a163a9082dce4aa284a60ea267f0"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941093 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerDied","Data":"5f1c65cf31bac9c169d2527d0952f6b2fb651f148801aa43c79ceb4a8adb4da6"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941110 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerDied","Data":"5b29db78fe5a1942ea20ecc7d711d841b8eb39751995722550ca54e6750f1a0c"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941205 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerStarted","Data":"ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941231 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"dbe54b09-0399-4fbe-9f84-dd9dede0ab96","Type":"ContainerDied","Data":"5a4d99a6b7149fd4133c1e3efcfd35582ffcb1582acaa62e903eb008119e1624"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941252 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"dbe54b09-0399-4fbe-9f84-dd9dede0ab96","Type":"ContainerDied","Data":"d377be8d0441b589958c5adc3aad9974e2610bf718707f9842352d0cb595d25f"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941267 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d377be8d0441b589958c5adc3aad9974e2610bf718707f9842352d0cb595d25f" Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941283 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" event={"ID":"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b","Type":"ContainerStarted","Data":"6528b925137b080482c3192c669cf9e80d961a303d43da94e9d385cdc0ad11de"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941305 33572 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" event={"ID":"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b","Type":"ContainerStarted","Data":"58d0dda8eccec66389fe599f3b47f626740432dcc607d4fd26c725e332dfe13e"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941324 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" event={"ID":"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b","Type":"ContainerStarted","Data":"7d740441407a329d552a22d88957f884c55899842a2703505cfb149663a1e6ff"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941350 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" event={"ID":"56f25fad-089d-4df6-abb1-10d4c76750f1","Type":"ContainerStarted","Data":"c9ce59505b093a4eba51c54c1e5c9ce08ff10211501d1a1158af9490fff34501"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941370 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" event={"ID":"56f25fad-089d-4df6-abb1-10d4c76750f1","Type":"ContainerDied","Data":"d8a2de466dc95e948ba536210f040992057ba7bc222a8102fb88249ab34f040a"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941389 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" event={"ID":"56f25fad-089d-4df6-abb1-10d4c76750f1","Type":"ContainerStarted","Data":"53415730f490fe20266a28cc0d158a34109d64529c7a77302dc65c26e1712dde"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941408 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6mgn6" event={"ID":"c2279404-fa75-4de2-a302-d7b15ead5232","Type":"ContainerStarted","Data":"5b4129a1c8cb6bbffa14cdb3068fee5202673442596555d1d242e01740dd7ca6"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941431 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6mgn6" event={"ID":"c2279404-fa75-4de2-a302-d7b15ead5232","Type":"ContainerStarted","Data":"21f00834ca375d484385e21696e2d8aa4483b916d5393969f0246cfd9dc0471a"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941449 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" event={"ID":"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5","Type":"ContainerStarted","Data":"8633710bdfee80e425263a29ac50ccbbe837adb739d1f5f7dd89215987ff9bbf"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941472 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" event={"ID":"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5","Type":"ContainerStarted","Data":"b9ac4ee53782e9fd4b340ed2b43fd3025db3cb82bd0881252f116248836951ce"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941489 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerStarted","Data":"a2b5c7a83b5284024cd6037f9bc3ab61da8f64f7c1155c2976623ada6236de54"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941531 33572 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerDied","Data":"893b1713daa62af184411a3e5a2cfc2bd5735ca25c31751314839bd533678913"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941552 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" event={"ID":"f1534e25-7add-46a1-8f4e-0065c232aa4e","Type":"ContainerStarted","Data":"264531cb97973b0deb400a67899ce39a8e7e6bd105e2fd0acd10b7958dc4add3"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941570 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-mxfnl" event={"ID":"9160fec1-743a-470e-b48f-95a7ddf1c0b2","Type":"ContainerDied","Data":"719d3f66cbdb2170aefa60d42b234f7eb81fd7d5f45e585cd2b86f0e36930c80"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941589 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-mxfnl" event={"ID":"9160fec1-743a-470e-b48f-95a7ddf1c0b2","Type":"ContainerDied","Data":"102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941604 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102176ffeb29c005b794a7107abe8379110f11cc84e8df32b1ece06e515aee64" Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941626 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5qlk" event={"ID":"0a726f44-a509-46b3-a6d5-70afe3b55e9f","Type":"ContainerStarted","Data":"545625b842711fdda5eaa303742dcd8f82ccc1ed17d0148b2d986d425a02efdb"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941648 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5qlk" event={"ID":"0a726f44-a509-46b3-a6d5-70afe3b55e9f","Type":"ContainerStarted","Data":"36fa51be80da1679d1224935a888d5f59bb5a385358c5ced2fca2235368c4bfe"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941664 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5qlk" event={"ID":"0a726f44-a509-46b3-a6d5-70afe3b55e9f","Type":"ContainerDied","Data":"b1d3f0ea9fb633db12f795b3c197259244e72196814e421d282a1fe412cb79f2"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941685 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5qlk" event={"ID":"0a726f44-a509-46b3-a6d5-70afe3b55e9f","Type":"ContainerStarted","Data":"0c849ebda1ef05c2e7568afd8bbf5411d8e51e42f17fd972708d247af11d0983"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941707 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerStarted","Data":"ade905f65dc817c49631ae16c039b2f7a28b57152bfbb968cd152562a26b9a76"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941848 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" 
event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerDied","Data":"206992e5a976be25c0ca246941e52ef047963087cb7fb3d7fae48784f22c1968"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941937 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerDied","Data":"f11190eeabf32ca439cc6dbf2e5f945ac6892b6b5bf3d933639699117a6a4cbd"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941951 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerDied","Data":"8bd59644ccf9cb7c047ca7a95b61cb37f033530818fb51a36548a6089157cac2"} Dec 04 22:19:37.942346 master-0 kubenswrapper[33572]: I1204 22:19:37.941963 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" event={"ID":"465637a4-42be-4a65-a859-7af699960138","Type":"ContainerStarted","Data":"ab0050370c98df57df4580a564eddd250b5f7184edab5e925b32343ceb83d58b"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.942607 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.941977 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerStarted","Data":"c8721bb45eb3cc953ffc024aeb4fb1727d1b4358f45d475db67be3cd695e49da"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943660 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerDied","Data":"818c6aaeff094c3713e384dec4d55a28c1f228a4e98ba130afd94743d45d288f"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943674 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerStarted","Data":"6944a3b8b194a602e8e8da3a2a8db2470b7b88d403a9cce77f1224cd0d653cf1"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943684 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" event={"ID":"fb0274dc-fac1-41f9-b3e5-77253d851fdf","Type":"ContainerStarted","Data":"5af5cfe128eaa351f012440567883f2b0f5ad3e1b0e50ea2b67166561450dd28"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943719 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" event={"ID":"72679051-6a4b-4991-85c4-e5d2cbbc6ed7","Type":"ContainerStarted","Data":"8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943731 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" event={"ID":"72679051-6a4b-4991-85c4-e5d2cbbc6ed7","Type":"ContainerStarted","Data":"8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe"} Dec 04 22:19:37.945183 
master-0 kubenswrapper[33572]: I1204 22:19:37.943745 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab","Type":"ContainerDied","Data":"4c4fa6995a939a53e102917b86fbd0f10791e85887df9e375f44a27329f6b171"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943765 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab","Type":"ContainerDied","Data":"93283789117b0576a3fa7e5b0b96c59c1e9ecb75d010ca6d14ffe858c88069c2"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943775 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93283789117b0576a3fa7e5b0b96c59c1e9ecb75d010ca6d14ffe858c88069c2" Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943788 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" event={"ID":"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e","Type":"ContainerStarted","Data":"62384a8d67b4be58bf2e7f4ec5f4cd98b0f7dbf3c8121990f105d429db9c0a66"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943799 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" event={"ID":"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e","Type":"ContainerStarted","Data":"c66dd95c0f8de04a122a0868d06a2262d196a460f011d7a83f8a847ec862a494"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943811 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" event={"ID":"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e","Type":"ContainerStarted","Data":"650ca7f20c2d1cb1f57ba5643ad53b21f17eea7d93316d18d3c9ccbd27770c35"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943822 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jn88h" event={"ID":"fbb8e73f-7e50-451b-b400-e88a86b51e09","Type":"ContainerStarted","Data":"bfa4071d7b4f3516f069aeaba27743542e3344c32b39bb74e634ff273d539b31"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943835 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jn88h" event={"ID":"fbb8e73f-7e50-451b-b400-e88a86b51e09","Type":"ContainerStarted","Data":"93503d425e63d7ea9a2401b9ba117e4c428dd61716b8edbd280c970e0f14741d"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943852 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" event={"ID":"bda1cb0d-26cf-4b94-b359-432492112888","Type":"ContainerStarted","Data":"97e7fad06874576807015929933db6e964b960f7f73a618318b8ef08df129459"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943868 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" event={"ID":"bda1cb0d-26cf-4b94-b359-432492112888","Type":"ContainerStarted","Data":"773c5e8477795f70534d191a1a57bd8d7b1ab26d3d0db825f97bcaae1e3dd144"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943879 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" 
event={"ID":"0173b8a7-07b4-407a-80b6-d86754072fd8","Type":"ContainerStarted","Data":"fef3967d923683ebb718e4e4b14da4f44280d20b91b49309470c2a559a417975"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943891 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" event={"ID":"0173b8a7-07b4-407a-80b6-d86754072fd8","Type":"ContainerStarted","Data":"fa56f8a9b20a66ea35b68dc55286c46ebdc98bcc65664051a0ce154f588cd501"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943901 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" event={"ID":"0173b8a7-07b4-407a-80b6-d86754072fd8","Type":"ContainerStarted","Data":"99da9d5b3d27d57501f5191969d7c3ca653c3d4bf3252f476bdc359e5ff9e271"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943912 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" event={"ID":"da6da420-9631-4bce-b238-96ab361e23e9","Type":"ContainerDied","Data":"d281c21cd4e6a5bdaa904d1cec96c25970563004aa9943074804eafd85bd5f1e"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943929 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x" event={"ID":"da6da420-9631-4bce-b238-96ab361e23e9","Type":"ContainerDied","Data":"27e285c843ede4a182c19163bdbdbd9ef215a975e244f28e8e13fb302c8fae17"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943938 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27e285c843ede4a182c19163bdbdbd9ef215a975e244f28e8e13fb302c8fae17" Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943949 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" event={"ID":"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76","Type":"ContainerStarted","Data":"b0b856c01858e3a541b23da67afa5b732b7e863db4e3256d48d200dfe4e813a1"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943960 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" event={"ID":"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76","Type":"ContainerStarted","Data":"e45c3db8e761eb5b44659f9feeda0856ca624c4d5c1890015c38703f5a40670b"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943972 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" event={"ID":"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76","Type":"ContainerStarted","Data":"f52cf2210341d466000caf1944f36a1a658725324f859c977eae23b9f624b896"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943982 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" event={"ID":"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76","Type":"ContainerStarted","Data":"857d5516010228f43de819bef01594930c5dd807f2fd4da2fb1509f797fa1774"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.943992 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerStarted","Data":"25be24c776edf99f501f87f528c64d0bdb9dfd3345a31d68783da8815130b293"} Dec 04 
22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944005 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerDied","Data":"e77c322db09ee028391834636928860ad589dd50d5763a9eb98bf7d157a2104d"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944019 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerDied","Data":"54a32de727a29737d3f9e1ca99dbe42daef248c481ccfc250f9a1754750f20c0"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944028 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sdrkm" event={"ID":"ae107ad4-104c-4264-9844-afb3af28b19e","Type":"ContainerStarted","Data":"c9c2e653f3b9114eb61aa0f9377a8f57b59a4f433c04f9168d0a7788bc429f4a"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944039 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" event={"ID":"e065179e-634a-4cbe-bb59-5b01c514e4de","Type":"ContainerStarted","Data":"2a99a3b20bc07c50baf33232b49049d3fc9873a89ffad171bcaa3c8be2482524"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944050 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" event={"ID":"e065179e-634a-4cbe-bb59-5b01c514e4de","Type":"ContainerDied","Data":"aec30a53010adc6ee6176e40e860c2639cbdf974b27b2d24e1d71f75f8a5c427"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944071 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" event={"ID":"e065179e-634a-4cbe-bb59-5b01c514e4de","Type":"ContainerStarted","Data":"faca5825f225200d539012cd15637c81be7952566db3c330569664dfc0412aa0"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944086 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"464365c5b542ffa135ae1b7b53dc0c8855618211c6ebb6a47be42bdf1f3e9e4e"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944102 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerDied","Data":"45f93308614301d84cb0176bf1dabc3de4bbaffff580b91a3cb5db6707e27be7"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944113 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"4187b1d7d08b53bf4814dc4dfbd0a6ff2e8881049bae7dd2ea8c02223e861224"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944126 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" 
event={"ID":"a3899a38-39b8-4b48-81e5-4d8854ecc8ab","Type":"ContainerStarted","Data":"331b7dcdf8e2d87b8a376fb7581bf5553e5209af627a95ddb5944fe5be12cb6d"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944136 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerDied","Data":"35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944149 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"7d50a2402c9a263e61e9e85f0a1f6b2e94c325a730cd34ce03c35d10609073b3"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944158 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"479597d06e399852cde3f4983981240e3b9a935772d2dd22d716d20e734ab158"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944172 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerStarted","Data":"8e8fdeb690fdc94cc15bed644ba01ada6bb64532ff187c42fd17e48621eea529"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944184 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerStarted","Data":"b50d3dde2385d41664f1a848281a1446b281ce5db83aba57c400f3d223be8bb9"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944193 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerDied","Data":"989d170e7a52318dc012c9f9d9615ad932d427b1e6a54fccb0b6d83f4ebb7d45"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944205 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" event={"ID":"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee","Type":"ContainerStarted","Data":"8f9c23aa8f546cd0849cf585c0cd6540010999bcb6db0df49677dfec81935af9"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944216 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-7vlpp" event={"ID":"2d142201-6e77-4828-b86b-05d4144a2f08","Type":"ContainerStarted","Data":"c86201b566b2834b19c08527807fc66ebfecefd94445a119b31c2c29928e06b2"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944227 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-7vlpp" event={"ID":"2d142201-6e77-4828-b86b-05d4144a2f08","Type":"ContainerDied","Data":"1129d1c5176ef3c828bda41dc553996cb75881e0c9229783b32fa908eaa25ec0"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944241 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-7vlpp" 
event={"ID":"2d142201-6e77-4828-b86b-05d4144a2f08","Type":"ContainerStarted","Data":"1ed0b431491d7769d0a806c2775a07d37b29bbfb434d8d9d3536f46e64b03c26"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944252 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" event={"ID":"e7a7f632-2442-4837-b068-c22b03c71fb0","Type":"ContainerStarted","Data":"a27f538b778d0ea81aed785fdee98c12d30d1580e7e14efd70e9fce3d624a217"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944264 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" event={"ID":"e7a7f632-2442-4837-b068-c22b03c71fb0","Type":"ContainerStarted","Data":"4d378b74b84c73d0247bb3e1b1ed1084c6d9b778481316b2ed8d047f73fdee53"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944274 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerStarted","Data":"a0dd18e9d53c04bd711b1f1f59edffb36005d2ce1832890d22002f1e0075036b"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944285 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerDied","Data":"0e28b0cb43f14ec5571ccac1eddb3ba4fa32d2d2a461323c61e5fc655cd04d29"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944298 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" event={"ID":"24648a41-875f-4e98-8b21-3bdd38dffa32","Type":"ContainerStarted","Data":"aa68f9d56263db1347f5b685dc0ebf74ef5421224e859f78ff8a3e6563e1d376"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944308 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"a43528b1d37a8cf8c729e8c6970f63fadaafd2a2a5d053a62faf1bb166b17472"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944325 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"21c5db260fc9c003b5979ff05e774d8d4b66aafbdc7ee070b9faa8bb51459bea"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944336 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"b1f6fb04eb6c1c9d551e263a6b6af6d08c8b7f2c8d5ec4566af25c8704b19d39"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944346 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" 
event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerDied","Data":"d4cd669f8e4bd3008a9642035eec139a13bd3a586fb003b3adb4d948956c28f6"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944356 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerDied","Data":"d7411bec11d115b15a691f3d3010646ecd1f289830d7539f76d0467f6cd83226"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944367 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" event={"ID":"74197c50-9a41-40e8-9289-c7e6afbd3737","Type":"ContainerStarted","Data":"4fb8b7eb82d3c4d48b5f1cf2512cb01480c4c7ee72d7b27dc0bc0ec05cc4c756"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944377 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerStarted","Data":"d3af1e9bd4b92fa430f21c986760b6d7883d675546cc4919c1dfaa7e778ab068"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944393 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerDied","Data":"419aa165bec1fc028d5393e03a6724568d6a2d80fb3a00accfe0a6d847f186e7"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944405 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" event={"ID":"a544105a-5bec-456a-aef6-c160943c1f67","Type":"ContainerStarted","Data":"66619c69e4b847552d57d3d3a8444e9c2e4fc1e181b2bea1f6875b0e80bcc878"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944415 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerDied","Data":"5fcdcfec6584d4b81237f292d9431a4c2eccc32a97cb24f1ad54b8dd23d476bb"} Dec 04 22:19:37.945183 master-0 kubenswrapper[33572]: I1204 22:19:37.944429 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerDied","Data":"0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0"} Dec 04 22:19:37.950616 master-0 kubenswrapper[33572]: I1204 22:19:37.949179 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 22:19:37.953664 master-0 kubenswrapper[33572]: E1204 22:19:37.953625 33572 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:37.957599 master-0 kubenswrapper[33572]: I1204 22:19:37.957477 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 22:19:37.974128 master-0 kubenswrapper[33572]: I1204 22:19:37.974086 33572 scope.go:117] "RemoveContainer" containerID="1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3" Dec 04 
22:19:37.974942 master-0 kubenswrapper[33572]: E1204 22:19:37.974886 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3\": container with ID starting with 1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3 not found: ID does not exist" containerID="1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3" Dec 04 22:19:37.975182 master-0 kubenswrapper[33572]: I1204 22:19:37.974954 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3"} err="failed to get container status \"1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3\": rpc error: code = NotFound desc = could not find container \"1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3\": container with ID starting with 1df0f24654d6d32fdd5edd7216e0459cb97e5fad05543f3707bd1dec02f585b3 not found: ID does not exist" Dec 04 22:19:37.978001 master-0 kubenswrapper[33572]: I1204 22:19:37.977784 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 22:19:37.982761 master-0 kubenswrapper[33572]: I1204 22:19:37.982725 33572 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 04 22:19:37.987096 master-0 kubenswrapper[33572]: I1204 22:19:37.986902 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.987096 master-0 kubenswrapper[33572]: I1204 22:19:37.986945 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.987096 master-0 kubenswrapper[33572]: I1204 22:19:37.986981 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d7pj\" (UniqueName: \"kubernetes.io/projected/f1534e25-7add-46a1-8f4e-0065c232aa4e-kube-api-access-4d7pj\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:19:37.987096 master-0 kubenswrapper[33572]: I1204 22:19:37.987005 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:19:37.987096 master-0 kubenswrapper[33572]: I1204 22:19:37.987027 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-trusted-ca-bundle\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:37.987444 master-0 kubenswrapper[33572]: I1204 22:19:37.987392 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.987529 master-0 kubenswrapper[33572]: I1204 22:19:37.987459 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:37.987681 master-0 kubenswrapper[33572]: I1204 22:19:37.987648 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:19:37.987739 master-0 kubenswrapper[33572]: I1204 22:19:37.987698 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:37.987739 master-0 kubenswrapper[33572]: I1204 22:19:37.987726 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pctsn\" (UniqueName: \"kubernetes.io/projected/c178afcf-b713-4c74-b22b-6169ba3123f5-kube-api-access-pctsn\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:37.987804 master-0 kubenswrapper[33572]: I1204 22:19:37.987748 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:19:37.987804 master-0 kubenswrapper[33572]: I1204 22:19:37.987774 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smtnh\" (UniqueName: \"kubernetes.io/projected/17912746-74eb-4c78-8c1b-2f66e7ce4299-kube-api-access-smtnh\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:37.987804 master-0 kubenswrapper[33572]: I1204 22:19:37.987795 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:37.987899 master-0 kubenswrapper[33572]: I1204 22:19:37.987816 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc47q\" (UniqueName: \"kubernetes.io/projected/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-kube-api-access-jc47q\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:37.987899 master-0 kubenswrapper[33572]: I1204 22:19:37.987844 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/74197c50-9a41-40e8-9289-c7e6afbd3737-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:37.987899 master-0 kubenswrapper[33572]: I1204 22:19:37.987864 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:37.988015 master-0 kubenswrapper[33572]: I1204 22:19:37.987870 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:19:37.988015 master-0 kubenswrapper[33572]: I1204 22:19:37.987952 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngkqz\" (UniqueName: \"kubernetes.io/projected/800f436c-145d-4281-8d4d-644ba2cb0ebb-kube-api-access-ngkqz\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:19:37.988015 master-0 kubenswrapper[33572]: I1204 22:19:37.987981 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2279404-fa75-4de2-a302-d7b15ead5232-hosts-file\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:19:37.988015 master-0 kubenswrapper[33572]: I1204 22:19:37.988006 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:37.988153 master-0 kubenswrapper[33572]: I1204 22:19:37.988028 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-serving-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.988153 master-0 kubenswrapper[33572]: I1204 22:19:37.988050 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:37.988153 master-0 kubenswrapper[33572]: I1204 22:19:37.988073 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-tmpfs\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:37.988153 master-0 kubenswrapper[33572]: I1204 22:19:37.988094 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:37.988153 master-0 kubenswrapper[33572]: I1204 22:19:37.988118 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:37.988153 master-0 kubenswrapper[33572]: I1204 22:19:37.988134 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/465637a4-42be-4a65-a859-7af699960138-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:19:37.988468 master-0 kubenswrapper[33572]: I1204 22:19:37.988139 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:37.988468 master-0 kubenswrapper[33572]: I1204 22:19:37.988212 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.988468 master-0 kubenswrapper[33572]: I1204 22:19:37.988239 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config\") 
pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:37.988468 master-0 kubenswrapper[33572]: I1204 22:19:37.988262 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-utilities\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:37.988468 master-0 kubenswrapper[33572]: I1204 22:19:37.988287 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: I1204 22:19:37.988748 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-utilities\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: I1204 22:19:37.988777 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mttq\" (UniqueName: \"kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: I1204 22:19:37.988790 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-tmpfs\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: I1204 22:19:37.988810 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbpk\" (UniqueName: \"kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: I1204 22:19:37.988832 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: I1204 22:19:37.988853 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-serving-cert\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: 
I1204 22:19:37.988870 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/989a73ce-3898-4f65-a437-2c7061f9375f-audit-dir\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: I1204 22:19:37.988888 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:19:37.988946 master-0 kubenswrapper[33572]: I1204 22:19:37.988906 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.988956 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-trusted-ca-bundle\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.988991 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989027 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/465637a4-42be-4a65-a859-7af699960138-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989043 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-serving-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989073 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989034 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989131 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9rxt\" (UniqueName: \"kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989158 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-config\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989191 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989260 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-systemd\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989307 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989355 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr65l\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989400 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2lwr\" (UniqueName: \"kubernetes.io/projected/5598683a-cd32-486d-8839-205829d55cc2-kube-api-access-s2lwr\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989426 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-trusted-ca-bundle\") pod \"apiserver-8db7f8d79-rlqbz\" 
(UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.989455 master-0 kubenswrapper[33572]: I1204 22:19:37.989436 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:19:37.989946 master-0 kubenswrapper[33572]: I1204 22:19:37.989534 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2ghk\" (UniqueName: \"kubernetes.io/projected/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-kube-api-access-g2ghk\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:19:37.989946 master-0 kubenswrapper[33572]: I1204 22:19:37.989564 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:37.989946 master-0 kubenswrapper[33572]: I1204 22:19:37.989600 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:37.989946 master-0 kubenswrapper[33572]: I1204 22:19:37.989635 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vpxd\" (UniqueName: \"kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:37.989946 master-0 kubenswrapper[33572]: I1204 22:19:37.989739 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gj4j\" (UniqueName: \"kubernetes.io/projected/ae107ad4-104c-4264-9844-afb3af28b19e-kube-api-access-9gj4j\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:37.989946 master-0 kubenswrapper[33572]: I1204 22:19:37.989765 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fmp4\" (UniqueName: \"kubernetes.io/projected/989a73ce-3898-4f65-a437-2c7061f9375f-kube-api-access-7fmp4\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:37.989946 master-0 kubenswrapper[33572]: I1204 22:19:37.989787 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: 
\"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:19:37.989946 master-0 kubenswrapper[33572]: I1204 22:19:37.989815 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6d8\" (UniqueName: \"kubernetes.io/projected/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026-kube-api-access-vd6d8\") pod \"csi-snapshot-controller-6b958b6f94-w7hnc\" (UID: \"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" Dec 04 22:19:37.990206 master-0 kubenswrapper[33572]: I1204 22:19:37.989965 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.990206 master-0 kubenswrapper[33572]: I1204 22:19:37.989996 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:37.990206 master-0 kubenswrapper[33572]: I1204 22:19:37.990022 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:37.990206 master-0 kubenswrapper[33572]: I1204 22:19:37.990023 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:37.990206 master-0 kubenswrapper[33572]: I1204 22:19:37.990046 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:37.990206 master-0 kubenswrapper[33572]: I1204 22:19:37.990071 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/813f3ee7-35b5-4ee8-b453-00d16d910eae-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:19:37.990206 master-0 kubenswrapper[33572]: I1204 22:19:37.990201 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990233 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9z4k\" (UniqueName: \"kubernetes.io/projected/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-api-access-b9z4k\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990271 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990295 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7d9j\" (UniqueName: \"kubernetes.io/projected/e7a7f632-2442-4837-b068-c22b03c71fb0-kube-api-access-c7d9j\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990316 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990332 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-config\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990337 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990390 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990420 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990421 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990574 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990637 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990675 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990720 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990784 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990807 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a544105a-5bec-456a-aef6-c160943c1f67-config\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990816 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a544105a-5bec-456a-aef6-c160943c1f67-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: 
\"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990830 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.990846 master-0 kubenswrapper[33572]: I1204 22:19:37.990868 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.990905 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.990942 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.990964 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f893663c-7c1e-4eda-9839-99c1c0440304-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991017 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-env-overrides\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991067 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991110 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ndk\" (UniqueName: \"kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:19:37.991649 master-0 
kubenswrapper[33572]: I1204 22:19:37.991129 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991144 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991166 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w592\" (UniqueName: \"kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991185 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991204 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgg9\" (UniqueName: \"kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991221 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991242 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991244 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/addddaac-a31a-4dbf-b78f-87225b11b463-metrics-tls\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991260 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls\") pod 
\"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991280 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwx5k\" (UniqueName: \"kubernetes.io/projected/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-kube-api-access-mwx5k\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991297 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991313 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-catalog-content\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991324 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f091088-2166-4026-9fa6-62bd83407edb-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991330 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991397 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991438 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-catalog-content\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991440 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-tuned\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991483 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991560 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991568 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-tuned\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991597 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-encryption-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.991649 master-0 kubenswrapper[33572]: I1204 22:19:37.991682 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.991687 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-binary-copy\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.991725 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-encryption-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.991729 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-client\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.991766 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-config\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.991806 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-cabundle\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.991870 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwk6f\" (UniqueName: \"kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.991992 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992018 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8636bd7-fa9e-44b9-82df-9d37b398736d-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992041 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq55v\" (UniqueName: \"kubernetes.io/projected/34cad3de-8f3f-48cd-bd39-8745fad19e65-kube-api-access-tq55v\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992214 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4z5\" (UniqueName: \"kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992268 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-kubernetes\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992305 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992346 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992364 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0beb871c-3bf1-471c-a028-746a650267bf-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992388 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2d142201-6e77-4828-b86b-05d4144a2f08-snapshots\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992432 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992579 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2d142201-6e77-4828-b86b-05d4144a2f08-snapshots\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992577 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992619 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpksd\" (UniqueName: \"kubernetes.io/projected/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-kube-api-access-gpksd\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992641 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vjb\" (UniqueName: \"kubernetes.io/projected/4d68dcb1-efe4-425f-9b28-1e5575548a32-kube-api-access-r6vjb\") pod 
\"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992667 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-modprobe-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992692 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992715 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992759 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992847 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:37.992862 master-0 kubenswrapper[33572]: I1204 22:19:37.992877 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.992907 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.992961 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " 
pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993022 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2fn\" (UniqueName: \"kubernetes.io/projected/bda1cb0d-26cf-4b94-b359-432492112888-kube-api-access-8r2fn\") pod \"network-check-source-85d8db45d4-5gbc4\" (UID: \"bda1cb0d-26cf-4b94-b359-432492112888\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993063 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993100 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993140 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-root\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993182 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993219 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8636bd7-fa9e-44b9-82df-9d37b398736d-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993261 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993303 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-stats-auth\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:37.993927 master-0 
kubenswrapper[33572]: I1204 22:19:37.993339 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-conf\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993367 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-metrics-tls\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993479 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993591 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993657 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b966c210-5415-4fa5-88ab-c85aba979b28-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-mp4qx\" (UID: \"b966c210-5415-4fa5-88ab-c85aba979b28\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:19:37.993927 master-0 kubenswrapper[33572]: I1204 22:19:37.993884 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.993954 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6b4p\" (UniqueName: \"kubernetes.io/projected/fbb8e73f-7e50-451b-b400-e88a86b51e09-kube-api-access-c6b4p\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994011 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c2d3b8-41c0-4531-b770-57b7c567fe30-config-volume\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994040 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" 
(UniqueName: \"kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994065 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c178afcf-b713-4c74-b22b-6169ba3123f5-service-ca-bundle\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994094 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690b447a-19c0-4925-bc9d-d0c86a83a377-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994126 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994190 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994281 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994348 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsk29\" (UniqueName: \"kubernetes.io/projected/967bf4ac-f025-4296-8ed9-183a345f6b7c-kube-api-access-hsk29\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994408 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-kube-api-access-4g7n9\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994479 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994587 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46229484-5fa1-4595-94a0-44477abae90e-serving-cert\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994611 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/ce6002bb-4948-45ab-bb1d-ed65e86b6466-kube-api-access-87gv4\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:37.994698 master-0 kubenswrapper[33572]: I1204 22:19:37.994673 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/fb0274dc-fac1-41f9-b3e5-77253d851fdf-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.994745 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.994885 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.994936 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.994969 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.994998 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.995021 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.995052 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jmv\" (UniqueName: \"kubernetes.io/projected/0a726f44-a509-46b3-a6d5-70afe3b55e9f-kube-api-access-k8jmv\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.995138 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:37.995240 master-0 kubenswrapper[33572]: I1204 22:19:37.995193 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r57bb\" (UniqueName: \"kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb\") pod \"csi-snapshot-controller-operator-6bc8656fdc-xhndk\" (UID: \"e37d318a-5bf8-46ed-b6de-494102738da7\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995357 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lclkg\" (UniqueName: \"kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995400 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995597 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995638 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995675 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995717 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995834 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995879 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.995969 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8636bd7-fa9e-44b9-82df-9d37b398736d-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996008 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996037 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996091 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996122 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cct\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996170 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996187 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-serving-cert\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996202 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996327 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996386 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996425 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-sys\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996472 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cfhv\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-kube-api-access-2cfhv\") pod 
\"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996494 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996619 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996726 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996900 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:37.996997 master-0 kubenswrapper[33572]: I1204 22:19:37.996957 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cblk\" (UniqueName: \"kubernetes.io/projected/2d142201-6e77-4828-b86b-05d4144a2f08-kube-api-access-6cblk\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997093 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-cni-binary-copy\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997093 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997219 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-wtmp\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " 
pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997298 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5t2f\" (UniqueName: \"kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997374 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997382 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997478 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997549 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997593 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997642 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8gh2\" (UniqueName: \"kubernetes.io/projected/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-kube-api-access-n8gh2\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997681 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997720 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997763 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997803 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997727 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24648a41-875f-4e98-8b21-3bdd38dffa32-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997904 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-node-pullsecrets\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997956 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.997975 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/810c363b-a4c7-428d-a2fb-285adc29f477-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998033 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2jq\" (UniqueName: \"kubernetes.io/projected/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-kube-api-access-pt2jq\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998080 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998091 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/810c363b-a4c7-428d-a2fb-285adc29f477-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998114 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998158 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-host\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998244 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998263 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998373 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4rft\" (UniqueName: \"kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998418 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998458 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " 
pod="openshift-multus/multus-dgpw9" Dec 04 22:19:37.998480 master-0 kubenswrapper[33572]: I1204 22:19:37.998542 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8g99\" (UniqueName: \"kubernetes.io/projected/55c4f1e1-1b78-45ec-915d-8055ab3e2786-kube-api-access-b8g99\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.998609 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.998708 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7l9n\" (UniqueName: \"kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.998767 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.998829 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.998893 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scht6\" (UniqueName: \"kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.998950 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/addddaac-a31a-4dbf-b78f-87225b11b463-trusted-ca\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.998957 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.998980 33572 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35821f48-b000-4915-847f-a739b6efc5ee-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999025 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-rootfs\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999085 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5zx\" (UniqueName: \"kubernetes.io/projected/c2279404-fa75-4de2-a302-d7b15ead5232-kube-api-access-dd5zx\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999150 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w82st\" (UniqueName: \"kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999212 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-encryption-config\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999279 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999338 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999402 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999464 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999569 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-serving-ca\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999634 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999664 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999694 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999757 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690b447a-19c0-4925-bc9d-d0c86a83a377-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999754 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999779 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46229484-5fa1-4595-94a0-44477abae90e-config\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999828 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4cdh\" (UniqueName: \"kubernetes.io/projected/0173b8a7-07b4-407a-80b6-d86754072fd8-kube-api-access-z4cdh\") pod \"migrator-74b7b57c65-nzpb5\" (UID: 
\"0173b8a7-07b4-407a-80b6-d86754072fd8\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999861 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999885 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999909 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999932 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:37.999958 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:38.000026 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:38.000126 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6s4\" (UniqueName: \"kubernetes.io/projected/5dac8e25-0f51-4c04-929c-060479689a9d-kube-api-access-ch6s4\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:38.000141 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " 
pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:38.000171 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:38.000230 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:38.000203 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:38.000262 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.000237 master-0 kubenswrapper[33572]: I1204 22:19:38.000295 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000331 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000364 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000407 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" 
Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000563 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000584 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e065179e-634a-4cbe-bb59-5b01c514e4de-config\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000644 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-metrics-certs\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000714 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit-dir\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000756 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-audit-policies\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000785 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000931 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysconfig\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000979 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.000997 33572 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e065179e-634a-4cbe-bb59-5b01c514e4de-serving-cert\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001031 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-key\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001066 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgt75\" (UniqueName: \"kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001094 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001123 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-catalog-content\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001159 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001285 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-catalog-content\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001305 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001367 33572 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001419 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001456 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001460 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-tmp\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001577 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001629 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbb8e73f-7e50-451b-b400-e88a86b51e09-tmp\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001644 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001645 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001673 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c6a5d14d-0409-4024-b0a8-200fa2594185-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " 
pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001704 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001812 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001838 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-utilities\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001889 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.001927 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae107ad4-104c-4264-9844-afb3af28b19e-utilities\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002013 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002047 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002072 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w8vs\" (UniqueName: \"kubernetes.io/projected/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-kube-api-access-2w8vs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:19:38.002483 
master-0 kubenswrapper[33572]: I1204 22:19:38.002088 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-whereabouts-configmap\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002102 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002084 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-etcd-client\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002164 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f893663c-7c1e-4eda-9839-99c1c0440304-serving-cert\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002184 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002256 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002303 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002343 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002468 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/871cb002-67f4-43aa-a41d-7a5b2f340059-metrics-tls\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:19:38.002483 master-0 kubenswrapper[33572]: I1204 22:19:38.002562 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfcv9\" (UniqueName: \"kubernetes.io/projected/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-kube-api-access-bfcv9\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.002645 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq44d\" (UniqueName: \"kubernetes.io/projected/74197c50-9a41-40e8-9289-c7e6afbd3737-kube-api-access-hq44d\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.002696 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wh6b\" (UniqueName: \"kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.002740 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.002791 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.002848 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb0274dc-fac1-41f9-b3e5-77253d851fdf-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.002945 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j8fr\" (UniqueName: \"kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.002985 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/634c1df6-de4d-4e26-8c71-d39311cae0ce-webhook-cert\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003018 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/fb0274dc-fac1-41f9-b3e5-77253d851fdf-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003042 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003121 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003191 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-utilities\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003303 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-utilities\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003341 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003426 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003432 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/0beb871c-3bf1-471c-a028-746a650267bf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003496 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003594 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003660 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003718 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003779 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-client\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003838 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003899 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003907 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-env-overrides\") pod 
\"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: \"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.003962 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxkk\" (UniqueName: \"kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004048 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004108 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4czl\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-kube-api-access-r4czl\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004126 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-etcd-client\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004166 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004244 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkg8s\" (UniqueName: \"kubernetes.io/projected/6684358b-d7a6-4396-9b4f-ea67d85e4517-kube-api-access-qkg8s\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004317 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004484 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004588 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004637 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-image-import-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004677 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jlvp\" (UniqueName: \"kubernetes.io/projected/a043ea49-97f9-4ae6-83b9-733f12754d94-kube-api-access-6jlvp\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004869 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004905 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfklr\" (UniqueName: \"kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004922 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-daemon-config\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004955 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-image-import-ca\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.004933 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " 
pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005021 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005062 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005096 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005239 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24648a41-875f-4e98-8b21-3bdd38dffa32-config\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005262 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nrj9\" (UniqueName: \"kubernetes.io/projected/810c363b-a4c7-428d-a2fb-285adc29f477-kube-api-access-2nrj9\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005312 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-run\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005351 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-lib-modules\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005392 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-var-lib-kubelet\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005419 33572 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ceb419e4-d804-4111-b8d8-8436cc2ee617-serving-cert\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005441 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005478 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005541 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005576 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-sys\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005609 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-script-lib\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005633 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:19:38.005652 master-0 kubenswrapper[33572]: I1204 22:19:38.005702 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.005778 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkrvr\" 
(UniqueName: \"kubernetes.io/projected/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-kube-api-access-xkrvr\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.005869 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8d54\" (UniqueName: \"kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006003 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f091088-2166-4026-9fa6-62bd83407edb-config\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006028 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006118 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-textfile\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006175 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006216 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-catalog-content\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006237 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-textfile\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006236 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/35821f48-b000-4915-847f-a739b6efc5ee-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006259 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006320 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006358 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce6002bb-4948-45ab-bb1d-ed65e86b6466-catalog-content\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006404 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006466 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006609 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcw8f\" (UniqueName: \"kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006681 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006698 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovnkube-config\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006745 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56f25fad-089d-4df6-abb1-10d4c76750f1-config\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006874 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-metrics-certs\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006927 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.006984 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrr5\" (UniqueName: \"kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007028 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007073 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007114 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007157 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:38.009063 
master-0 kubenswrapper[33572]: I1204 22:19:38.007109 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76fd9f44-4365-4271-8772-025655c50334-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007200 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nkh\" (UniqueName: \"kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007337 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vpbl\" (UniqueName: \"kubernetes.io/projected/a5c2d3b8-41c0-4531-b770-57b7c567fe30-kube-api-access-5vpbl\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007386 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-default-certificate\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007346 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56f25fad-089d-4df6-abb1-10d4c76750f1-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:19:38.009063 master-0 kubenswrapper[33572]: I1204 22:19:38.007431 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:38.018210 master-0 kubenswrapper[33572]: I1204 22:19:38.018148 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 22:19:38.038645 master-0 kubenswrapper[33572]: I1204 22:19:38.038142 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 22:19:38.044241 master-0 kubenswrapper[33572]: I1204 22:19:38.044178 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-ovnkube-identity-cm\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:19:38.057391 master-0 kubenswrapper[33572]: I1204 22:19:38.057324 33572 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 22:19:38.065092 master-0 kubenswrapper[33572]: I1204 22:19:38.065013 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/59d3d0d8-1a2a-4d14-8312-d33818acba88-ovn-node-metrics-cert\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.078619 master-0 kubenswrapper[33572]: I1204 22:19:38.078577 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 22:19:38.082828 master-0 kubenswrapper[33572]: I1204 22:19:38.082754 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-key\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:19:38.097767 master-0 kubenswrapper[33572]: I1204 22:19:38.097586 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 22:19:38.109040 master-0 kubenswrapper[33572]: I1204 22:19:38.108933 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.109235 master-0 kubenswrapper[33572]: I1204 22:19:38.109074 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-socket-dir-parent\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.109235 master-0 kubenswrapper[33572]: I1204 22:19:38.109103 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/74197c50-9a41-40e8-9289-c7e6afbd3737-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:38.109235 master-0 kubenswrapper[33572]: I1204 22:19:38.109174 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2279404-fa75-4de2-a302-d7b15ead5232-hosts-file\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:19:38.109235 master-0 kubenswrapper[33572]: I1204 22:19:38.109215 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/74197c50-9a41-40e8-9289-c7e6afbd3737-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109231 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" 
(UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109335 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109338 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109390 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109431 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109457 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-os-release\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109546 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109560 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.109626 master-0 kubenswrapper[33572]: I1204 22:19:38.109551 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2279404-fa75-4de2-a302-d7b15ead5232-hosts-file\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:19:38.109626 master-0 
kubenswrapper[33572]: I1204 22:19:38.109626 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.109653 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-netns\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.109695 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/989a73ce-3898-4f65-a437-2c7061f9375f-audit-dir\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.109749 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.109827 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/989a73ce-3898-4f65-a437-2c7061f9375f-audit-dir\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.109837 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.109898 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-systemd\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.109960 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.109998 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-slash\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.110253 master-0 
kubenswrapper[33572]: I1204 22:19:38.109958 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-systemd\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.110253 master-0 kubenswrapper[33572]: I1204 22:19:38.110195 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110315 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110352 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110395 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-multus\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110423 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110547 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110583 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110654 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110704 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110840 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110882 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110894 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110923 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.110990 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.111046 master-0 kubenswrapper[33572]: I1204 22:19:38.111062 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111066 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111113 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-kubernetes\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111207 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-kubernetes\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111254 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111307 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111389 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-modprobe-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111442 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111494 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111551 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-conf-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111586 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-modprobe-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111612 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-etc-kubernetes\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111616 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111678 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111687 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111741 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-k8s-cni-cncf-io\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111778 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111830 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111885 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:38.111933 master-0 kubenswrapper[33572]: I1204 22:19:38.111901 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-root\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.111963 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-system-cni-dir\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " 
pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.111977 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-root\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112058 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-conf\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112325 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-conf\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112341 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112383 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112402 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-kubelet\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112479 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112545 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112590 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112632 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112633 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112665 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112705 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112675 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-netns\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112764 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a8636bd7-fa9e-44b9-82df-9d37b398736d-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112808 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112860 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112901 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.112975 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:38.113071 master-0 kubenswrapper[33572]: I1204 22:19:38.113101 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-sys\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113209 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-wtmp\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113282 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113321 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113358 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-node-pullsecrets\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113420 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-host\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113455 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113534 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113635 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113699 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113739 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-rootfs\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113808 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113900 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113937 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.113973 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114022 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-multus-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114030 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114117 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114151 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-host\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114255 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-cni-bin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114264 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114307 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-run-multus-certs\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114334 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-netd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114390 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-hostroot\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.114383 master-0 kubenswrapper[33572]: I1204 22:19:38.114405 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-sys\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114527 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-wtmp\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114588 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114646 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-node-pullsecrets\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114687 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-cnibin\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114697 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-etc-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114769 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-run-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114780 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114819 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114827 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114871 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114879 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-rootfs\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114916 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114929 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-var-lib-openvswitch\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.114987 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76fd9f44-4365-4271-8772-025655c50334-cnibin\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115003 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit-dir\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115044 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115062 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysconfig\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115092 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-node-log\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115099 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115157 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115186 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysconfig\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115240 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115262 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115280 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115343 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-host-slash\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115362 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-etc-sysctl-d\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115407 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115456 33572 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-audit-dir\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115568 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115612 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115649 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115733 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-systemd\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115770 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115831 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115877 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-os-release\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115916 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:38.115919 master-0 kubenswrapper[33572]: I1204 22:19:38.115929 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116031 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/871cb002-67f4-43aa-a41d-7a5b2f340059-host-etc-kube\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116114 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116173 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-systemd-units\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116173 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116224 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116285 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116412 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116494 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/fb0274dc-fac1-41f9-b3e5-77253d851fdf-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: 
\"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116585 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-run\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116598 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-system-cni-dir\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116623 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-lib-modules\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116663 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-var-lib-kubelet\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116675 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-run\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116701 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116747 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-sys\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116754 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-cni-bin\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116784 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-var-lib-kubelet\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116858 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-sys\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116887 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116922 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c8c45e0-2342-499b-aa6b-339b6a722a87-host-var-lib-kubelet\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.116939 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbb8e73f-7e50-451b-b400-e88a86b51e09-lib-modules\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.117087 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.117251 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.117275 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-log-socket\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.117294 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.117579 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8nxc5\" (UID: 
\"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.117596 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/59d3d0d8-1a2a-4d14-8312-d33818acba88-run-ovn\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:38.118927 master-0 kubenswrapper[33572]: I1204 22:19:38.118796 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 22:19:38.127641 master-0 kubenswrapper[33572]: I1204 22:19:38.127582 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-iptables-alerter-script\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:19:38.138814 master-0 kubenswrapper[33572]: I1204 22:19:38.138758 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 22:19:38.142423 master-0 kubenswrapper[33572]: I1204 22:19:38.142356 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4d68dcb1-efe4-425f-9b28-1e5575548a32-signing-cabundle\") pod \"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:19:38.157368 master-0 kubenswrapper[33572]: I1204 22:19:38.157269 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 22:19:38.169532 master-0 kubenswrapper[33572]: I1204 22:19:38.169447 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:38.177994 master-0 kubenswrapper[33572]: I1204 22:19:38.177938 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 22:19:38.195950 master-0 kubenswrapper[33572]: W1204 22:19:38.195834 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda906debd0c35952850935aee2d607cce.slice/crio-06ad253a69dfd20589df7f3b17abd33c9557352365f8841b659e67f49cf21906 WatchSource:0}: Error finding container 06ad253a69dfd20589df7f3b17abd33c9557352365f8841b659e67f49cf21906: Status 404 returned error can't find the container with id 06ad253a69dfd20589df7f3b17abd33c9557352365f8841b659e67f49cf21906 Dec 04 22:19:38.197060 master-0 kubenswrapper[33572]: I1204 22:19:38.196998 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 22:19:38.199661 master-0 kubenswrapper[33572]: I1204 22:19:38.199557 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.199661 master-0 kubenswrapper[33572]: I1204 22:19:38.199659 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.200064 master-0 kubenswrapper[33572]: I1204 22:19:38.199716 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.200064 master-0 kubenswrapper[33572]: I1204 22:19:38.199786 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.204656 master-0 kubenswrapper[33572]: I1204 22:19:38.204594 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.204767 master-0 kubenswrapper[33572]: I1204 22:19:38.204710 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.212595 master-0 kubenswrapper[33572]: I1204 22:19:38.207600 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:38.212595 master-0 kubenswrapper[33572]: I1204 22:19:38.208114 33572 scope.go:117] "RemoveContainer" containerID="5fcdcfec6584d4b81237f292d9431a4c2eccc32a97cb24f1ad54b8dd23d476bb" Dec 04 22:19:38.212595 master-0 kubenswrapper[33572]: I1204 22:19:38.208249 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.212595 master-0 kubenswrapper[33572]: I1204 22:19:38.208304 33572 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.212595 master-0 kubenswrapper[33572]: I1204 22:19:38.208334 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.212595 master-0 kubenswrapper[33572]: I1204 22:19:38.208458 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
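The "SyncLoop (probe)" records in this stretch of the journal trace kubelet probe results for the static control-plane pods: kube-controller-manager, etcd, and kube-apiserver first report startup/liveness probes as "unhealthy" and readiness as empty, then flip to "ready" within the same second. A minimal sketch for pulling those transitions out of an excerpt like this one, assuming the journal text has been saved locally (the file name kubelet.log and the helper probe_transitions are placeholders, not anything taken from the log itself):

    import re

    # Matches the probe fields exactly as they appear in the kubelet records above,
    # e.g.: probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
    PROBE_RE = re.compile(r'probe="(?P<probe>\w+)" status="(?P<status>[^"]*)" pod="(?P<pod>[^"]+)"')

    def probe_transitions(text):
        """Yield (pod, probe, status) tuples in the order the records appear."""
        for m in PROBE_RE.finditer(text):
            yield m.group("pod"), m.group("probe"), m.group("status")

    if __name__ == "__main__":
        with open("kubelet.log", encoding="utf-8") as f:  # hypothetical capture of this journal
            text = f.read()
        last = {}
        for pod, probe, status in probe_transitions(text):
            if last.get((pod, probe)) != status:          # print only status changes
                print(f'{pod}: {probe} probe -> "{status}"')
                last[(pod, probe)] = status

Run against the records around this point it would show, for example, openshift-kube-controller-manager/kube-controller-manager-master-0 going from an unhealthy startup probe to a ready readiness probe, and etcd-master-0 and kube-apiserver-master-0 doing the same a few records later.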
Dec 04 22:19:38.218687 master-0 kubenswrapper[33572]: I1204 22:19:38.218175 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:38.218687 master-0 kubenswrapper[33572]: I1204 22:19:38.218389 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 22:19:38.225394 master-0 kubenswrapper[33572]: I1204 22:19:38.225287 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/634c1df6-de4d-4e26-8c71-d39311cae0ce-env-overrides\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:19:38.228181 master-0 kubenswrapper[33572]: I1204 22:19:38.228120 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 04 22:19:38.256443 master-0 kubenswrapper[33572]: I1204 22:19:38.256368 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:38.263937 master-0 kubenswrapper[33572]: I1204 22:19:38.263879 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 04 22:19:38.303186 master-0 kubenswrapper[33572]: I1204 22:19:38.301659 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 04 22:19:38.303186 master-0 kubenswrapper[33572]: I1204 22:19:38.303033 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 04 22:19:38.303563 master-0 kubenswrapper[33572]: I1204 22:19:38.303176 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a906debd0c35952850935aee2d607cce","Type":"ContainerStarted","Data":"06ad253a69dfd20589df7f3b17abd33c9557352365f8841b659e67f49cf21906"} Dec 04 22:19:38.311622 master-0 kubenswrapper[33572]: I1204 22:19:38.311546 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 04 22:19:38.318099 master-0 kubenswrapper[33572]: I1204 22:19:38.318033 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:38.320928 master-0 kubenswrapper[33572]: I1204 22:19:38.320868 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:38.329238 master-0 kubenswrapper[33572]: I1204 22:19:38.329195 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 04 22:19:38.339623 master-0 kubenswrapper[33572]: I1204 22:19:38.339584 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:38.345990 master-0 kubenswrapper[33572]: I1204 22:19:38.345954 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 04 22:19:38.355204 master-0 kubenswrapper[33572]: I1204 22:19:38.355121 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/fb0274dc-fac1-41f9-b3e5-77253d851fdf-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.357294 master-0 kubenswrapper[33572]: I1204 22:19:38.357270 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 04 22:19:38.363547 master-0 kubenswrapper[33572]: I1204 22:19:38.363474 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:38.377522 master-0 kubenswrapper[33572]: I1204 22:19:38.377415 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 04 22:19:38.396905 master-0 kubenswrapper[33572]: I1204 22:19:38.396865 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 22:19:38.400442 master-0 kubenswrapper[33572]: I1204 22:19:38.400380 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-serving-ca\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.417335 master-0 kubenswrapper[33572]: I1204 22:19:38.417277 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 22:19:38.422821 master-0 kubenswrapper[33572]: I1204 22:19:38.422781 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-etcd-client\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.430326 master-0 kubenswrapper[33572]: I1204 22:19:38.430277 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") pod \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " Dec 04 22:19:38.430556 master-0 kubenswrapper[33572]: I1204 22:19:38.430486 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") pod \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " Dec 04 22:19:38.430672 master-0 kubenswrapper[33572]: I1204 22:19:38.430563 33572 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dbe54b09-0399-4fbe-9f84-dd9dede0ab96" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:19:38.430672 master-0 kubenswrapper[33572]: I1204 22:19:38.430563 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock" (OuterVolumeSpecName: "var-lock") pod "dbe54b09-0399-4fbe-9f84-dd9dede0ab96" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:19:38.433659 master-0 kubenswrapper[33572]: I1204 22:19:38.433568 33572 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:19:38.433659 master-0 kubenswrapper[33572]: I1204 22:19:38.433648 33572 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:19:38.437664 master-0 kubenswrapper[33572]: I1204 22:19:38.437605 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 22:19:38.440790 master-0 kubenswrapper[33572]: I1204 22:19:38.440654 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-serving-cert\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.457922 master-0 kubenswrapper[33572]: I1204 22:19:38.457862 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 22:19:38.459906 master-0 kubenswrapper[33572]: I1204 22:19:38.459840 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/989a73ce-3898-4f65-a437-2c7061f9375f-encryption-config\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.478740 master-0 kubenswrapper[33572]: I1204 22:19:38.478697 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 22:19:38.486928 master-0 kubenswrapper[33572]: I1204 22:19:38.486881 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5c2d3b8-41c0-4531-b770-57b7c567fe30-config-volume\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:19:38.498258 master-0 kubenswrapper[33572]: I1204 22:19:38.497990 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 22:19:38.505161 master-0 kubenswrapper[33572]: I1204 22:19:38.505048 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5c2d3b8-41c0-4531-b770-57b7c567fe30-metrics-tls\") pod \"dns-default-vvs9c\" (UID: 
\"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:19:38.518559 master-0 kubenswrapper[33572]: I1204 22:19:38.518462 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 22:19:38.522000 master-0 kubenswrapper[33572]: I1204 22:19:38.521927 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-audit-policies\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.536912 master-0 kubenswrapper[33572]: I1204 22:19:38.536849 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 04 22:19:38.537134 master-0 kubenswrapper[33572]: I1204 22:19:38.537026 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 22:19:38.557969 master-0 kubenswrapper[33572]: I1204 22:19:38.557889 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 22:19:38.578528 master-0 kubenswrapper[33572]: I1204 22:19:38.578459 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 22:19:38.588070 master-0 kubenswrapper[33572]: I1204 22:19:38.588011 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/989a73ce-3898-4f65-a437-2c7061f9375f-trusted-ca-bundle\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:38.598113 master-0 kubenswrapper[33572]: I1204 22:19:38.598072 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 22:19:38.619580 master-0 kubenswrapper[33572]: I1204 22:19:38.617608 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 22:19:38.637749 master-0 kubenswrapper[33572]: I1204 22:19:38.637665 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 22:19:38.643703 master-0 kubenswrapper[33572]: I1204 22:19:38.643616 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8636bd7-fa9e-44b9-82df-9d37b398736d-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:38.657572 master-0 kubenswrapper[33572]: I1204 22:19:38.657477 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 22:19:38.678196 master-0 kubenswrapper[33572]: I1204 22:19:38.678103 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 22:19:38.684940 master-0 kubenswrapper[33572]: I1204 22:19:38.684843 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8636bd7-fa9e-44b9-82df-9d37b398736d-service-ca\") pod 
\"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:38.697877 master-0 kubenswrapper[33572]: I1204 22:19:38.697793 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 04 22:19:38.705029 master-0 kubenswrapper[33572]: I1204 22:19:38.704971 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b966c210-5415-4fa5-88ab-c85aba979b28-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-mp4qx\" (UID: \"b966c210-5415-4fa5-88ab-c85aba979b28\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:19:38.717663 master-0 kubenswrapper[33572]: I1204 22:19:38.717605 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 22:19:38.721272 master-0 kubenswrapper[33572]: I1204 22:19:38.721199 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-metrics-certs\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:38.738278 master-0 kubenswrapper[33572]: I1204 22:19:38.738130 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 22:19:38.744021 master-0 kubenswrapper[33572]: I1204 22:19:38.743960 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-stats-auth\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:38.757483 master-0 kubenswrapper[33572]: I1204 22:19:38.757381 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 22:19:38.765834 master-0 kubenswrapper[33572]: I1204 22:19:38.765776 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c178afcf-b713-4c74-b22b-6169ba3123f5-service-ca-bundle\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:38.778375 master-0 kubenswrapper[33572]: I1204 22:19:38.778299 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 22:19:38.796904 master-0 kubenswrapper[33572]: I1204 22:19:38.796826 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 22:19:38.800567 master-0 kubenswrapper[33572]: I1204 22:19:38.800486 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:38.818155 master-0 kubenswrapper[33572]: I1204 22:19:38.817607 33572 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 22:19:38.823004 master-0 kubenswrapper[33572]: I1204 22:19:38.822950 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:38.840639 master-0 kubenswrapper[33572]: I1204 22:19:38.840582 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 22:19:38.849663 master-0 kubenswrapper[33572]: I1204 22:19:38.849447 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c178afcf-b713-4c74-b22b-6169ba3123f5-default-certificate\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:38.877983 master-0 kubenswrapper[33572]: I1204 22:19:38.877810 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 22:19:38.878870 master-0 kubenswrapper[33572]: I1204 22:19:38.878825 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 22:19:38.886600 master-0 kubenswrapper[33572]: I1204 22:19:38.886562 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:38.897928 master-0 kubenswrapper[33572]: I1204 22:19:38.897879 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 22:19:38.905635 master-0 kubenswrapper[33572]: I1204 22:19:38.905590 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:38.916186 master-0 kubenswrapper[33572]: I1204 22:19:38.916155 33572 request.go:700] Waited for 1.010099846s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 04 22:19:38.918655 master-0 kubenswrapper[33572]: I1204 22:19:38.918608 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 22:19:38.940539 master-0 kubenswrapper[33572]: I1204 22:19:38.938313 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 22:19:38.958091 master-0 kubenswrapper[33572]: I1204 22:19:38.958023 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 22:19:38.977599 master-0 kubenswrapper[33572]: I1204 
22:19:38.977554 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 22:19:38.986605 master-0 kubenswrapper[33572]: I1204 22:19:38.986546 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:38.987439 master-0 kubenswrapper[33572]: E1204 22:19:38.987377 33572 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.987601 master-0 kubenswrapper[33572]: E1204 22:19:38.987570 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert podName:800f436c-145d-4281-8d4d-644ba2cb0ebb nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.487539525 +0000 UTC m=+43.215065204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-698c598cfc-lgmqn" (UID: "800f436c-145d-4281-8d4d-644ba2cb0ebb") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.987859 master-0 kubenswrapper[33572]: E1204 22:19:38.987746 33572 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.987943 master-0 kubenswrapper[33572]: E1204 22:19:38.987912 33572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.988040 master-0 kubenswrapper[33572]: E1204 22:19:38.988000 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls podName:a3899a38-39b8-4b48-81e5-4d8854ecc8ab nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.487952467 +0000 UTC m=+43.215478156 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-78f758c7b9-44srj" (UID: "a3899a38-39b8-4b48-81e5-4d8854ecc8ab") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.988134 master-0 kubenswrapper[33572]: E1204 22:19:38.988058 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls podName:a7b2270b-2afc-4bf5-ae1a-5ccf9814657b nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.488041249 +0000 UTC m=+43.215566948 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls") pod "cluster-samples-operator-797cfd8b47-j469d" (UID: "a7b2270b-2afc-4bf5-ae1a-5ccf9814657b") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.988747 master-0 kubenswrapper[33572]: E1204 22:19:38.988687 33572 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.988850 master-0 kubenswrapper[33572]: E1204 22:19:38.988769 33572 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.988850 master-0 kubenswrapper[33572]: E1204 22:19:38.988773 33572 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.988850 master-0 kubenswrapper[33572]: E1204 22:19:38.988827 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle podName:2d142201-6e77-4828-b86b-05d4144a2f08 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.4887956 +0000 UTC m=+43.216321279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle") pod "insights-operator-55965856b6-7vlpp" (UID: "2d142201-6e77-4828-b86b-05d4144a2f08") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.989207 master-0 kubenswrapper[33572]: E1204 22:19:38.988849 33572 configmap.go:193] Couldn't get configMap openshift-machine-api/baremetal-kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.989207 master-0 kubenswrapper[33572]: E1204 22:19:38.989041 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config podName:55c4f1e1-1b78-45ec-915d-8055ab3e2786 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.488846141 +0000 UTC m=+43.216371820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config") pod "machine-config-operator-dc5d7666f-d7mvx" (UID: "55c4f1e1-1b78-45ec-915d-8055ab3e2786") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.989207 master-0 kubenswrapper[33572]: E1204 22:19:38.989181 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls podName:55c4f1e1-1b78-45ec-915d-8055ab3e2786 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.4891551 +0000 UTC m=+43.216680969 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls") pod "machine-config-operator-dc5d7666f-d7mvx" (UID: "55c4f1e1-1b78-45ec-915d-8055ab3e2786") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.989207 master-0 kubenswrapper[33572]: E1204 22:19:38.989210 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config podName:a3899a38-39b8-4b48-81e5-4d8854ecc8ab nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.489200931 +0000 UTC m=+43.216726900 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config") pod "cluster-baremetal-operator-78f758c7b9-44srj" (UID: "a3899a38-39b8-4b48-81e5-4d8854ecc8ab") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.989433 master-0 kubenswrapper[33572]: E1204 22:19:38.989246 33572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.989433 master-0 kubenswrapper[33572]: E1204 22:19:38.989278 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.989433 master-0 kubenswrapper[33572]: E1204 22:19:38.989289 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert podName:651c0fad-1577-4a7f-8718-ec2fd2f06c3e nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.489279074 +0000 UTC m=+43.216804983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert") pod "ingress-canary-7cr8g" (UID: "651c0fad-1577-4a7f-8718-ec2fd2f06c3e") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.989433 master-0 kubenswrapper[33572]: E1204 22:19:38.989408 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap podName:8d84a7d3-46d1-48e3-83f3-f6b32f16cc76 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.489382847 +0000 UTC m=+43.216908726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-5857974f64-qqxk9" (UID: "8d84a7d3-46d1-48e3-83f3-f6b32f16cc76") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.990261 master-0 kubenswrapper[33572]: E1204 22:19:38.990220 33572 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.990347 master-0 kubenswrapper[33572]: E1204 22:19:38.990299 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle podName:2d142201-6e77-4828-b86b-05d4144a2f08 nodeName:}" failed. 
No retries permitted until 2025-12-04 22:19:39.490281102 +0000 UTC m=+43.217806791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle") pod "insights-operator-55965856b6-7vlpp" (UID: "2d142201-6e77-4828-b86b-05d4144a2f08") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.990347 master-0 kubenswrapper[33572]: E1204 22:19:38.990321 33572 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.990484 master-0 kubenswrapper[33572]: E1204 22:19:38.990387 33572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.990484 master-0 kubenswrapper[33572]: E1204 22:19:38.990412 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs podName:eb4d8477-c3b5-4e88-aaa9-222ad56d974c nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.490386905 +0000 UTC m=+43.217912594 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs") pod "machine-config-server-wmm89" (UID: "eb4d8477-c3b5-4e88-aaa9-222ad56d974c") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.990484 master-0 kubenswrapper[33572]: E1204 22:19:38.990463 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config podName:17912746-74eb-4c78-8c1b-2f66e7ce4299 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.490443766 +0000 UTC m=+43.217969455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-5974b6b869-jm2hq" (UID: "17912746-74eb-4c78-8c1b-2f66e7ce4299") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.990484 master-0 kubenswrapper[33572]: E1204 22:19:38.990474 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.990735 master-0 kubenswrapper[33572]: E1204 22:19:38.990529 33572 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-3h94rftr47kot: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.990735 master-0 kubenswrapper[33572]: E1204 22:19:38.990560 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca podName:6684358b-d7a6-4396-9b4f-ea67d85e4517 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.490545609 +0000 UTC m=+43.218071298 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca") pod "prometheus-operator-6c74d9cb9f-9cnnh" (UID: "6684358b-d7a6-4396-9b4f-ea67d85e4517") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.990735 master-0 kubenswrapper[33572]: E1204 22:19:38.990592 33572 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.990735 master-0 kubenswrapper[33572]: E1204 22:19:38.990612 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.49058322 +0000 UTC m=+43.218109129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.990735 master-0 kubenswrapper[33572]: E1204 22:19:38.990648 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca podName:b86ff0e8-2c72-4dc6-ac55-3c21940d044f nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.490635032 +0000 UTC m=+43.218160721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca") pod "route-controller-manager-9db9db957-zdrjg" (UID: "b86ff0e8-2c72-4dc6-ac55-3c21940d044f") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.990735 master-0 kubenswrapper[33572]: E1204 22:19:38.990660 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.990735 master-0 kubenswrapper[33572]: E1204 22:19:38.990736 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca podName:17912746-74eb-4c78-8c1b-2f66e7ce4299 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.490708654 +0000 UTC m=+43.218234343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca") pod "openshift-state-metrics-5974b6b869-jm2hq" (UID: "17912746-74eb-4c78-8c1b-2f66e7ce4299") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.991265 master-0 kubenswrapper[33572]: E1204 22:19:38.991227 33572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.991347 master-0 kubenswrapper[33572]: E1204 22:19:38.991322 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config podName:0a726f44-a509-46b3-a6d5-70afe3b55e9f nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.49129938 +0000 UTC m=+43.218825059 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config") pod "node-exporter-p5qlk" (UID: "0a726f44-a509-46b3-a6d5-70afe3b55e9f") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.991678 master-0 kubenswrapper[33572]: E1204 22:19:38.991634 33572 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.991771 master-0 kubenswrapper[33572]: E1204 22:19:38.991703 33572 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.991771 master-0 kubenswrapper[33572]: E1204 22:19:38.991726 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config podName:5dac8e25-0f51-4c04-929c-060479689a9d nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.491703831 +0000 UTC m=+43.219229520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config") pod "machine-approver-74d9cbffbc-nzqgx" (UID: "5dac8e25-0f51-4c04-929c-060479689a9d") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.991771 master-0 kubenswrapper[33572]: E1204 22:19:38.991643 33572 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.991771 master-0 kubenswrapper[33572]: E1204 22:19:38.991761 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images podName:74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.491743452 +0000 UTC m=+43.219269151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images") pod "machine-api-operator-88d48b57d-pp4fd" (UID: "74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.991994 master-0 kubenswrapper[33572]: E1204 22:19:38.991815 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls podName:74197c50-9a41-40e8-9289-c7e6afbd3737 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.491799564 +0000 UTC m=+43.219325243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" (UID: "74197c50-9a41-40e8-9289-c7e6afbd3737") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.991994 master-0 kubenswrapper[33572]: E1204 22:19:38.991927 33572 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.992105 master-0 kubenswrapper[33572]: E1204 22:19:38.991998 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls podName:ebfbb13d-c3f2-476d-bd89-cb8a13d2acee nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.491982239 +0000 UTC m=+43.219507918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls") pod "machine-config-controller-7c6d64c4cd-crk68" (UID: "ebfbb13d-c3f2-476d-bd89-cb8a13d2acee") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993538 master-0 kubenswrapper[33572]: E1204 22:19:38.993492 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993628 master-0 kubenswrapper[33572]: E1204 22:19:38.993544 33572 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993628 master-0 kubenswrapper[33572]: E1204 22:19:38.993569 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config podName:6684358b-d7a6-4396-9b4f-ea67d85e4517 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.493555033 +0000 UTC m=+43.221080892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-6c74d9cb9f-9cnnh" (UID: "6684358b-d7a6-4396-9b4f-ea67d85e4517") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993628 master-0 kubenswrapper[33572]: E1204 22:19:38.993604 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls podName:5dac8e25-0f51-4c04-929c-060479689a9d nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.493588434 +0000 UTC m=+43.221114123 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls") pod "machine-approver-74d9cbffbc-nzqgx" (UID: "5dac8e25-0f51-4c04-929c-060479689a9d") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993628 master-0 kubenswrapper[33572]: E1204 22:19:38.993605 33572 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993628 master-0 kubenswrapper[33572]: E1204 22:19:38.993629 33572 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993904 master-0 kubenswrapper[33572]: E1204 22:19:38.993637 33572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993904 master-0 kubenswrapper[33572]: E1204 22:19:38.993677 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert podName:e7a7f632-2442-4837-b068-c22b03c71fb0 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.493660126 +0000 UTC m=+43.221185805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert") pod "catalog-operator-fbc6455c4-85tbt" (UID: "e7a7f632-2442-4837-b068-c22b03c71fb0") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993904 master-0 kubenswrapper[33572]: E1204 22:19:38.993709 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs podName:34cad3de-8f3f-48cd-bd39-8745fad19e65 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.493695817 +0000 UTC m=+43.221221506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs") pod "multus-admission-controller-8dbbb5754-c9fx2" (UID: "34cad3de-8f3f-48cd-bd39-8745fad19e65") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.993904 master-0 kubenswrapper[33572]: E1204 22:19:38.993740 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls podName:8d84a7d3-46d1-48e3-83f3-f6b32f16cc76 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.493724338 +0000 UTC m=+43.221250017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls") pod "kube-state-metrics-5857974f64-qqxk9" (UID: "8d84a7d3-46d1-48e3-83f3-f6b32f16cc76") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.994775 33572 secret.go:189] Couldn't get secret openshift-cluster-storage-operator/cluster-storage-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.994853 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert podName:a043ea49-97f9-4ae6-83b9-733f12754d94 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.494827798 +0000 UTC m=+43.222353487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-storage-operator-serving-cert" (UniqueName: "kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert") pod "cluster-storage-operator-dcf7fc84b-qmhlw" (UID: "a043ea49-97f9-4ae6-83b9-733f12754d94") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.994867 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.994966 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.494945442 +0000 UTC m=+43.222471131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.995406 33572 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.995465 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config podName:5dac8e25-0f51-4c04-929c-060479689a9d nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.495449476 +0000 UTC m=+43.222975165 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config") pod "machine-approver-74d9cbffbc-nzqgx" (UID: "5dac8e25-0f51-4c04-929c-060479689a9d") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.995549 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.995653 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls podName:6684358b-d7a6-4396-9b4f-ea67d85e4517 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.495627661 +0000 UTC m=+43.223153540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls") pod "prometheus-operator-6c74d9cb9f-9cnnh" (UID: "6684358b-d7a6-4396-9b4f-ea67d85e4517") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.995870 33572 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.995959 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert podName:c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.49593604 +0000 UTC m=+43.223461729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert") pod "packageserver-7b4bc6c685-l6dfn" (UID: "c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.996747 33572 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.996794 33572 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.996821 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca podName:800f436c-145d-4281-8d4d-644ba2cb0ebb nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.496802973 +0000 UTC m=+43.224328702 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca") pod "cloud-credential-operator-698c598cfc-lgmqn" (UID: "800f436c-145d-4281-8d4d-644ba2cb0ebb") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.996875 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config podName:74197c50-9a41-40e8-9289-c7e6afbd3737 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.496854145 +0000 UTC m=+43.224379834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" (UID: "74197c50-9a41-40e8-9289-c7e6afbd3737") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.997407 33572 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.997522 master-0 kubenswrapper[33572]: E1204 22:19:38.997528 33572 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997542 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert podName:967bf4ac-f025-4296-8ed9-183a345f6b7c nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.497482412 +0000 UTC m=+43.225008261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert") pod "olm-operator-7cd7dbb44c-bqcf8" (UID: "967bf4ac-f025-4296-8ed9-183a345f6b7c") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997581 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert podName:967bf4ac-f025-4296-8ed9-183a345f6b7c nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.497563364 +0000 UTC m=+43.225089263 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert") pod "olm-operator-7cd7dbb44c-bqcf8" (UID: "967bf4ac-f025-4296-8ed9-183a345f6b7c") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997631 33572 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997765 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token podName:eb4d8477-c3b5-4e88-aaa9-222ad56d974c nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.49774519 +0000 UTC m=+43.225270879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token") pod "machine-config-server-wmm89" (UID: "eb4d8477-c3b5-4e88-aaa9-222ad56d974c") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997822 33572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997843 33572 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997873 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls podName:17912746-74eb-4c78-8c1b-2f66e7ce4299 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.497859954 +0000 UTC m=+43.225385643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls") pod "openshift-state-metrics-5974b6b869-jm2hq" (UID: "17912746-74eb-4c78-8c1b-2f66e7ce4299") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997909 33572 configmap.go:193] Couldn't get configMap openshift-machine-api/cluster-baremetal-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: I1204 22:19:38.997909 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 22:19:38.998051 master-0 kubenswrapper[33572]: E1204 22:19:38.997972 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert podName:5598683a-cd32-486d-8839-205829d55cc2 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.497937586 +0000 UTC m=+43.225463415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert") pod "cluster-autoscaler-operator-5f49d774cd-5m4l9" (UID: "5598683a-cd32-486d-8839-205829d55cc2") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:38.998333 master-0 kubenswrapper[33572]: E1204 22:19:38.998066 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images podName:a3899a38-39b8-4b48-81e5-4d8854ecc8ab nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.498045799 +0000 UTC m=+43.225571488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images") pod "cluster-baremetal-operator-78f758c7b9-44srj" (UID: "a3899a38-39b8-4b48-81e5-4d8854ecc8ab") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.998333 master-0 kubenswrapper[33572]: E1204 22:19:38.998298 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.998394 master-0 kubenswrapper[33572]: E1204 22:19:38.998354 33572 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.998425 master-0 kubenswrapper[33572]: E1204 22:19:38.998357 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca podName:0a726f44-a509-46b3-a6d5-70afe3b55e9f nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.498338347 +0000 UTC m=+43.225864026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca") pod "node-exporter-p5qlk" (UID: "0a726f44-a509-46b3-a6d5-70afe3b55e9f") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:38.998481 master-0 kubenswrapper[33572]: E1204 22:19:38.998454 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config podName:74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.498429079 +0000 UTC m=+43.225954968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config") pod "machine-api-operator-88d48b57d-pp4fd" (UID: "74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:38.999024 33572 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:38.999103 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert podName:e7a7f632-2442-4837-b068-c22b03c71fb0 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.499085337 +0000 UTC m=+43.226611016 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert") pod "catalog-operator-fbc6455c4-85tbt" (UID: "e7a7f632-2442-4837-b068-c22b03c71fb0") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000779 33572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000832 33572 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000850 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls podName:0a726f44-a509-46b3-a6d5-70afe3b55e9f nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.500830917 +0000 UTC m=+43.228356616 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls") pod "node-exporter-p5qlk" (UID: "0a726f44-a509-46b3-a6d5-70afe3b55e9f") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000898 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config podName:8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.500881298 +0000 UTC m=+43.228406977 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config") pod "machine-config-daemon-ppnv8" (UID: "8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000902 33572 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000934 33572 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000948 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config podName:5598683a-cd32-486d-8839-205829d55cc2 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.500936109 +0000 UTC m=+43.228461818 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config") pod "cluster-autoscaler-operator-5f49d774cd-5m4l9" (UID: "5598683a-cd32-486d-8839-205829d55cc2") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000974 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls podName:8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.50096181 +0000 UTC m=+43.228487499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls") pod "machine-config-daemon-ppnv8" (UID: "8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.000987 33572 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.001668 master-0 kubenswrapper[33572]: E1204 22:19:39.001027 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls podName:74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.501017022 +0000 UTC m=+43.228542721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls") pod "machine-api-operator-88d48b57d-pp4fd" (UID: "74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.002757 master-0 kubenswrapper[33572]: E1204 22:19:39.001839 33572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.002757 master-0 kubenswrapper[33572]: E1204 22:19:39.001907 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config podName:8d84a7d3-46d1-48e3-83f3-f6b32f16cc76 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.501891966 +0000 UTC m=+43.229417625 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-5857974f64-qqxk9" (UID: "8d84a7d3-46d1-48e3-83f3-f6b32f16cc76") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.002757 master-0 kubenswrapper[33572]: E1204 22:19:39.002357 33572 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.002757 master-0 kubenswrapper[33572]: E1204 22:19:39.002373 33572 secret.go:189] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.002757 master-0 kubenswrapper[33572]: E1204 22:19:39.002420 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config podName:b86ff0e8-2c72-4dc6-ac55-3c21940d044f nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.50240557 +0000 UTC m=+43.229931249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config") pod "route-controller-manager-9db9db957-zdrjg" (UID: "b86ff0e8-2c72-4dc6-ac55-3c21940d044f") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.002757 master-0 kubenswrapper[33572]: E1204 22:19:39.002443 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert podName:810c363b-a4c7-428d-a2fb-285adc29f477 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.502432831 +0000 UTC m=+43.229958520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert") pod "openshift-config-operator-68758cbcdb-fg6vx" (UID: "810c363b-a4c7-428d-a2fb-285adc29f477") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.003536 master-0 kubenswrapper[33572]: E1204 22:19:39.003466 33572 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.003536 master-0 kubenswrapper[33572]: E1204 22:19:39.003538 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert podName:c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.503525712 +0000 UTC m=+43.231051701 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert") pod "packageserver-7b4bc6c685-l6dfn" (UID: "c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.003718 master-0 kubenswrapper[33572]: E1204 22:19:39.003580 33572 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.003718 master-0 kubenswrapper[33572]: E1204 22:19:39.003593 33572 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.003718 master-0 kubenswrapper[33572]: E1204 22:19:39.003631 33572 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.003718 master-0 kubenswrapper[33572]: E1204 22:19:39.003660 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images podName:74197c50-9a41-40e8-9289-c7e6afbd3737 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.503641105 +0000 UTC m=+43.231166784 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images") pod "cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" (UID: "74197c50-9a41-40e8-9289-c7e6afbd3737") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.003718 master-0 kubenswrapper[33572]: E1204 22:19:39.003694 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.503681586 +0000 UTC m=+43.231207275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.003718 master-0 kubenswrapper[33572]: E1204 22:19:39.003723 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls podName:f1534e25-7add-46a1-8f4e-0065c232aa4e nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.503710287 +0000 UTC m=+43.231235976 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-7df95c79b5-nznvn" (UID: "f1534e25-7add-46a1-8f4e-0065c232aa4e") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.004701 master-0 kubenswrapper[33572]: E1204 22:19:39.004657 33572 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.004808 master-0 kubenswrapper[33572]: E1204 22:19:39.004738 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert podName:a3899a38-39b8-4b48-81e5-4d8854ecc8ab nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.504726885 +0000 UTC m=+43.232252764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert") pod "cluster-baremetal-operator-78f758c7b9-44srj" (UID: "a3899a38-39b8-4b48-81e5-4d8854ecc8ab") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.004808 master-0 kubenswrapper[33572]: E1204 22:19:39.004762 33572 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.004930 master-0 kubenswrapper[33572]: E1204 22:19:39.004850 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config podName:ebfbb13d-c3f2-476d-bd89-cb8a13d2acee nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.504826628 +0000 UTC m=+43.232352487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config") pod "machine-config-controller-7c6d64c4cd-crk68" (UID: "ebfbb13d-c3f2-476d-bd89-cb8a13d2acee") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.006028 master-0 kubenswrapper[33572]: E1204 22:19:39.005984 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.006110 master-0 kubenswrapper[33572]: E1204 22:19:39.006055 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca podName:8d84a7d3-46d1-48e3-83f3-f6b32f16cc76 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.506038421 +0000 UTC m=+43.233564100 (durationBeforeRetry 500ms). 
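Each "No retries permitted until ... (durationBeforeRetry 500ms)" entry is the volume manager's per-operation backoff: a failed MountVolume.SetUp is not retried immediately but only after a delay that starts at 500ms (as logged) and grows on repeated failures. A rough sketch of that retry shape using apimachinery's wait.Backoff (the doubling factor and step count are illustrative assumptions, not the kubelet's exact tuning, and mountVolume/cacheSynced are hypothetical stand-ins):

    package main

    import (
        "errors"
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    // mountVolume stands in for one MountVolume.SetUp attempt; it fails until
    // the (hypothetical) secret cache reports that it has synced.
    func mountVolume(cacheSynced func() bool) error {
        if !cacheSynced() {
            return errors.New("failed to sync secret cache: timed out waiting for the condition")
        }
        return nil
    }

    func main() {
        start := time.Now()
        // Assumption for the demo: the cache becomes ready roughly 2s in.
        cacheSynced := func() bool { return time.Since(start) > 2*time.Second }

        // Illustrative backoff: 500ms initial delay as in the log, doubling on
        // each failed attempt; the real kubelet backoff is tracked per volume
        // operation and is capped.
        backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 6}

        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            if err := mountVolume(cacheSynced); err != nil {
                fmt.Printf("retrying after failure: %v\n", err)
                return false, nil // not done, no fatal error: back off and retry
            }
            return true, nil // MountVolume.SetUp succeeded
        })
        if err != nil {
            fmt.Println("giving up:", err)
            return
        }
        fmt.Println("MountVolume.SetUp succeeded")
    }
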
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca") pod "kube-state-metrics-5857974f64-qqxk9" (UID: "8d84a7d3-46d1-48e3-83f3-f6b32f16cc76") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.006420 master-0 kubenswrapper[33572]: E1204 22:19:39.006365 33572 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.006552 master-0 kubenswrapper[33572]: E1204 22:19:39.006442 33572 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.006552 master-0 kubenswrapper[33572]: E1204 22:19:39.006458 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert podName:2d142201-6e77-4828-b86b-05d4144a2f08 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.506435193 +0000 UTC m=+43.233960882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert") pod "insights-operator-55965856b6-7vlpp" (UID: "2d142201-6e77-4828-b86b-05d4144a2f08") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.006696 master-0 kubenswrapper[33572]: E1204 22:19:39.006588 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images podName:55c4f1e1-1b78-45ec-915d-8055ab3e2786 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.506563486 +0000 UTC m=+43.234089286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images") pod "machine-config-operator-dc5d7666f-d7mvx" (UID: "55c4f1e1-1b78-45ec-915d-8055ab3e2786") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.007610 master-0 kubenswrapper[33572]: E1204 22:19:39.007545 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.007610 master-0 kubenswrapper[33572]: E1204 22:19:39.007607 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.507592525 +0000 UTC m=+43.235118214 (durationBeforeRetry 500ms). 
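The volumes being mounted here are ordinary Secret- and ConfigMap-backed pod volumes; the "UniqueName: kubernetes.io/secret/<pod-UID>-<volume-name>" strings are the kubelet's internal identifiers (volume plugin, pod UID, volume name) for projecting the referenced object into the pod's filesystem. A sketch of what such a declaration looks like for the kube-state-metrics volumes named in the log, built with the core/v1 Go types (the mount paths and container image are placeholders, not read from the cluster):

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "sigs.k8s.io/yaml"
    )

    func main() {
        pod := corev1.Pod{
            ObjectMeta: metav1.ObjectMeta{
                Name:      "kube-state-metrics-example", // placeholder name
                Namespace: "openshift-monitoring",
            },
            Spec: corev1.PodSpec{
                Volumes: []corev1.Volume{
                    {
                        // Secret-backed volume, as in the kube-rbac-proxy-config mount above.
                        Name: "kube-state-metrics-kube-rbac-proxy-config",
                        VolumeSource: corev1.VolumeSource{
                            Secret: &corev1.SecretVolumeSource{SecretName: "kube-state-metrics-kube-rbac-proxy-config"},
                        },
                    },
                    {
                        // ConfigMap-backed volume, as in the metrics-client-ca mount above.
                        Name: "metrics-client-ca",
                        VolumeSource: corev1.VolumeSource{
                            ConfigMap: &corev1.ConfigMapVolumeSource{
                                LocalObjectReference: corev1.LocalObjectReference{Name: "metrics-client-ca"},
                            },
                        },
                    },
                },
                Containers: []corev1.Container{{
                    Name:  "kube-rbac-proxy",
                    Image: "registry.example/kube-rbac-proxy:latest", // placeholder image
                    VolumeMounts: []corev1.VolumeMount{
                        {Name: "kube-state-metrics-kube-rbac-proxy-config", MountPath: "/etc/kube-rbac-proxy", ReadOnly: true},
                        {Name: "metrics-client-ca", MountPath: "/etc/metrics-client-ca", ReadOnly: true},
                    },
                }},
            },
        }

        out, err := yaml.Marshal(pod)
        if err != nil {
            panic(err)
        }
        fmt.Println(string(out))
    }
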
Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:39.007789 master-0 kubenswrapper[33572]: E1204 22:19:39.007762 33572 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.007918 master-0 kubenswrapper[33572]: E1204 22:19:39.007883 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:39.507869322 +0000 UTC m=+43.235395211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:39.018449 master-0 kubenswrapper[33572]: I1204 22:19:39.018376 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 22:19:39.037215 master-0 kubenswrapper[33572]: I1204 22:19:39.036970 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 22:19:39.060170 master-0 kubenswrapper[33572]: I1204 22:19:39.060087 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 22:19:39.077530 master-0 kubenswrapper[33572]: I1204 22:19:39.077414 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 22:19:39.097085 master-0 kubenswrapper[33572]: I1204 22:19:39.096998 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 04 22:19:39.119611 master-0 kubenswrapper[33572]: I1204 22:19:39.119483 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-fvwtt" Dec 04 22:19:39.146725 master-0 kubenswrapper[33572]: I1204 22:19:39.146492 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 04 22:19:39.156980 master-0 kubenswrapper[33572]: I1204 22:19:39.156921 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 22:19:39.177378 master-0 kubenswrapper[33572]: I1204 22:19:39.177299 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 04 22:19:39.197896 master-0 kubenswrapper[33572]: I1204 22:19:39.197799 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-fj6qn" Dec 04 22:19:39.217661 master-0 kubenswrapper[33572]: I1204 22:19:39.217563 33572 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-nvnbb" Dec 04 22:19:39.237223 master-0 kubenswrapper[33572]: I1204 22:19:39.237122 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 22:19:39.258349 master-0 kubenswrapper[33572]: I1204 22:19:39.258256 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 22:19:39.277756 master-0 kubenswrapper[33572]: I1204 22:19:39.277484 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 22:19:39.297162 master-0 kubenswrapper[33572]: I1204 22:19:39.297087 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 22:19:39.317526 master-0 kubenswrapper[33572]: I1204 22:19:39.317440 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 22:19:39.339681 master-0 kubenswrapper[33572]: I1204 22:19:39.339595 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 04 22:19:39.348582 master-0 kubenswrapper[33572]: I1204 22:19:39.348480 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-check-endpoints/0.log" Dec 04 22:19:39.351325 master-0 kubenswrapper[33572]: I1204 22:19:39.351273 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:39.358662 master-0 kubenswrapper[33572]: I1204 22:19:39.358567 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 04 22:19:39.378101 master-0 kubenswrapper[33572]: I1204 22:19:39.378033 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zpcfd" Dec 04 22:19:39.397653 master-0 kubenswrapper[33572]: I1204 22:19:39.397453 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 04 22:19:39.417775 master-0 kubenswrapper[33572]: I1204 22:19:39.417698 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 04 22:19:39.439321 master-0 kubenswrapper[33572]: I1204 22:19:39.439236 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-69625" Dec 04 22:19:39.465910 master-0 kubenswrapper[33572]: I1204 22:19:39.465821 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 04 22:19:39.478341 master-0 kubenswrapper[33572]: I1204 22:19:39.478281 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 04 22:19:39.498195 master-0 kubenswrapper[33572]: I1204 22:19:39.498120 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-n25ns" Dec 04 22:19:39.520611 master-0 kubenswrapper[33572]: I1204 22:19:39.520468 33572 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 04 22:19:39.538397 master-0 kubenswrapper[33572]: I1204 22:19:39.538164 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-x7b78" Dec 04 22:19:39.558349 master-0 kubenswrapper[33572]: I1204 22:19:39.558278 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 22:19:39.575667 master-0 kubenswrapper[33572]: I1204 22:19:39.575490 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:39.576102 master-0 kubenswrapper[33572]: I1204 22:19:39.575703 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:39.576102 master-0 kubenswrapper[33572]: I1204 22:19:39.575863 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:39.576102 master-0 kubenswrapper[33572]: I1204 22:19:39.575969 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:19:39.576102 master-0 kubenswrapper[33572]: I1204 22:19:39.576051 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:19:39.576365 master-0 kubenswrapper[33572]: I1204 22:19:39.576091 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-config\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:39.576444 master-0 kubenswrapper[33572]: I1204 22:19:39.576354 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " 
pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:39.576705 master-0 kubenswrapper[33572]: I1204 22:19:39.576621 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:39.576958 master-0 kubenswrapper[33572]: I1204 22:19:39.576890 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:39.577040 master-0 kubenswrapper[33572]: I1204 22:19:39.576981 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:39.577113 master-0 kubenswrapper[33572]: I1204 22:19:39.577051 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:39.577113 master-0 kubenswrapper[33572]: I1204 22:19:39.577104 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:39.577245 master-0 kubenswrapper[33572]: I1204 22:19:39.577172 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:39.577317 master-0 kubenswrapper[33572]: I1204 22:19:39.577282 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:39.577658 master-0 kubenswrapper[33572]: I1204 22:19:39.577583 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images\") pod 
\"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:39.577768 master-0 kubenswrapper[33572]: I1204 22:19:39.577621 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:39.577768 master-0 kubenswrapper[33572]: I1204 22:19:39.577682 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:39.577897 master-0 kubenswrapper[33572]: I1204 22:19:39.577786 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:19:39.577897 master-0 kubenswrapper[33572]: I1204 22:19:39.577827 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 22:19:39.577897 master-0 kubenswrapper[33572]: I1204 22:19:39.577876 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:39.578084 master-0 kubenswrapper[33572]: I1204 22:19:39.577944 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:19:39.578084 master-0 kubenswrapper[33572]: I1204 22:19:39.578061 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:39.578209 master-0 kubenswrapper[33572]: I1204 22:19:39.578105 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:39.578209 master-0 kubenswrapper[33572]: I1204 22:19:39.578146 33572 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:39.578334 master-0 kubenswrapper[33572]: I1204 22:19:39.578234 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:39.578482 master-0 kubenswrapper[33572]: I1204 22:19:39.578450 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:19:39.578665 master-0 kubenswrapper[33572]: I1204 22:19:39.578639 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:39.578748 master-0 kubenswrapper[33572]: I1204 22:19:39.578721 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:39.578839 master-0 kubenswrapper[33572]: I1204 22:19:39.578767 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:39.578839 master-0 kubenswrapper[33572]: I1204 22:19:39.578832 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:19:39.579024 master-0 kubenswrapper[33572]: I1204 22:19:39.578982 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a043ea49-97f9-4ae6-83b9-733f12754d94-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 
22:19:39.579098 master-0 kubenswrapper[33572]: I1204 22:19:39.579034 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:39.579178 master-0 kubenswrapper[33572]: I1204 22:19:39.579110 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:39.579249 master-0 kubenswrapper[33572]: I1204 22:19:39.579198 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:39.579323 master-0 kubenswrapper[33572]: I1204 22:19:39.579246 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:39.579323 master-0 kubenswrapper[33572]: I1204 22:19:39.579291 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:39.579477 master-0 kubenswrapper[33572]: I1204 22:19:39.579420 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:19:39.579820 master-0 kubenswrapper[33572]: I1204 22:19:39.579546 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:39.579820 master-0 kubenswrapper[33572]: I1204 22:19:39.579576 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 
22:19:39.579820 master-0 kubenswrapper[33572]: I1204 22:19:39.579440 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/800f436c-145d-4281-8d4d-644ba2cb0ebb-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:19:39.579820 master-0 kubenswrapper[33572]: I1204 22:19:39.579683 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:39.579820 master-0 kubenswrapper[33572]: I1204 22:19:39.579727 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/967bf4ac-f025-4296-8ed9-183a345f6b7c-srv-cert\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:39.579820 master-0 kubenswrapper[33572]: I1204 22:19:39.579736 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-images\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:39.579820 master-0 kubenswrapper[33572]: I1204 22:19:39.579727 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.579865 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.579928 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5598683a-cd32-486d-8839-205829d55cc2-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580055 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580108 33572 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580180 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580243 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580272 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580343 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580405 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580446 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5598683a-cd32-486d-8839-205829d55cc2-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580492 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:39.580611 master-0 kubenswrapper[33572]: I1204 22:19:39.580586 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.580692 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.580752 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.580778 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/810c363b-a4c7-428d-a2fb-285adc29f477-serving-cert\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.580798 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.580843 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.580883 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.580958 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.581023 33572 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1534e25-7add-46a1-8f4e-0065c232aa4e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.581047 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.581058 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.581171 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.581233 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.581324 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:39.581384 master-0 kubenswrapper[33572]: I1204 22:19:39.581366 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:39.582453 master-0 kubenswrapper[33572]: I1204 22:19:39.581469 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:19:39.582453 master-0 
kubenswrapper[33572]: I1204 22:19:39.581628 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:39.582453 master-0 kubenswrapper[33572]: I1204 22:19:39.581689 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:19:39.582453 master-0 kubenswrapper[33572]: I1204 22:19:39.581760 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:39.582453 master-0 kubenswrapper[33572]: I1204 22:19:39.581805 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:39.582453 master-0 kubenswrapper[33572]: I1204 22:19:39.581850 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/800f436c-145d-4281-8d4d-644ba2cb0ebb-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:19:39.582453 master-0 kubenswrapper[33572]: I1204 22:19:39.581882 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:39.582453 master-0 kubenswrapper[33572]: I1204 22:19:39.582167 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:19:39.599212 master-0 kubenswrapper[33572]: I1204 22:19:39.599094 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 22:19:39.618755 master-0 kubenswrapper[33572]: I1204 22:19:39.618689 33572 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 04 22:19:39.621988 master-0 kubenswrapper[33572]: I1204 22:19:39.621936 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-cert\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:39.638292 master-0 kubenswrapper[33572]: I1204 22:19:39.638245 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 04 22:19:39.658540 master-0 kubenswrapper[33572]: I1204 22:19:39.658378 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 04 22:19:39.690615 master-0 kubenswrapper[33572]: I1204 22:19:39.690542 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 04 22:19:39.697600 master-0 kubenswrapper[33572]: I1204 22:19:39.697406 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 04 22:19:39.697740 master-0 kubenswrapper[33572]: I1204 22:19:39.697637 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-trusted-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:39.703047 master-0 kubenswrapper[33572]: I1204 22:19:39.703004 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d142201-6e77-4828-b86b-05d4144a2f08-serving-cert\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:39.716987 master-0 kubenswrapper[33572]: I1204 22:19:39.716936 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 04 22:19:39.723188 master-0 kubenswrapper[33572]: I1204 22:19:39.723135 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d142201-6e77-4828-b86b-05d4144a2f08-service-ca-bundle\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:39.738043 master-0 kubenswrapper[33572]: I1204 22:19:39.737995 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 22:19:39.738663 master-0 kubenswrapper[33572]: I1204 22:19:39.738616 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e7a7f632-2442-4837-b068-c22b03c71fb0-srv-cert\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:39.756729 master-0 kubenswrapper[33572]: I1204 22:19:39.756668 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-n8qln" Dec 04 22:19:39.778883 master-0 
kubenswrapper[33572]: I1204 22:19:39.778801 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 22:19:39.798278 master-0 kubenswrapper[33572]: I1204 22:19:39.798240 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-dtp6d" Dec 04 22:19:39.818876 master-0 kubenswrapper[33572]: I1204 22:19:39.818825 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 22:19:39.827519 master-0 kubenswrapper[33572]: I1204 22:19:39.827449 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55c4f1e1-1b78-45ec-915d-8055ab3e2786-proxy-tls\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:39.838104 master-0 kubenswrapper[33572]: I1204 22:19:39.838026 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 22:19:39.841361 master-0 kubenswrapper[33572]: I1204 22:19:39.841308 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:39.859387 master-0 kubenswrapper[33572]: I1204 22:19:39.859335 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 22:19:39.860309 master-0 kubenswrapper[33572]: I1204 22:19:39.860241 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-config\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:39.878165 master-0 kubenswrapper[33572]: I1204 22:19:39.878093 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 22:19:39.898771 master-0 kubenswrapper[33572]: I1204 22:19:39.898677 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mw665" Dec 04 22:19:39.918179 master-0 kubenswrapper[33572]: I1204 22:19:39.917977 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 22:19:39.922270 master-0 kubenswrapper[33572]: I1204 22:19:39.921893 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-images\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:39.936177 master-0 kubenswrapper[33572]: I1204 22:19:39.936084 33572 request.go:700] Waited for 2.019284658s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/configmaps?fieldSelector=metadata.name%3Dmachine-api-operator-images&limit=500&resourceVersion=0 Dec 04 22:19:39.938481 master-0 kubenswrapper[33572]: I1204 22:19:39.938435 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 22:19:39.949069 master-0 kubenswrapper[33572]: I1204 22:19:39.949006 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-images\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:39.958431 master-0 kubenswrapper[33572]: I1204 22:19:39.958332 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 22:19:39.962010 master-0 kubenswrapper[33572]: I1204 22:19:39.961925 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-mcd-auth-proxy-config\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:39.962010 master-0 kubenswrapper[33572]: I1204 22:19:39.961924 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:19:39.963140 master-0 kubenswrapper[33572]: I1204 22:19:39.963082 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55c4f1e1-1b78-45ec-915d-8055ab3e2786-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:39.978212 master-0 kubenswrapper[33572]: I1204 22:19:39.978122 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 22:19:39.979445 master-0 kubenswrapper[33572]: I1204 22:19:39.979371 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-apiservice-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:39.982292 master-0 kubenswrapper[33572]: I1204 22:19:39.982215 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-webhook-cert\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:39.998083 master-0 kubenswrapper[33572]: I1204 22:19:39.998013 33572 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"certified-operators-dockercfg-pjht7" Dec 04 22:19:40.018400 master-0 kubenswrapper[33572]: I1204 22:19:40.018295 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-sdd6h" Dec 04 22:19:40.038352 master-0 kubenswrapper[33572]: I1204 22:19:40.038315 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-4nxv4" Dec 04 22:19:40.058489 master-0 kubenswrapper[33572]: I1204 22:19:40.058455 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 22:19:40.061268 master-0 kubenswrapper[33572]: I1204 22:19:40.061208 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-proxy-tls\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:40.078271 master-0 kubenswrapper[33572]: I1204 22:19:40.078219 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-h7rbd" Dec 04 22:19:40.098039 master-0 kubenswrapper[33572]: I1204 22:19:40.097964 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-psdkj" Dec 04 22:19:40.118055 master-0 kubenswrapper[33572]: I1204 22:19:40.117999 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 04 22:19:40.128519 master-0 kubenswrapper[33572]: I1204 22:19:40.128424 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/74197c50-9a41-40e8-9289-c7e6afbd3737-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:40.137526 master-0 kubenswrapper[33572]: I1204 22:19:40.137449 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:19:40.157451 master-0 kubenswrapper[33572]: I1204 22:19:40.157389 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:19:40.178147 master-0 kubenswrapper[33572]: I1204 22:19:40.177954 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sf4xn" Dec 04 22:19:40.197765 master-0 kubenswrapper[33572]: I1204 22:19:40.197631 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 04 22:19:40.199967 master-0 kubenswrapper[33572]: I1204 22:19:40.199908 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:40.218414 master-0 kubenswrapper[33572]: I1204 22:19:40.217856 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 04 22:19:40.222187 master-0 kubenswrapper[33572]: I1204 22:19:40.222133 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74197c50-9a41-40e8-9289-c7e6afbd3737-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:40.238841 master-0 kubenswrapper[33572]: I1204 22:19:40.238717 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 22:19:40.250550 master-0 kubenswrapper[33572]: I1204 22:19:40.249170 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:19:40.258863 master-0 kubenswrapper[33572]: I1204 22:19:40.258774 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 22:19:40.269717 master-0 kubenswrapper[33572]: I1204 22:19:40.269678 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5dac8e25-0f51-4c04-929c-060479689a9d-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:40.278256 master-0 kubenswrapper[33572]: I1204 22:19:40.278214 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-shf98" Dec 04 22:19:40.297493 master-0 kubenswrapper[33572]: I1204 22:19:40.297392 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-bktr2" Dec 04 22:19:40.318180 master-0 kubenswrapper[33572]: I1204 22:19:40.317701 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 22:19:40.318458 master-0 kubenswrapper[33572]: I1204 22:19:40.318171 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:40.337412 master-0 kubenswrapper[33572]: I1204 22:19:40.337336 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 22:19:40.339592 master-0 kubenswrapper[33572]: I1204 22:19:40.339356 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5dac8e25-0f51-4c04-929c-060479689a9d-config\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:40.358312 master-0 kubenswrapper[33572]: I1204 22:19:40.358279 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 22:19:40.378739 master-0 kubenswrapper[33572]: I1204 22:19:40.378632 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 22:19:40.398480 master-0 kubenswrapper[33572]: I1204 22:19:40.398419 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8rl6s" Dec 04 22:19:40.418425 master-0 kubenswrapper[33572]: I1204 22:19:40.418379 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 22:19:40.428015 master-0 kubenswrapper[33572]: I1204 22:19:40.427926 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-certs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:19:40.438294 master-0 kubenswrapper[33572]: I1204 22:19:40.438186 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 04 22:19:40.438803 master-0 kubenswrapper[33572]: I1204 22:19:40.438737 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:40.458839 master-0 kubenswrapper[33572]: I1204 22:19:40.458766 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 04 22:19:40.459980 master-0 kubenswrapper[33572]: I1204 22:19:40.459899 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/6684358b-d7a6-4396-9b4f-ea67d85e4517-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:40.478539 master-0 kubenswrapper[33572]: I1204 22:19:40.478464 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 22:19:40.481376 master-0 kubenswrapper[33572]: I1204 22:19:40.481332 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-node-bootstrap-token\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:19:40.497218 master-0 kubenswrapper[33572]: I1204 22:19:40.497169 33572 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-h8rp7" Dec 04 22:19:40.518302 master-0 kubenswrapper[33572]: I1204 22:19:40.518182 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 04 22:19:40.520552 master-0 kubenswrapper[33572]: I1204 22:19:40.520478 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a726f44-a509-46b3-a6d5-70afe3b55e9f-metrics-client-ca\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:40.521911 master-0 kubenswrapper[33572]: I1204 22:19:40.521857 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:40.527848 master-0 kubenswrapper[33572]: I1204 22:19:40.527784 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17912746-74eb-4c78-8c1b-2f66e7ce4299-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:40.527996 master-0 kubenswrapper[33572]: I1204 22:19:40.527886 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6684358b-d7a6-4396-9b4f-ea67d85e4517-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:40.538430 master-0 kubenswrapper[33572]: I1204 22:19:40.538265 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-5kj7n" Dec 04 22:19:40.559076 master-0 kubenswrapper[33572]: I1204 22:19:40.558983 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 04 22:19:40.569816 master-0 kubenswrapper[33572]: I1204 22:19:40.569715 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:40.576394 master-0 kubenswrapper[33572]: E1204 22:19:40.576293 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:40.576597 master-0 kubenswrapper[33572]: E1204 22:19:40.576430 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap podName:8d84a7d3-46d1-48e3-83f3-f6b32f16cc76 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.576402249 +0000 UTC m=+45.303927908 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-5857974f64-qqxk9" (UID: "8d84a7d3-46d1-48e3-83f3-f6b32f16cc76") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:40.576700 master-0 kubenswrapper[33572]: E1204 22:19:40.576612 33572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.576774 master-0 kubenswrapper[33572]: E1204 22:19:40.576752 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert podName:651c0fad-1577-4a7f-8718-ec2fd2f06c3e nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.576709157 +0000 UTC m=+45.304235026 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert") pod "ingress-canary-7cr8g" (UID: "651c0fad-1577-4a7f-8718-ec2fd2f06c3e") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.577206 master-0 kubenswrapper[33572]: E1204 22:19:40.577144 33572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.577306 master-0 kubenswrapper[33572]: E1204 22:19:40.577268 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config podName:17912746-74eb-4c78-8c1b-2f66e7ce4299 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.577237732 +0000 UTC m=+45.304763391 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-5974b6b869-jm2hq" (UID: "17912746-74eb-4c78-8c1b-2f66e7ce4299") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.578239 master-0 kubenswrapper[33572]: E1204 22:19:40.578179 33572 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.578370 master-0 kubenswrapper[33572]: E1204 22:19:40.578234 33572 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-3h94rftr47kot: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.578370 master-0 kubenswrapper[33572]: I1204 22:19:40.578270 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 04 22:19:40.578370 master-0 kubenswrapper[33572]: E1204 22:19:40.578255 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs podName:34cad3de-8f3f-48cd-bd39-8745fad19e65 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.57823943 +0000 UTC m=+45.305765089 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs") pod "multus-admission-controller-8dbbb5754-c9fx2" (UID: "34cad3de-8f3f-48cd-bd39-8745fad19e65") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.578686 master-0 kubenswrapper[33572]: E1204 22:19:40.578174 33572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.578686 master-0 kubenswrapper[33572]: E1204 22:19:40.578398 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.578367494 +0000 UTC m=+45.305893183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.578686 master-0 kubenswrapper[33572]: E1204 22:19:40.578475 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config podName:0a726f44-a509-46b3-a6d5-70afe3b55e9f nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.578445316 +0000 UTC m=+45.305970995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config") pod "node-exporter-p5qlk" (UID: "0a726f44-a509-46b3-a6d5-70afe3b55e9f") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.578686 master-0 kubenswrapper[33572]: E1204 22:19:40.578533 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:40.578686 master-0 kubenswrapper[33572]: E1204 22:19:40.578602 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.57858659 +0000 UTC m=+45.306112269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:40.579900 master-0 kubenswrapper[33572]: E1204 22:19:40.579604 33572 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.579900 master-0 kubenswrapper[33572]: E1204 22:19:40.579691 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls podName:17912746-74eb-4c78-8c1b-2f66e7ce4299 nodeName:}" failed. 
No retries permitted until 2025-12-04 22:19:41.57967464 +0000 UTC m=+45.307200299 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls") pod "openshift-state-metrics-5974b6b869-jm2hq" (UID: "17912746-74eb-4c78-8c1b-2f66e7ce4299") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.580939 master-0 kubenswrapper[33572]: E1204 22:19:40.580893 33572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.581054 master-0 kubenswrapper[33572]: E1204 22:19:40.580988 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls podName:0a726f44-a509-46b3-a6d5-70afe3b55e9f nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.580967827 +0000 UTC m=+45.308493506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls") pod "node-exporter-p5qlk" (UID: "0a726f44-a509-46b3-a6d5-70afe3b55e9f") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.581054 master-0 kubenswrapper[33572]: E1204 22:19:40.581035 33572 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.581236 master-0 kubenswrapper[33572]: E1204 22:19:40.581093 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.58107839 +0000 UTC m=+45.308604079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.581236 master-0 kubenswrapper[33572]: I1204 22:19:40.581129 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:40.582119 master-0 kubenswrapper[33572]: E1204 22:19:40.582073 33572 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.582119 master-0 kubenswrapper[33572]: E1204 22:19:40.582094 33572 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:40.582287 master-0 kubenswrapper[33572]: E1204 22:19:40.582143 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. 
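(Annotation, not part of the journal: each of these failures is parked rather than retried immediately. "No retries permitted until ... (durationBeforeRetry 1s)" means the mount operation has been requeued with a growing delay, and the later "operationExecutor.MountVolume started" / "MountVolume.SetUp succeeded" lines show the retry landing once the caches are warm. Below is a generic sketch of that exponential-backoff pattern using apimachinery's wait package; the constants are illustrative, not the kubelet's actual backoff parameters.

    package main

    import (
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
        attempt := 0
        backoff := wait.Backoff{
            Duration: 500 * time.Millisecond, // first delay; doubles on each failure
            Factor:   2.0,
            Steps:    6,
            Cap:      2 * time.Minute,
        }
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            attempt++
            fmt.Printf("attempt %d\n", attempt)
            if attempt < 3 {
                // e.g. "failed to sync secret cache": not fatal, try again later.
                return false, nil
            }
            return true, nil // cache synced, SetUp succeeds
        })
        if err != nil {
            fmt.Println("giving up:", err)
        }
    }

End of annotation; the journal continues.)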
No retries permitted until 2025-12-04 22:19:41.582129329 +0000 UTC m=+45.309654988 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync secret cache: timed out waiting for the condition Dec 04 22:19:40.582287 master-0 kubenswrapper[33572]: E1204 22:19:40.582179 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle podName:72679051-6a4b-4991-85c4-e5d2cbbc6ed7 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:41.582156389 +0000 UTC m=+45.309682068 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle") pod "metrics-server-55c77559c8-g74sm" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7") : failed to sync configmap cache: timed out waiting for the condition Dec 04 22:19:40.598644 master-0 kubenswrapper[33572]: I1204 22:19:40.598530 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 04 22:19:40.617198 master-0 kubenswrapper[33572]: I1204 22:19:40.617111 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-tm5gx" Dec 04 22:19:40.638381 master-0 kubenswrapper[33572]: I1204 22:19:40.638287 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 04 22:19:40.657545 master-0 kubenswrapper[33572]: I1204 22:19:40.657455 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 04 22:19:40.678283 master-0 kubenswrapper[33572]: I1204 22:19:40.677774 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 04 22:19:40.697812 master-0 kubenswrapper[33572]: I1204 22:19:40.697675 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-cmdvd" Dec 04 22:19:40.718479 master-0 kubenswrapper[33572]: I1204 22:19:40.718427 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 04 22:19:40.737932 master-0 kubenswrapper[33572]: I1204 22:19:40.737879 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-bvrgs" Dec 04 22:19:40.758395 master-0 kubenswrapper[33572]: I1204 22:19:40.758297 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 04 22:19:40.778532 master-0 kubenswrapper[33572]: I1204 22:19:40.778366 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 04 22:19:40.797420 master-0 kubenswrapper[33572]: I1204 22:19:40.797348 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-3h94rftr47kot" Dec 04 22:19:40.818538 master-0 kubenswrapper[33572]: I1204 22:19:40.818427 33572 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"metrics-client-certs" Dec 04 22:19:40.838928 master-0 kubenswrapper[33572]: I1204 22:19:40.838825 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-wc4xq" Dec 04 22:19:40.858473 master-0 kubenswrapper[33572]: I1204 22:19:40.858396 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 22:19:40.879384 master-0 kubenswrapper[33572]: I1204 22:19:40.879328 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 22:19:40.898371 master-0 kubenswrapper[33572]: I1204 22:19:40.898295 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 04 22:19:40.917231 master-0 kubenswrapper[33572]: I1204 22:19:40.917123 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 22:19:40.936400 master-0 kubenswrapper[33572]: I1204 22:19:40.936268 33572 request.go:700] Waited for 3.006772615s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0 Dec 04 22:19:40.939056 master-0 kubenswrapper[33572]: I1204 22:19:40.939007 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 22:19:40.958996 master-0 kubenswrapper[33572]: I1204 22:19:40.958790 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-496f9" Dec 04 22:19:40.999922 master-0 kubenswrapper[33572]: I1204 22:19:40.999729 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d7pj\" (UniqueName: \"kubernetes.io/projected/f1534e25-7add-46a1-8f4e-0065c232aa4e-kube-api-access-4d7pj\") pod \"control-plane-machine-set-operator-7df95c79b5-nznvn\" (UID: \"f1534e25-7add-46a1-8f4e-0065c232aa4e\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-nznvn" Dec 04 22:19:41.023348 master-0 kubenswrapper[33572]: I1204 22:19:41.022881 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngkqz\" (UniqueName: \"kubernetes.io/projected/800f436c-145d-4281-8d4d-644ba2cb0ebb-kube-api-access-ngkqz\") pod \"cloud-credential-operator-698c598cfc-lgmqn\" (UID: \"800f436c-145d-4281-8d4d-644ba2cb0ebb\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-lgmqn" Dec 04 22:19:41.043118 master-0 kubenswrapper[33572]: I1204 22:19:41.043006 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pctsn\" (UniqueName: \"kubernetes.io/projected/c178afcf-b713-4c74-b22b-6169ba3123f5-kube-api-access-pctsn\") pod \"router-default-5465c8b4db-8vm66\" (UID: \"c178afcf-b713-4c74-b22b-6169ba3123f5\") " pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:41.064073 master-0 kubenswrapper[33572]: I1204 22:19:41.063968 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smtnh\" (UniqueName: \"kubernetes.io/projected/17912746-74eb-4c78-8c1b-2f66e7ce4299-kube-api-access-smtnh\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " 
pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:41.083243 master-0 kubenswrapper[33572]: I1204 22:19:41.083157 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc47q\" (UniqueName: \"kubernetes.io/projected/8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb-kube-api-access-jc47q\") pod \"machine-config-daemon-ppnv8\" (UID: \"8e7eb3f9-ce05-4128-9a1e-dc1c42ded4eb\") " pod="openshift-machine-config-operator/machine-config-daemon-ppnv8" Dec 04 22:19:41.096740 master-0 kubenswrapper[33572]: I1204 22:19:41.096673 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mttq\" (UniqueName: \"kubernetes.io/projected/465637a4-42be-4a65-a859-7af699960138-kube-api-access-4mttq\") pod \"cluster-olm-operator-56fcb6cc5f-t768p\" (UID: \"465637a4-42be-4a65-a859-7af699960138\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-t768p" Dec 04 22:19:41.123926 master-0 kubenswrapper[33572]: I1204 22:19:41.123850 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbpk\" (UniqueName: \"kubernetes.io/projected/ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa-kube-api-access-xdbpk\") pod \"network-metrics-daemon-9pfhj\" (UID: \"ed7b8a4c-b0ec-4ff4-9b2d-bdaff71cf2aa\") " pod="openshift-multus/network-metrics-daemon-9pfhj" Dec 04 22:19:41.149412 master-0 kubenswrapper[33572]: I1204 22:19:41.149318 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9rxt\" (UniqueName: \"kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt\") pod \"certified-operators-sw6sx\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:41.163808 master-0 kubenswrapper[33572]: I1204 22:19:41.163726 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr65l\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-kube-api-access-lr65l\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:19:41.182078 master-0 kubenswrapper[33572]: I1204 22:19:41.181988 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2lwr\" (UniqueName: \"kubernetes.io/projected/5598683a-cd32-486d-8839-205829d55cc2-kube-api-access-s2lwr\") pod \"cluster-autoscaler-operator-5f49d774cd-5m4l9\" (UID: \"5598683a-cd32-486d-8839-205829d55cc2\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-5m4l9" Dec 04 22:19:41.203022 master-0 kubenswrapper[33572]: I1204 22:19:41.202907 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2ghk\" (UniqueName: \"kubernetes.io/projected/a7b2270b-2afc-4bf5-ae1a-5ccf9814657b-kube-api-access-g2ghk\") pod \"cluster-samples-operator-797cfd8b47-j469d\" (UID: \"a7b2270b-2afc-4bf5-ae1a-5ccf9814657b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-j469d" Dec 04 22:19:41.206646 master-0 kubenswrapper[33572]: I1204 22:19:41.206591 33572 scope.go:117] "RemoveContainer" containerID="0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0" Dec 04 22:19:41.214682 master-0 kubenswrapper[33572]: I1204 22:19:41.214599 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vpxd\" (UniqueName: 
\"kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd\") pod \"route-controller-manager-9db9db957-zdrjg\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:41.236200 master-0 kubenswrapper[33572]: I1204 22:19:41.236072 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gj4j\" (UniqueName: \"kubernetes.io/projected/ae107ad4-104c-4264-9844-afb3af28b19e-kube-api-access-9gj4j\") pod \"redhat-marketplace-sdrkm\" (UID: \"ae107ad4-104c-4264-9844-afb3af28b19e\") " pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:41.263462 master-0 kubenswrapper[33572]: I1204 22:19:41.263348 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6d8\" (UniqueName: \"kubernetes.io/projected/4f22eee4-a42d-4d2b-bffa-6c3f29f1f026-kube-api-access-vd6d8\") pod \"csi-snapshot-controller-6b958b6f94-w7hnc\" (UID: \"4f22eee4-a42d-4d2b-bffa-6c3f29f1f026\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-w7hnc" Dec 04 22:19:41.279558 master-0 kubenswrapper[33572]: I1204 22:19:41.279432 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmp4\" (UniqueName: \"kubernetes.io/projected/989a73ce-3898-4f65-a437-2c7061f9375f-kube-api-access-7fmp4\") pod \"apiserver-58574fc8d8-gg42x\" (UID: \"989a73ce-3898-4f65-a437-2c7061f9375f\") " pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:41.303862 master-0 kubenswrapper[33572]: I1204 22:19:41.303756 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7d9j\" (UniqueName: \"kubernetes.io/projected/e7a7f632-2442-4837-b068-c22b03c71fb0-kube-api-access-c7d9j\") pod \"catalog-operator-fbc6455c4-85tbt\" (UID: \"e7a7f632-2442-4837-b068-c22b03c71fb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:41.323632 master-0 kubenswrapper[33572]: I1204 22:19:41.323496 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9z4k\" (UniqueName: \"kubernetes.io/projected/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-api-access-b9z4k\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:41.334939 master-0 kubenswrapper[33572]: I1204 22:19:41.334349 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:19:41.354338 master-0 kubenswrapper[33572]: I1204 22:19:41.354277 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ndk\" (UniqueName: \"kubernetes.io/projected/cedb0b3e-674e-40b9-a10d-45a9f0c5c59c-kube-api-access-w2ndk\") pod \"iptables-alerter-c747h\" (UID: \"cedb0b3e-674e-40b9-a10d-45a9f0c5c59c\") " pod="openshift-network-operator/iptables-alerter-c747h" Dec 04 22:19:41.376024 master-0 kubenswrapper[33572]: I1204 22:19:41.375961 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgg9\" (UniqueName: 
\"kubernetes.io/projected/6c8c45e0-2342-499b-aa6b-339b6a722a87-kube-api-access-gcgg9\") pod \"multus-dgpw9\" (UID: \"6c8c45e0-2342-499b-aa6b-339b6a722a87\") " pod="openshift-multus/multus-dgpw9" Dec 04 22:19:41.396813 master-0 kubenswrapper[33572]: I1204 22:19:41.396638 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w592\" (UniqueName: \"kubernetes.io/projected/813f3ee7-35b5-4ee8-b453-00d16d910eae-kube-api-access-8w592\") pod \"package-server-manager-67477646d4-bslb5\" (UID: \"813f3ee7-35b5-4ee8-b453-00d16d910eae\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:19:41.422414 master-0 kubenswrapper[33572]: I1204 22:19:41.422319 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwx5k\" (UniqueName: \"kubernetes.io/projected/2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae-kube-api-access-mwx5k\") pod \"community-operators-vvkjf\" (UID: \"2bfb50b0-920e-4f85-a1ec-7b2ceaf89dae\") " pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:41.433835 master-0 kubenswrapper[33572]: I1204 22:19:41.433755 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwk6f\" (UniqueName: \"kubernetes.io/projected/46229484-5fa1-4595-94a0-44477abae90e-kube-api-access-jwk6f\") pod \"service-ca-operator-77758bc754-5xnjz\" (UID: \"46229484-5fa1-4595-94a0-44477abae90e\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-5xnjz" Dec 04 22:19:41.455926 master-0 kubenswrapper[33572]: I1204 22:19:41.455833 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq55v\" (UniqueName: \"kubernetes.io/projected/34cad3de-8f3f-48cd-bd39-8745fad19e65-kube-api-access-tq55v\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:19:41.492051 master-0 kubenswrapper[33572]: I1204 22:19:41.491880 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc4z5\" (UniqueName: \"kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5\") pod \"controller-manager-86785576d9-t7jrz\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:41.506192 master-0 kubenswrapper[33572]: I1204 22:19:41.506118 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhgj\" (UniqueName: \"kubernetes.io/projected/510a595a-21bf-48fc-85cd-707bc8f5536f-kube-api-access-gfhgj\") pod \"network-check-target-6jkkl\" (UID: \"510a595a-21bf-48fc-85cd-707bc8f5536f\") " pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:19:41.517334 master-0 kubenswrapper[33572]: I1204 22:19:41.517273 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpksd\" (UniqueName: \"kubernetes.io/projected/c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d-kube-api-access-gpksd\") pod \"packageserver-7b4bc6c685-l6dfn\" (UID: \"c61ef71c-ad0f-41bc-b0ae-a3ee19696f9d\") " pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:41.535650 master-0 kubenswrapper[33572]: I1204 22:19:41.535609 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vjb\" (UniqueName: \"kubernetes.io/projected/4d68dcb1-efe4-425f-9b28-1e5575548a32-kube-api-access-r6vjb\") pod 
\"service-ca-77c99c46b8-fpnwr\" (UID: \"4d68dcb1-efe4-425f-9b28-1e5575548a32\") " pod="openshift-service-ca/service-ca-77c99c46b8-fpnwr" Dec 04 22:19:41.569180 master-0 kubenswrapper[33572]: I1204 22:19:41.569099 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2fn\" (UniqueName: \"kubernetes.io/projected/bda1cb0d-26cf-4b94-b359-432492112888-kube-api-access-8r2fn\") pod \"network-check-source-85d8db45d4-5gbc4\" (UID: \"bda1cb0d-26cf-4b94-b359-432492112888\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-5gbc4" Dec 04 22:19:41.583745 master-0 kubenswrapper[33572]: I1204 22:19:41.582673 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56f25fad-089d-4df6-abb1-10d4c76750f1-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-vwpdg\" (UID: \"56f25fad-089d-4df6-abb1-10d4c76750f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-vwpdg" Dec 04 22:19:41.598448 master-0 kubenswrapper[33572]: I1204 22:19:41.598376 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6b4p\" (UniqueName: \"kubernetes.io/projected/fbb8e73f-7e50-451b-b400-e88a86b51e09-kube-api-access-c6b4p\") pod \"tuned-jn88h\" (UID: \"fbb8e73f-7e50-451b-b400-e88a86b51e09\") " pod="openshift-cluster-node-tuning-operator/tuned-jn88h" Dec 04 22:19:41.616103 master-0 kubenswrapper[33572]: I1204 22:19:41.616033 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsk29\" (UniqueName: \"kubernetes.io/projected/967bf4ac-f025-4296-8ed9-183a345f6b7c-kube-api-access-hsk29\") pod \"olm-operator-7cd7dbb44c-bqcf8\" (UID: \"967bf4ac-f025-4296-8ed9-183a345f6b7c\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:41.632752 master-0 kubenswrapper[33572]: I1204 22:19:41.632584 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9-kube-api-access-4g7n9\") pod \"machine-api-operator-88d48b57d-pp4fd\" (UID: \"74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-pp4fd" Dec 04 22:19:41.655050 master-0 kubenswrapper[33572]: I1204 22:19:41.654967 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87gv4\" (UniqueName: \"kubernetes.io/projected/ce6002bb-4948-45ab-bb1d-ed65e86b6466-kube-api-access-87gv4\") pod \"redhat-operators-zt44t\" (UID: \"ce6002bb-4948-45ab-bb1d-ed65e86b6466\") " pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:41.669223 master-0 kubenswrapper[33572]: I1204 22:19:41.669104 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.669558 master-0 kubenswrapper[33572]: I1204 22:19:41.669411 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: 
\"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.669558 master-0 kubenswrapper[33572]: I1204 22:19:41.669445 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.669558 master-0 kubenswrapper[33572]: I1204 22:19:41.669528 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:41.669973 master-0 kubenswrapper[33572]: I1204 22:19:41.669897 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d84a7d3-46d1-48e3-83f3-f6b32f16cc76-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-qqxk9\" (UID: \"8d84a7d3-46d1-48e3-83f3-f6b32f16cc76\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-qqxk9" Dec 04 22:19:41.670106 master-0 kubenswrapper[33572]: I1204 22:19:41.670025 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.670336 master-0 kubenswrapper[33572]: I1204 22:19:41.670125 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.670336 master-0 kubenswrapper[33572]: I1204 22:19:41.670151 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:19:41.670336 master-0 kubenswrapper[33572]: I1204 22:19:41.670206 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:41.670336 master-0 kubenswrapper[33572]: I1204 22:19:41.670243 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.670336 master-0 kubenswrapper[33572]: I1204 22:19:41.670270 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:41.670336 master-0 kubenswrapper[33572]: I1204 22:19:41.670313 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:19:41.670756 master-0 kubenswrapper[33572]: I1204 22:19:41.670348 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.670756 master-0 kubenswrapper[33572]: I1204 22:19:41.670460 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-cert\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:19:41.670756 master-0 kubenswrapper[33572]: I1204 22:19:41.670530 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:41.670756 master-0 kubenswrapper[33572]: I1204 22:19:41.670675 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:41.671005 master-0 kubenswrapper[33572]: I1204 22:19:41.670836 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:41.671069 master-0 kubenswrapper[33572]: I1204 22:19:41.671022 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a726f44-a509-46b3-a6d5-70afe3b55e9f-node-exporter-tls\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " 
pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:41.671160 master-0 kubenswrapper[33572]: I1204 22:19:41.671113 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:41.672205 master-0 kubenswrapper[33572]: I1204 22:19:41.671293 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/34cad3de-8f3f-48cd-bd39-8745fad19e65-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-c9fx2\" (UID: \"34cad3de-8f3f-48cd-bd39-8745fad19e65\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-c9fx2" Dec 04 22:19:41.672205 master-0 kubenswrapper[33572]: I1204 22:19:41.671657 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.672205 master-0 kubenswrapper[33572]: I1204 22:19:41.671957 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.672205 master-0 kubenswrapper[33572]: I1204 22:19:41.671969 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/17912746-74eb-4c78-8c1b-2f66e7ce4299-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-jm2hq\" (UID: \"17912746-74eb-4c78-8c1b-2f66e7ce4299\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-jm2hq" Dec 04 22:19:41.672205 master-0 kubenswrapper[33572]: I1204 22:19:41.672096 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:41.673286 master-0 kubenswrapper[33572]: I1204 22:19:41.673225 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/addddaac-a31a-4dbf-b78f-87225b11b463-bound-sa-token\") pod \"ingress-operator-8649c48786-qlkgh\" (UID: \"addddaac-a31a-4dbf-b78f-87225b11b463\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" Dec 04 22:19:41.701477 master-0 kubenswrapper[33572]: I1204 22:19:41.701398 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jmv\" (UniqueName: \"kubernetes.io/projected/0a726f44-a509-46b3-a6d5-70afe3b55e9f-kube-api-access-k8jmv\") pod \"node-exporter-p5qlk\" (UID: \"0a726f44-a509-46b3-a6d5-70afe3b55e9f\") " pod="openshift-monitoring/node-exporter-p5qlk" Dec 04 22:19:41.727350 
master-0 kubenswrapper[33572]: I1204 22:19:41.727260 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r57bb\" (UniqueName: \"kubernetes.io/projected/e37d318a-5bf8-46ed-b6de-494102738da7-kube-api-access-r57bb\") pod \"csi-snapshot-controller-operator-6bc8656fdc-xhndk\" (UID: \"e37d318a-5bf8-46ed-b6de-494102738da7\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-xhndk" Dec 04 22:19:41.733933 master-0 kubenswrapper[33572]: I1204 22:19:41.733838 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lclkg\" (UniqueName: \"kubernetes.io/projected/871cb002-67f4-43aa-a41d-7a5b2f340059-kube-api-access-lclkg\") pod \"network-operator-79767b7ff9-8lq7w\" (UID: \"871cb002-67f4-43aa-a41d-7a5b2f340059\") " pod="openshift-network-operator/network-operator-79767b7ff9-8lq7w" Dec 04 22:19:41.744121 master-0 kubenswrapper[33572]: I1204 22:19:41.743446 33572 scope.go:117] "RemoveContainer" containerID="35d6e2cec5a3a5b68cca4476358af1f1f50efd04a9c16f1130f8dd88a077a41e" Dec 04 22:19:41.754889 master-0 kubenswrapper[33572]: I1204 22:19:41.754781 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8636bd7-fa9e-44b9-82df-9d37b398736d-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-t7cc5\" (UID: \"a8636bd7-fa9e-44b9-82df-9d37b398736d\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-t7cc5" Dec 04 22:19:41.775548 master-0 kubenswrapper[33572]: I1204 22:19:41.773082 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cct\" (UniqueName: \"kubernetes.io/projected/35821f48-b000-4915-847f-a739b6efc5ee-kube-api-access-m4cct\") pod \"cluster-image-registry-operator-6fb9f88b7-r7wcq\" (UID: \"35821f48-b000-4915-847f-a739b6efc5ee\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-r7wcq" Dec 04 22:19:41.791710 master-0 kubenswrapper[33572]: I1204 22:19:41.790048 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cfhv\" (UniqueName: \"kubernetes.io/projected/ce6b5a46-172b-4575-ba22-ff3c6ea4207f-kube-api-access-2cfhv\") pod \"operator-controller-controller-manager-7cbd59c7f8-nxbjw\" (UID: \"ce6b5a46-172b-4575-ba22-ff3c6ea4207f\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:41.814648 master-0 kubenswrapper[33572]: I1204 22:19:41.813268 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cblk\" (UniqueName: \"kubernetes.io/projected/2d142201-6e77-4828-b86b-05d4144a2f08-kube-api-access-6cblk\") pod \"insights-operator-55965856b6-7vlpp\" (UID: \"2d142201-6e77-4828-b86b-05d4144a2f08\") " pod="openshift-insights/insights-operator-55965856b6-7vlpp" Dec 04 22:19:41.833744 master-0 kubenswrapper[33572]: I1204 22:19:41.833675 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5t2f\" (UniqueName: \"kubernetes.io/projected/7f091088-2166-4026-9fa6-62bd83407edb-kube-api-access-s5t2f\") pod \"openshift-controller-manager-operator-6c8676f99d-jb4xf\" (UID: \"7f091088-2166-4026-9fa6-62bd83407edb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-jb4xf" Dec 04 22:19:41.851784 master-0 kubenswrapper[33572]: I1204 22:19:41.851717 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n8gh2\" (UniqueName: \"kubernetes.io/projected/651c0fad-1577-4a7f-8718-ec2fd2f06c3e-kube-api-access-n8gh2\") pod \"ingress-canary-7cr8g\" (UID: \"651c0fad-1577-4a7f-8718-ec2fd2f06c3e\") " pod="openshift-ingress-canary/ingress-canary-7cr8g" Dec 04 22:19:41.879483 master-0 kubenswrapper[33572]: I1204 22:19:41.879434 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2jq\" (UniqueName: \"kubernetes.io/projected/a3899a38-39b8-4b48-81e5-4d8854ecc8ab-kube-api-access-pt2jq\") pod \"cluster-baremetal-operator-78f758c7b9-44srj\" (UID: \"a3899a38-39b8-4b48-81e5-4d8854ecc8ab\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-44srj" Dec 04 22:19:41.889243 master-0 kubenswrapper[33572]: I1204 22:19:41.889196 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4rft\" (UniqueName: \"kubernetes.io/projected/59d3d0d8-1a2a-4d14-8312-d33818acba88-kube-api-access-d4rft\") pod \"ovnkube-node-8nxc5\" (UID: \"59d3d0d8-1a2a-4d14-8312-d33818acba88\") " pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:41.908173 master-0 kubenswrapper[33572]: I1204 22:19:41.908100 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8g99\" (UniqueName: \"kubernetes.io/projected/55c4f1e1-1b78-45ec-915d-8055ab3e2786-kube-api-access-b8g99\") pod \"machine-config-operator-dc5d7666f-d7mvx\" (UID: \"55c4f1e1-1b78-45ec-915d-8055ab3e2786\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-d7mvx" Dec 04 22:19:41.928940 master-0 kubenswrapper[33572]: I1204 22:19:41.928878 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7l9n\" (UniqueName: \"kubernetes.io/projected/ceb419e4-d804-4111-b8d8-8436cc2ee617-kube-api-access-c7l9n\") pod \"etcd-operator-5bf4d88c6f-flrrb\" (UID: \"ceb419e4-d804-4111-b8d8-8436cc2ee617\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-flrrb" Dec 04 22:19:41.951233 master-0 kubenswrapper[33572]: I1204 22:19:41.951170 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scht6\" (UniqueName: \"kubernetes.io/projected/a544105a-5bec-456a-aef6-c160943c1f67-kube-api-access-scht6\") pod \"openshift-apiserver-operator-7bf7f6b755-gcbgt\" (UID: \"a544105a-5bec-456a-aef6-c160943c1f67\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-gcbgt" Dec 04 22:19:41.955635 master-0 kubenswrapper[33572]: I1204 22:19:41.955591 33572 request.go:700] Waited for 3.956301781s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-dns/serviceaccounts/node-resolver/token Dec 04 22:19:41.970225 master-0 kubenswrapper[33572]: I1204 22:19:41.970174 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5zx\" (UniqueName: \"kubernetes.io/projected/c2279404-fa75-4de2-a302-d7b15ead5232-kube-api-access-dd5zx\") pod \"node-resolver-6mgn6\" (UID: \"c2279404-fa75-4de2-a302-d7b15ead5232\") " pod="openshift-dns/node-resolver-6mgn6" Dec 04 22:19:41.989906 master-0 kubenswrapper[33572]: I1204 22:19:41.989845 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w82st\" (UniqueName: \"kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st\") pod \"metrics-server-55c77559c8-g74sm\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " 
pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:42.009146 master-0 kubenswrapper[33572]: I1204 22:19:42.008971 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4cdh\" (UniqueName: \"kubernetes.io/projected/0173b8a7-07b4-407a-80b6-d86754072fd8-kube-api-access-z4cdh\") pod \"migrator-74b7b57c65-nzpb5\" (UID: \"0173b8a7-07b4-407a-80b6-d86754072fd8\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-nzpb5" Dec 04 22:19:42.034064 master-0 kubenswrapper[33572]: I1204 22:19:42.033983 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6s4\" (UniqueName: \"kubernetes.io/projected/5dac8e25-0f51-4c04-929c-060479689a9d-kube-api-access-ch6s4\") pod \"machine-approver-74d9cbffbc-nzqgx\" (UID: \"5dac8e25-0f51-4c04-929c-060479689a9d\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-nzqgx" Dec 04 22:19:42.062158 master-0 kubenswrapper[33572]: I1204 22:19:42.062055 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgt75\" (UniqueName: \"kubernetes.io/projected/634c1df6-de4d-4e26-8c71-d39311cae0ce-kube-api-access-xgt75\") pod \"network-node-identity-nk92d\" (UID: \"634c1df6-de4d-4e26-8c71-d39311cae0ce\") " pod="openshift-network-node-identity/network-node-identity-nk92d" Dec 04 22:19:42.072153 master-0 kubenswrapper[33572]: I1204 22:19:42.072090 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e065179e-634a-4cbe-bb59-5b01c514e4de-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-2j9hp\" (UID: \"e065179e-634a-4cbe-bb59-5b01c514e4de\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-2j9hp" Dec 04 22:19:42.097734 master-0 kubenswrapper[33572]: I1204 22:19:42.097629 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w8vs\" (UniqueName: \"kubernetes.io/projected/eb4d8477-c3b5-4e88-aaa9-222ad56d974c-kube-api-access-2w8vs\") pod \"machine-config-server-wmm89\" (UID: \"eb4d8477-c3b5-4e88-aaa9-222ad56d974c\") " pod="openshift-machine-config-operator/machine-config-server-wmm89" Dec 04 22:19:42.129541 master-0 kubenswrapper[33572]: I1204 22:19:42.128119 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq44d\" (UniqueName: \"kubernetes.io/projected/74197c50-9a41-40e8-9289-c7e6afbd3737-kube-api-access-hq44d\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4\" (UID: \"74197c50-9a41-40e8-9289-c7e6afbd3737\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4" Dec 04 22:19:42.155533 master-0 kubenswrapper[33572]: I1204 22:19:42.154475 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfcv9\" (UniqueName: \"kubernetes.io/projected/5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141-kube-api-access-bfcv9\") pod \"apiserver-8db7f8d79-rlqbz\" (UID: \"5a1fcc70-6350-4f0d-a6fb-d8d30b8c9141\") " pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:42.180531 master-0 kubenswrapper[33572]: I1204 22:19:42.179674 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wh6b\" (UniqueName: \"kubernetes.io/projected/3f6d05b8-b7b4-4b2d-ace0-d1f59035d161-kube-api-access-9wh6b\") pod \"ovnkube-control-plane-5df5548d54-gjjxs\" (UID: 
\"3f6d05b8-b7b4-4b2d-ace0-d1f59035d161\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-gjjxs" Dec 04 22:19:42.180531 master-0 kubenswrapper[33572]: I1204 22:19:42.180226 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j8fr\" (UniqueName: \"kubernetes.io/projected/76fd9f44-4365-4271-8772-025655c50334-kube-api-access-9j8fr\") pod \"multus-additional-cni-plugins-5tpnf\" (UID: \"76fd9f44-4365-4271-8772-025655c50334\") " pod="openshift-multus/multus-additional-cni-plugins-5tpnf" Dec 04 22:19:42.198238 master-0 kubenswrapper[33572]: I1204 22:19:42.198159 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxkk\" (UniqueName: \"kubernetes.io/projected/690b447a-19c0-4925-bc9d-d0c86a83a377-kube-api-access-wsxkk\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-768dx\" (UID: \"690b447a-19c0-4925-bc9d-d0c86a83a377\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-768dx" Dec 04 22:19:42.208276 master-0 kubenswrapper[33572]: I1204 22:19:42.208217 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4czl\" (UniqueName: \"kubernetes.io/projected/fb0274dc-fac1-41f9-b3e5-77253d851fdf-kube-api-access-r4czl\") pod \"catalogd-controller-manager-7cc89f4c4c-v7zfw\" (UID: \"fb0274dc-fac1-41f9-b3e5-77253d851fdf\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:42.230014 master-0 kubenswrapper[33572]: I1204 22:19:42.229926 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkg8s\" (UniqueName: \"kubernetes.io/projected/6684358b-d7a6-4396-9b4f-ea67d85e4517-kube-api-access-qkg8s\") pod \"prometheus-operator-6c74d9cb9f-9cnnh\" (UID: \"6684358b-d7a6-4396-9b4f-ea67d85e4517\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-9cnnh" Dec 04 22:19:42.250926 master-0 kubenswrapper[33572]: I1204 22:19:42.250813 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24648a41-875f-4e98-8b21-3bdd38dffa32-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-cqndn\" (UID: \"24648a41-875f-4e98-8b21-3bdd38dffa32\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-cqndn" Dec 04 22:19:42.270425 master-0 kubenswrapper[33572]: I1204 22:19:42.270336 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jlvp\" (UniqueName: \"kubernetes.io/projected/a043ea49-97f9-4ae6-83b9-733f12754d94-kube-api-access-6jlvp\") pod \"cluster-storage-operator-dcf7fc84b-qmhlw\" (UID: \"a043ea49-97f9-4ae6-83b9-733f12754d94\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-qmhlw" Dec 04 22:19:42.296908 master-0 kubenswrapper[33572]: I1204 22:19:42.296832 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfklr\" (UniqueName: \"kubernetes.io/projected/c6a5d14d-0409-4024-b0a8-200fa2594185-kube-api-access-bfklr\") pod \"marketplace-operator-f797b99b6-m9m4h\" (UID: \"c6a5d14d-0409-4024-b0a8-200fa2594185\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:19:42.310358 master-0 kubenswrapper[33572]: I1204 22:19:42.310288 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nrj9\" (UniqueName: 
\"kubernetes.io/projected/810c363b-a4c7-428d-a2fb-285adc29f477-kube-api-access-2nrj9\") pod \"openshift-config-operator-68758cbcdb-fg6vx\" (UID: \"810c363b-a4c7-428d-a2fb-285adc29f477\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:42.328664 master-0 kubenswrapper[33572]: I1204 22:19:42.328581 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8d54\" (UniqueName: \"kubernetes.io/projected/f893663c-7c1e-4eda-9839-99c1c0440304-kube-api-access-g8d54\") pod \"authentication-operator-6c968fdfdf-bm2pk\" (UID: \"f893663c-7c1e-4eda-9839-99c1c0440304\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-bm2pk" Dec 04 22:19:42.370299 master-0 kubenswrapper[33572]: I1204 22:19:42.370215 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkrvr\" (UniqueName: \"kubernetes.io/projected/ebfbb13d-c3f2-476d-bd89-cb8a13d2acee-kube-api-access-xkrvr\") pod \"machine-config-controller-7c6d64c4cd-crk68\" (UID: \"ebfbb13d-c3f2-476d-bd89-cb8a13d2acee\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-crk68" Dec 04 22:19:42.373428 master-0 kubenswrapper[33572]: I1204 22:19:42.373358 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcw8f\" (UniqueName: \"kubernetes.io/projected/ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e-kube-api-access-kcw8f\") pod \"dns-operator-7c56cf9b74-sshsd\" (UID: \"ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-sshsd" Dec 04 22:19:42.388037 master-0 kubenswrapper[33572]: I1204 22:19:42.387960 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/5.log" Dec 04 22:19:42.404557 master-0 kubenswrapper[33572]: I1204 22:19:42.402185 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrr5\" (UniqueName: \"kubernetes.io/projected/0beb871c-3bf1-471c-a028-746a650267bf-kube-api-access-dvrr5\") pod \"cluster-node-tuning-operator-85cff47f46-4dv2b\" (UID: \"0beb871c-3bf1-471c-a028-746a650267bf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-4dv2b" Dec 04 22:19:42.419542 master-0 kubenswrapper[33572]: I1204 22:19:42.418187 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nkh\" (UniqueName: \"kubernetes.io/projected/512ba6af-11ad-4217-a1ce-a2ab3ef67ec5-kube-api-access-g5nkh\") pod \"cluster-monitoring-operator-7ff994598c-rn6cz\" (UID: \"512ba6af-11ad-4217-a1ce-a2ab3ef67ec5\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-rn6cz" Dec 04 22:19:42.434836 master-0 kubenswrapper[33572]: I1204 22:19:42.434769 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vpbl\" (UniqueName: \"kubernetes.io/projected/a5c2d3b8-41c0-4531-b770-57b7c567fe30-kube-api-access-5vpbl\") pod \"dns-default-vvs9c\" (UID: \"a5c2d3b8-41c0-4531-b770-57b7c567fe30\") " pod="openshift-dns/dns-default-vvs9c" Dec 04 22:19:42.453622 master-0 kubenswrapper[33572]: E1204 22:19:42.453542 33572 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:42.453622 master-0 kubenswrapper[33572]: E1204 22:19:42.453599 33572 projected.go:194] Error preparing data for projected 
volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:42.454014 master-0 kubenswrapper[33572]: E1204 22:19:42.453725 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access podName:dbe54b09-0399-4fbe-9f84-dd9dede0ab96 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:42.953695914 +0000 UTC m=+46.681221573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access") pod "installer-3-master-0" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:42.495428 master-0 kubenswrapper[33572]: E1204 22:19:42.495355 33572 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.971s" Dec 04 22:19:42.512606 master-0 kubenswrapper[33572]: I1204 22:19:42.510916 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 04 22:19:42.542231 master-0 kubenswrapper[33572]: I1204 22:19:42.541891 33572 status_manager.go:379] "Container startup changed for unknown container" pod="openshift-ingress/router-default-5465c8b4db-8vm66" containerID="cri-o://0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0" Dec 04 22:19:42.542231 master-0 kubenswrapper[33572]: I1204 22:19:42.541956 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:42.542231 master-0 kubenswrapper[33572]: I1204 22:19:42.542175 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 04 22:19:42.542231 master-0 kubenswrapper[33572]: I1204 22:19:42.542192 33572 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="2dd4c24f-2a12-4c0a-8040-f17042299847" Dec 04 22:19:42.542231 master-0 kubenswrapper[33572]: I1204 22:19:42.542214 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 04 22:19:42.542231 master-0 kubenswrapper[33572]: I1204 22:19:42.542225 33572 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="2dd4c24f-2a12-4c0a-8040-f17042299847" Dec 04 22:19:42.552125 master-0 kubenswrapper[33572]: I1204 22:19:42.551970 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a906debd0c35952850935aee2d607cce","Type":"ContainerStarted","Data":"1454b21a3ef5677a973a0198487142abc699efae2e6eaece0bc51065261cbdf5"} Dec 04 22:19:42.552125 master-0 kubenswrapper[33572]: I1204 22:19:42.552084 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659"} Dec 04 22:19:42.552125 master-0 kubenswrapper[33572]: I1204 22:19:42.552130 33572 status_manager.go:379] "Container startup changed for unknown container" 
pod="openshift-ingress/router-default-5465c8b4db-8vm66" containerID="cri-o://0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0" Dec 04 22:19:42.552125 master-0 kubenswrapper[33572]: I1204 22:19:42.552151 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:42.552634 master-0 kubenswrapper[33572]: I1204 22:19:42.552217 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:19:42.552634 master-0 kubenswrapper[33572]: I1204 22:19:42.552457 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-mp4qx" Dec 04 22:19:42.552786 master-0 kubenswrapper[33572]: I1204 22:19:42.552730 33572 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-ingress/router-default-5465c8b4db-8vm66" containerID="cri-o://0036ee313e2e8fbc7aa4a79880a6b001a94f998abd62378bddfdf0a04bcdd8e0" Dec 04 22:19:42.552786 master-0 kubenswrapper[33572]: I1204 22:19:42.552779 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:42.553146 master-0 kubenswrapper[33572]: I1204 22:19:42.553095 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:42.553146 master-0 kubenswrapper[33572]: I1204 22:19:42.553137 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-8vm66" event={"ID":"c178afcf-b713-4c74-b22b-6169ba3123f5","Type":"ContainerStarted","Data":"702d291fa3743bbafacfa47a60ae14f3d5c999daf25d1da9f2f4ffc4616415ac"} Dec 04 22:19:42.553281 master-0 kubenswrapper[33572]: I1204 22:19:42.553164 33572 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:42.553521 master-0 kubenswrapper[33572]: I1204 22:19:42.553398 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:42.553521 master-0 kubenswrapper[33572]: I1204 22:19:42.553461 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:19:42.553657 master-0 kubenswrapper[33572]: I1204 22:19:42.553484 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-qlkgh" event={"ID":"addddaac-a31a-4dbf-b78f-87225b11b463","Type":"ContainerStarted","Data":"c913d2bd0998c498cae1649fcb2279986482eef15d34831152206fdcf37b9b66"} Dec 04 22:19:42.553657 master-0 kubenswrapper[33572]: I1204 22:19:42.553584 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-bslb5" Dec 04 22:19:42.553657 master-0 kubenswrapper[33572]: I1204 22:19:42.553621 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:42.553657 master-0 kubenswrapper[33572]: I1204 22:19:42.553655 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:42.554655 master-0 kubenswrapper[33572]: 
I1204 22:19:42.554576 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:42.554655 master-0 kubenswrapper[33572]: I1204 22:19:42.554618 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:42.554655 master-0 kubenswrapper[33572]: I1204 22:19:42.554658 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:42.554815 master-0 kubenswrapper[33572]: I1204 22:19:42.554675 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:42.554815 master-0 kubenswrapper[33572]: I1204 22:19:42.554702 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:19:42.554815 master-0 kubenswrapper[33572]: I1204 22:19:42.554727 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:42.554815 master-0 kubenswrapper[33572]: I1204 22:19:42.554807 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.554825 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.554841 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.554952 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.554981 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555015 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555029 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555042 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555067 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-85tbt" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555098 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555123 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555205 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555234 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6jkkl" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555248 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555314 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555361 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555382 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555398 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555414 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555427 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555437 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555454 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555476 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555556 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555596 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555623 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.555662 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-nxbjw" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.556456 33572 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-v7zfw" Dec 04 22:19:42.558627 master-0 kubenswrapper[33572]: I1204 22:19:42.556871 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-bqcf8" Dec 04 22:19:42.562851 master-0 kubenswrapper[33572]: I1204 22:19:42.561158 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-f797b99b6-m9m4h" Dec 04 22:19:42.562851 master-0 kubenswrapper[33572]: I1204 22:19:42.561967 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-fg6vx" Dec 04 22:19:42.562851 master-0 kubenswrapper[33572]: I1204 22:19:42.562036 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-7b4bc6c685-l6dfn" Dec 04 22:19:42.562851 master-0 kubenswrapper[33572]: I1204 22:19:42.562068 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:19:42.562851 master-0 kubenswrapper[33572]: I1204 22:19:42.562414 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:42.581538 master-0 kubenswrapper[33572]: I1204 22:19:42.578958 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:42.646543 master-0 kubenswrapper[33572]: I1204 22:19:42.646378 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:42.648970 master-0 kubenswrapper[33572]: I1204 22:19:42.648926 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8nxc5" Dec 04 22:19:42.655753 master-0 kubenswrapper[33572]: I1204 22:19:42.655723 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:42.705963 master-0 kubenswrapper[33572]: I1204 22:19:42.705852 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vvs9c" Dec 04 22:19:42.707667 master-0 kubenswrapper[33572]: I1204 22:19:42.707617 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vvs9c" Dec 04 22:19:43.007209 master-0 kubenswrapper[33572]: I1204 22:19:43.007037 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:43.008041 master-0 kubenswrapper[33572]: E1204 22:19:43.007204 33572 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:43.008041 master-0 kubenswrapper[33572]: E1204 22:19:43.007305 33572 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:43.008041 master-0 
kubenswrapper[33572]: E1204 22:19:43.007347 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access podName:dbe54b09-0399-4fbe-9f84-dd9dede0ab96 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:44.007330939 +0000 UTC m=+47.734856588 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access") pod "installer-3-master-0" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:43.207923 master-0 kubenswrapper[33572]: I1204 22:19:43.207856 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:43.212351 master-0 kubenswrapper[33572]: I1204 22:19:43.212318 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:43.214601 master-0 kubenswrapper[33572]: I1204 22:19:43.214492 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:43.405940 master-0 kubenswrapper[33572]: I1204 22:19:43.405869 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:43.406292 master-0 kubenswrapper[33572]: I1204 22:19:43.406222 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5465c8b4db-8vm66" Dec 04 22:19:43.651549 master-0 kubenswrapper[33572]: I1204 22:19:43.651391 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=6.651354717 podStartE2EDuration="6.651354717s" podCreationTimestamp="2025-12-04 22:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:19:43.648181909 +0000 UTC m=+47.375707598" watchObservedRunningTime="2025-12-04 22:19:43.651354717 +0000 UTC m=+47.378880406" Dec 04 22:19:44.047136 master-0 kubenswrapper[33572]: I1204 22:19:44.047067 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:44.048019 master-0 kubenswrapper[33572]: E1204 22:19:44.047341 33572 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:44.048019 master-0 kubenswrapper[33572]: E1204 22:19:44.047402 33572 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:44.048019 master-0 kubenswrapper[33572]: E1204 22:19:44.047544 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access podName:dbe54b09-0399-4fbe-9f84-dd9dede0ab96 nodeName:}" failed. 
No retries permitted until 2025-12-04 22:19:46.047475595 +0000 UTC m=+49.775001284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access") pod "installer-3-master-0" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:44.055888 master-0 kubenswrapper[33572]: I1204 22:19:44.055185 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=7.055148599 podStartE2EDuration="7.055148599s" podCreationTimestamp="2025-12-04 22:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:19:44.025480561 +0000 UTC m=+47.753006250" watchObservedRunningTime="2025-12-04 22:19:44.055148599 +0000 UTC m=+47.782674288" Dec 04 22:19:46.011770 master-0 kubenswrapper[33572]: I1204 22:19:46.011689 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5dd7b479dd-5z246"] Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012123 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012154 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012201 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986a4de7-3a54-48dc-9599-49cf19ba0ad5" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012214 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="986a4de7-3a54-48dc-9599-49cf19ba0ad5" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012250 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9fbd90-66d5-4637-9821-22242aa6f6d7" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012264 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9fbd90-66d5-4637-9821-22242aa6f6d7" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012295 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012308 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012344 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" containerName="collect-profiles" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012356 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" containerName="collect-profiles" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012382 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6da420-9631-4bce-b238-96ab361e23e9" containerName="collect-profiles" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 
22:19:46.012396 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6da420-9631-4bce-b238-96ab361e23e9" containerName="collect-profiles" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012435 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3099f7b5-f904-4d15-aedb-f4e558b813e4" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012453 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3099f7b5-f904-4d15-aedb-f4e558b813e4" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012535 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e011b0a-89e2-47e3-9112-d46a828416b1" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012551 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e011b0a-89e2-47e3-9112-d46a828416b1" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012570 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012582 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012626 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: I1204 22:19:46.012644 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" containerName="installer" Dec 04 22:19:46.012646 master-0 kubenswrapper[33572]: E1204 22:19:46.012684 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbe54b09-0399-4fbe-9f84-dd9dede0ab96" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.012699 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbe54b09-0399-4fbe-9f84-dd9dede0ab96" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.012918 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbe54b09-0399-4fbe-9f84-dd9dede0ab96" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.012945 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013017 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="0791dc66-67d9-42bd-b7c3-d45dc5513c3b" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013038 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e011b0a-89e2-47e3-9112-d46a828416b1" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013052 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="da6da420-9631-4bce-b238-96ab361e23e9" containerName="collect-profiles" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013069 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9f6dd3-32d4-40e4-a550-f0bbfe31eeba" containerName="installer" Dec 04 22:19:46.013991 
master-0 kubenswrapper[33572]: I1204 22:19:46.013095 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9fbd90-66d5-4637-9821-22242aa6f6d7" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013110 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" containerName="collect-profiles" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013129 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="986a4de7-3a54-48dc-9599-49cf19ba0ad5" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013155 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="3099f7b5-f904-4d15-aedb-f4e558b813e4" containerName="installer" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013175 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9160fec1-743a-470e-b48f-95a7ddf1c0b2" containerName="assisted-installer-controller" Dec 04 22:19:46.013991 master-0 kubenswrapper[33572]: I1204 22:19:46.013886 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.017486 master-0 kubenswrapper[33572]: I1204 22:19:46.017426 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 22:19:46.020092 master-0 kubenswrapper[33572]: I1204 22:19:46.020028 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 22:19:46.020215 master-0 kubenswrapper[33572]: I1204 22:19:46.020129 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-qjgpp" Dec 04 22:19:46.020286 master-0 kubenswrapper[33572]: I1204 22:19:46.020131 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 22:19:46.020360 master-0 kubenswrapper[33572]: I1204 22:19:46.020309 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 22:19:46.020946 master-0 kubenswrapper[33572]: I1204 22:19:46.020899 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 22:19:46.032101 master-0 kubenswrapper[33572]: I1204 22:19:46.032008 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 22:19:46.032400 master-0 kubenswrapper[33572]: I1204 22:19:46.032334 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 22:19:46.032478 master-0 kubenswrapper[33572]: I1204 22:19:46.032404 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 22:19:46.032478 master-0 kubenswrapper[33572]: I1204 22:19:46.032343 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 22:19:46.032613 master-0 kubenswrapper[33572]: I1204 22:19:46.032468 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 22:19:46.048541 master-0 kubenswrapper[33572]: 
I1204 22:19:46.045973 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 22:19:46.055948 master-0 kubenswrapper[33572]: I1204 22:19:46.054734 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 22:19:46.060842 master-0 kubenswrapper[33572]: I1204 22:19:46.060760 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5dd7b479dd-5z246"] Dec 04 22:19:46.076595 master-0 kubenswrapper[33572]: I1204 22:19:46.076496 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 22:19:46.086459 master-0 kubenswrapper[33572]: I1204 22:19:46.086334 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.086728 master-0 kubenswrapper[33572]: I1204 22:19:46.086495 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.086728 master-0 kubenswrapper[33572]: I1204 22:19:46.086576 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-error\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.086728 master-0 kubenswrapper[33572]: I1204 22:19:46.086607 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-audit-policies\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.086728 master-0 kubenswrapper[33572]: I1204 22:19:46.086633 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-session\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.086728 master-0 kubenswrapper[33572]: I1204 22:19:46.086675 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " 
pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.086953 master-0 kubenswrapper[33572]: I1204 22:19:46.086817 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.086991 master-0 kubenswrapper[33572]: I1204 22:19:46.086959 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-login\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.087052 master-0 kubenswrapper[33572]: I1204 22:19:46.087022 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.087096 master-0 kubenswrapper[33572]: I1204 22:19:46.087071 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.087147 master-0 kubenswrapper[33572]: I1204 22:19:46.087116 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d685460a-9a74-411d-983a-79af235b2cc0-audit-dir\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.087195 master-0 kubenswrapper[33572]: I1204 22:19:46.087167 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8s2j\" (UniqueName: \"kubernetes.io/projected/d685460a-9a74-411d-983a-79af235b2cc0-kube-api-access-l8s2j\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.087234 master-0 kubenswrapper[33572]: I1204 22:19:46.087213 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.087298 master-0 kubenswrapper[33572]: I1204 22:19:46.087273 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:46.087607 master-0 kubenswrapper[33572]: E1204 22:19:46.087572 33572 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:46.087935 master-0 kubenswrapper[33572]: E1204 22:19:46.087903 33572 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:46.088084 master-0 kubenswrapper[33572]: E1204 22:19:46.088036 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access podName:dbe54b09-0399-4fbe-9f84-dd9dede0ab96 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:50.088002787 +0000 UTC m=+53.815528606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access") pod "installer-3-master-0" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:46.188752 master-0 kubenswrapper[33572]: I1204 22:19:46.188669 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.188752 master-0 kubenswrapper[33572]: I1204 22:19:46.188748 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-login\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.188752 master-0 kubenswrapper[33572]: I1204 22:19:46.188777 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188801 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188818 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/d685460a-9a74-411d-983a-79af235b2cc0-audit-dir\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188837 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8s2j\" (UniqueName: \"kubernetes.io/projected/d685460a-9a74-411d-983a-79af235b2cc0-kube-api-access-l8s2j\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188856 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188904 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188927 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188944 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-error\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188962 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-audit-policies\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.188981 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-session\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189164 master-0 kubenswrapper[33572]: I1204 22:19:46.189003 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.189895 master-0 kubenswrapper[33572]: I1204 22:19:46.189858 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-service-ca\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.190235 master-0 kubenswrapper[33572]: I1204 22:19:46.190188 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d685460a-9a74-411d-983a-79af235b2cc0-audit-dir\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.190325 master-0 kubenswrapper[33572]: I1204 22:19:46.190244 33572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 22:19:46.191141 master-0 kubenswrapper[33572]: I1204 22:19:46.191095 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.191862 master-0 kubenswrapper[33572]: I1204 22:19:46.191786 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-audit-policies\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.192275 master-0 kubenswrapper[33572]: I1204 22:19:46.192157 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.194780 master-0 kubenswrapper[33572]: I1204 22:19:46.194742 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-session\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.194780 master-0 kubenswrapper[33572]: I1204 22:19:46.194760 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: 
\"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.195443 master-0 kubenswrapper[33572]: I1204 22:19:46.195409 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.195994 master-0 kubenswrapper[33572]: I1204 22:19:46.195894 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-error\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.196216 master-0 kubenswrapper[33572]: I1204 22:19:46.196152 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-login\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.196705 master-0 kubenswrapper[33572]: I1204 22:19:46.196675 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-router-certs\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.197562 master-0 kubenswrapper[33572]: I1204 22:19:46.197488 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.207720 master-0 kubenswrapper[33572]: I1204 22:19:46.207668 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8s2j\" (UniqueName: \"kubernetes.io/projected/d685460a-9a74-411d-983a-79af235b2cc0-kube-api-access-l8s2j\") pod \"oauth-openshift-5dd7b479dd-5z246\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.376972 master-0 kubenswrapper[33572]: I1204 22:19:46.376901 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:46.512769 master-0 kubenswrapper[33572]: I1204 22:19:46.512716 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-58574fc8d8-gg42x" Dec 04 22:19:46.900175 master-0 kubenswrapper[33572]: I1204 22:19:46.900123 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5dd7b479dd-5z246"] Dec 04 22:19:46.903928 master-0 kubenswrapper[33572]: W1204 22:19:46.903874 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd685460a_9a74_411d_983a_79af235b2cc0.slice/crio-8c6f9f0ddb838fa2eccda391b140ed7bd936b5b53875b4276b92e13799a7b546 WatchSource:0}: Error finding container 8c6f9f0ddb838fa2eccda391b140ed7bd936b5b53875b4276b92e13799a7b546: Status 404 returned error can't find the container with id 8c6f9f0ddb838fa2eccda391b140ed7bd936b5b53875b4276b92e13799a7b546 Dec 04 22:19:46.906592 master-0 kubenswrapper[33572]: I1204 22:19:46.906549 33572 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 22:19:47.377557 master-0 kubenswrapper[33572]: I1204 22:19:47.377450 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-8db7f8d79-rlqbz" Dec 04 22:19:47.451925 master-0 kubenswrapper[33572]: I1204 22:19:47.451844 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" event={"ID":"d685460a-9a74-411d-983a-79af235b2cc0","Type":"ContainerStarted","Data":"8c6f9f0ddb838fa2eccda391b140ed7bd936b5b53875b4276b92e13799a7b546"} Dec 04 22:19:48.205411 master-0 kubenswrapper[33572]: I1204 22:19:48.205338 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:48.206358 master-0 kubenswrapper[33572]: I1204 22:19:48.206318 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:48.216415 master-0 kubenswrapper[33572]: I1204 22:19:48.216344 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:19:48.224930 master-0 kubenswrapper[33572]: I1204 22:19:48.224881 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 04 22:19:48.466415 master-0 kubenswrapper[33572]: I1204 22:19:48.466254 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:19:49.106628 master-0 kubenswrapper[33572]: I1204 22:19:49.099554 33572 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 04 22:19:49.106628 master-0 kubenswrapper[33572]: I1204 22:19:49.100437 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="a906debd0c35952850935aee2d607cce" containerName="startup-monitor" containerID="cri-o://1454b21a3ef5677a973a0198487142abc699efae2e6eaece0bc51065261cbdf5" gracePeriod=5 Dec 04 22:19:50.169542 master-0 kubenswrapper[33572]: I1204 22:19:50.169401 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:50.170707 master-0 kubenswrapper[33572]: E1204 22:19:50.169659 33572 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:50.170707 master-0 kubenswrapper[33572]: E1204 22:19:50.169706 33572 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:50.170707 master-0 kubenswrapper[33572]: E1204 22:19:50.169799 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access podName:dbe54b09-0399-4fbe-9f84-dd9dede0ab96 nodeName:}" failed. No retries permitted until 2025-12-04 22:19:58.169772721 +0000 UTC m=+61.897298400 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access") pod "installer-3-master-0" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:50.479000 master-0 kubenswrapper[33572]: I1204 22:19:50.478833 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" event={"ID":"d685460a-9a74-411d-983a-79af235b2cc0","Type":"ContainerStarted","Data":"9496356901a9f29e38cb4b8ee6ac1abc544307a1b801d9b2ead4cd1140645187"} Dec 04 22:19:50.481152 master-0 kubenswrapper[33572]: I1204 22:19:50.481106 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:50.487998 master-0 kubenswrapper[33572]: I1204 22:19:50.487951 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:19:50.513881 master-0 kubenswrapper[33572]: I1204 22:19:50.513787 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" podStartSLOduration=3.058826543 podStartE2EDuration="5.513766413s" podCreationTimestamp="2025-12-04 22:19:45 +0000 UTC" firstStartedPulling="2025-12-04 22:19:46.906347351 +0000 UTC m=+50.633873040" lastFinishedPulling="2025-12-04 22:19:49.361287261 +0000 UTC m=+53.088812910" observedRunningTime="2025-12-04 22:19:50.510650626 +0000 UTC m=+54.238176285" watchObservedRunningTime="2025-12-04 22:19:50.513766413 +0000 UTC m=+54.241292072" Dec 04 22:19:51.272309 master-0 kubenswrapper[33572]: I1204 22:19:51.272202 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:19:51.582375 master-0 kubenswrapper[33572]: I1204 22:19:51.582191 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sdrkm" Dec 04 22:19:51.584103 master-0 kubenswrapper[33572]: I1204 22:19:51.584036 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vvkjf" Dec 04 
22:19:51.889113 master-0 kubenswrapper[33572]: I1204 22:19:51.888842 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zt44t" Dec 04 22:19:54.520210 master-0 kubenswrapper[33572]: I1204 22:19:54.520135 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a906debd0c35952850935aee2d607cce/startup-monitor/0.log" Dec 04 22:19:54.521048 master-0 kubenswrapper[33572]: I1204 22:19:54.520231 33572 generic.go:334] "Generic (PLEG): container finished" podID="a906debd0c35952850935aee2d607cce" containerID="1454b21a3ef5677a973a0198487142abc699efae2e6eaece0bc51065261cbdf5" exitCode=137 Dec 04 22:19:54.679871 master-0 kubenswrapper[33572]: I1204 22:19:54.679783 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a906debd0c35952850935aee2d607cce/startup-monitor/0.log" Dec 04 22:19:54.680145 master-0 kubenswrapper[33572]: I1204 22:19:54.679973 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:54.697492 master-0 kubenswrapper[33572]: I1204 22:19:54.697422 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 04 22:19:54.697726 master-0 kubenswrapper[33572]: I1204 22:19:54.697613 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests" (OuterVolumeSpecName: "manifests") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:19:54.697726 master-0 kubenswrapper[33572]: I1204 22:19:54.697706 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 04 22:19:54.697889 master-0 kubenswrapper[33572]: I1204 22:19:54.697835 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:19:54.698906 master-0 kubenswrapper[33572]: I1204 22:19:54.698828 33572 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") on node \"master-0\" DevicePath \"\"" Dec 04 22:19:54.698906 master-0 kubenswrapper[33572]: I1204 22:19:54.698905 33572 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:19:54.800029 master-0 kubenswrapper[33572]: I1204 22:19:54.799821 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 04 22:19:54.800029 master-0 kubenswrapper[33572]: I1204 22:19:54.800002 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 04 22:19:54.800487 master-0 kubenswrapper[33572]: I1204 22:19:54.800034 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock" (OuterVolumeSpecName: "var-lock") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:19:54.800843 master-0 kubenswrapper[33572]: I1204 22:19:54.800766 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 04 22:19:54.801015 master-0 kubenswrapper[33572]: I1204 22:19:54.800957 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log" (OuterVolumeSpecName: "var-log") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:19:54.801374 master-0 kubenswrapper[33572]: I1204 22:19:54.801328 33572 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:19:54.801374 master-0 kubenswrapper[33572]: I1204 22:19:54.801360 33572 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") on node \"master-0\" DevicePath \"\"" Dec 04 22:19:54.808738 master-0 kubenswrapper[33572]: I1204 22:19:54.808646 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:19:54.903051 master-0 kubenswrapper[33572]: I1204 22:19:54.902596 33572 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:19:55.532245 master-0 kubenswrapper[33572]: I1204 22:19:55.532163 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a906debd0c35952850935aee2d607cce/startup-monitor/0.log" Dec 04 22:19:55.532245 master-0 kubenswrapper[33572]: I1204 22:19:55.532257 33572 scope.go:117] "RemoveContainer" containerID="1454b21a3ef5677a973a0198487142abc699efae2e6eaece0bc51065261cbdf5" Dec 04 22:19:55.533198 master-0 kubenswrapper[33572]: I1204 22:19:55.532413 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:19:56.544753 master-0 kubenswrapper[33572]: I1204 22:19:56.544675 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a906debd0c35952850935aee2d607cce" path="/var/lib/kubelet/pods/a906debd0c35952850935aee2d607cce/volumes" Dec 04 22:19:56.545366 master-0 kubenswrapper[33572]: I1204 22:19:56.545108 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Dec 04 22:19:56.581393 master-0 kubenswrapper[33572]: I1204 22:19:56.567135 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 04 22:19:56.581393 master-0 kubenswrapper[33572]: I1204 22:19:56.567216 33572 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f0c61551-2c23-4f4c-82e9-661de389bed6" Dec 04 22:19:56.581393 master-0 kubenswrapper[33572]: I1204 22:19:56.571942 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 04 22:19:56.581393 master-0 kubenswrapper[33572]: I1204 22:19:56.572006 33572 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f0c61551-2c23-4f4c-82e9-661de389bed6" Dec 04 22:19:58.253917 master-0 kubenswrapper[33572]: I1204 22:19:58.253734 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:19:58.253917 master-0 kubenswrapper[33572]: E1204 22:19:58.253916 33572 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:58.254777 master-0 kubenswrapper[33572]: E1204 22:19:58.253937 33572 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:19:58.254777 master-0 kubenswrapper[33572]: E1204 22:19:58.253995 33572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access podName:dbe54b09-0399-4fbe-9f84-dd9dede0ab96 nodeName:}" failed. No retries permitted until 2025-12-04 22:20:14.253979933 +0000 UTC m=+77.981505582 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access") pod "installer-3-master-0" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:20:02.137573 master-0 kubenswrapper[33572]: I1204 22:20:02.137474 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:20:14.343907 master-0 kubenswrapper[33572]: I1204 22:20:14.343765 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:20:14.344972 master-0 kubenswrapper[33572]: E1204 22:20:14.344173 33572 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:20:14.344972 master-0 kubenswrapper[33572]: E1204 22:20:14.344265 33572 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:20:14.344972 master-0 kubenswrapper[33572]: E1204 22:20:14.344389 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access podName:dbe54b09-0399-4fbe-9f84-dd9dede0ab96 nodeName:}" failed. No retries permitted until 2025-12-04 22:20:46.344346721 +0000 UTC m=+110.071872410 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access") pod "installer-3-master-0" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 04 22:20:19.227522 master-0 kubenswrapper[33572]: I1204 22:20:19.227414 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 04 22:20:19.228380 master-0 kubenswrapper[33572]: E1204 22:20:19.227979 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a906debd0c35952850935aee2d607cce" containerName="startup-monitor" Dec 04 22:20:19.228380 master-0 kubenswrapper[33572]: I1204 22:20:19.228002 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a906debd0c35952850935aee2d607cce" containerName="startup-monitor" Dec 04 22:20:19.228380 master-0 kubenswrapper[33572]: I1204 22:20:19.228194 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="a906debd0c35952850935aee2d607cce" containerName="startup-monitor" Dec 04 22:20:19.228921 master-0 kubenswrapper[33572]: I1204 22:20:19.228867 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.231936 master-0 kubenswrapper[33572]: I1204 22:20:19.231872 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-ckmlf" Dec 04 22:20:19.231936 master-0 kubenswrapper[33572]: I1204 22:20:19.231914 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 22:20:19.247636 master-0 kubenswrapper[33572]: I1204 22:20:19.244655 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 04 22:20:19.425652 master-0 kubenswrapper[33572]: I1204 22:20:19.425563 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.426001 master-0 kubenswrapper[33572]: I1204 22:20:19.425795 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.426001 master-0 kubenswrapper[33572]: I1204 22:20:19.425858 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-var-lock\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.528889 master-0 kubenswrapper[33572]: I1204 22:20:19.528121 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.529398 master-0 kubenswrapper[33572]: I1204 22:20:19.529340 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.529565 master-0 kubenswrapper[33572]: I1204 22:20:19.529401 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-var-lock\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.529688 master-0 kubenswrapper[33572]: I1204 22:20:19.529569 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.529774 master-0 kubenswrapper[33572]: I1204 
22:20:19.529671 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-var-lock\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.559814 master-0 kubenswrapper[33572]: I1204 22:20:19.559713 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kube-api-access\") pod \"installer-4-master-0\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:19.564884 master-0 kubenswrapper[33572]: I1204 22:20:19.564811 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:20:20.049594 master-0 kubenswrapper[33572]: I1204 22:20:20.049516 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 04 22:20:20.054659 master-0 kubenswrapper[33572]: W1204 22:20:20.054595 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd8307846_d1cf_4357_bcc0_b3531d34dc8b.slice/crio-6a024729afee87c3e33b7e75781752b2a815d9566b6738f78b14f4287c197966 WatchSource:0}: Error finding container 6a024729afee87c3e33b7e75781752b2a815d9566b6738f78b14f4287c197966: Status 404 returned error can't find the container with id 6a024729afee87c3e33b7e75781752b2a815d9566b6738f78b14f4287c197966 Dec 04 22:20:20.777011 master-0 kubenswrapper[33572]: I1204 22:20:20.776899 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"d8307846-d1cf-4357-bcc0-b3531d34dc8b","Type":"ContainerStarted","Data":"9b6f3d6390ba26ec726db6df56392bbfde00022e8a480e77e2a9b36e93f4d416"} Dec 04 22:20:20.777011 master-0 kubenswrapper[33572]: I1204 22:20:20.776975 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"d8307846-d1cf-4357-bcc0-b3531d34dc8b","Type":"ContainerStarted","Data":"6a024729afee87c3e33b7e75781752b2a815d9566b6738f78b14f4287c197966"} Dec 04 22:20:20.812548 master-0 kubenswrapper[33572]: I1204 22:20:20.808460 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=1.808381077 podStartE2EDuration="1.808381077s" podCreationTimestamp="2025-12-04 22:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:20:20.802913754 +0000 UTC m=+84.530439433" watchObservedRunningTime="2025-12-04 22:20:20.808381077 +0000 UTC m=+84.535906766" Dec 04 22:20:46.380757 master-0 kubenswrapper[33572]: I1204 22:20:46.380393 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:20:46.385198 master-0 kubenswrapper[33572]: I1204 22:20:46.385134 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") 
pod \"installer-3-master-0\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 04 22:20:46.481490 master-0 kubenswrapper[33572]: I1204 22:20:46.481399 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") pod \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\" (UID: \"dbe54b09-0399-4fbe-9f84-dd9dede0ab96\") " Dec 04 22:20:46.485252 master-0 kubenswrapper[33572]: I1204 22:20:46.485151 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dbe54b09-0399-4fbe-9f84-dd9dede0ab96" (UID: "dbe54b09-0399-4fbe-9f84-dd9dede0ab96"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:20:46.585605 master-0 kubenswrapper[33572]: I1204 22:20:46.585348 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dbe54b09-0399-4fbe-9f84-dd9dede0ab96-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:20:58.698931 master-0 kubenswrapper[33572]: I1204 22:20:58.698868 33572 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 04 22:20:58.700206 master-0 kubenswrapper[33572]: I1204 22:20:58.699825 33572 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 04 22:20:58.700206 master-0 kubenswrapper[33572]: I1204 22:20:58.699960 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.700412 master-0 kubenswrapper[33572]: I1204 22:20:58.700372 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver" containerID="cri-o://0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c" gracePeriod=15 Dec 04 22:20:58.700465 master-0 kubenswrapper[33572]: I1204 22:20:58.700392 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d" gracePeriod=15 Dec 04 22:20:58.700465 master-0 kubenswrapper[33572]: I1204 22:20:58.700434 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-syncer" containerID="cri-o://af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321" gracePeriod=15 Dec 04 22:20:58.700549 master-0 kubenswrapper[33572]: I1204 22:20:58.700386 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659" gracePeriod=15 Dec 04 22:20:58.700590 master-0 kubenswrapper[33572]: I1204 22:20:58.700546 33572 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65" gracePeriod=15 Dec 04 22:20:58.701627 master-0 kubenswrapper[33572]: I1204 22:20:58.701300 33572 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 04 22:20:58.701677 master-0 kubenswrapper[33572]: E1204 22:20:58.701633 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 04 22:20:58.701677 master-0 kubenswrapper[33572]: I1204 22:20:58.701654 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 04 22:20:58.701776 master-0 kubenswrapper[33572]: E1204 22:20:58.701685 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 04 22:20:58.701776 master-0 kubenswrapper[33572]: I1204 22:20:58.701700 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 04 22:20:58.701776 master-0 kubenswrapper[33572]: E1204 22:20:58.701735 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver" Dec 04 22:20:58.701776 master-0 kubenswrapper[33572]: I1204 22:20:58.701748 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver" Dec 04 22:20:58.701934 master-0 kubenswrapper[33572]: E1204 22:20:58.701784 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-syncer" Dec 04 22:20:58.701934 master-0 kubenswrapper[33572]: I1204 22:20:58.701795 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-syncer" Dec 04 22:20:58.701934 master-0 kubenswrapper[33572]: E1204 22:20:58.701809 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="setup" Dec 04 22:20:58.701934 master-0 kubenswrapper[33572]: I1204 22:20:58.701817 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="setup" Dec 04 22:20:58.701934 master-0 kubenswrapper[33572]: E1204 22:20:58.701828 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 22:20:58.701934 master-0 kubenswrapper[33572]: I1204 22:20:58.701842 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 22:20:58.701934 master-0 kubenswrapper[33572]: E1204 22:20:58.701864 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-insecure-readyz" Dec 04 22:20:58.701934 master-0 kubenswrapper[33572]: I1204 22:20:58.701872 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" 
containerName="kube-apiserver-insecure-readyz" Dec 04 22:20:58.702168 master-0 kubenswrapper[33572]: I1204 22:20:58.702021 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-insecure-readyz" Dec 04 22:20:58.702168 master-0 kubenswrapper[33572]: I1204 22:20:58.702059 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 04 22:20:58.702168 master-0 kubenswrapper[33572]: I1204 22:20:58.702078 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 04 22:20:58.702168 master-0 kubenswrapper[33572]: I1204 22:20:58.702103 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver" Dec 04 22:20:58.702168 master-0 kubenswrapper[33572]: I1204 22:20:58.702114 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-syncer" Dec 04 22:20:58.702168 master-0 kubenswrapper[33572]: I1204 22:20:58.702126 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 22:20:58.758334 master-0 kubenswrapper[33572]: I1204 22:20:58.758257 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:20:58.758334 master-0 kubenswrapper[33572]: I1204 22:20:58.758347 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.758817 master-0 kubenswrapper[33572]: I1204 22:20:58.758378 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.758817 master-0 kubenswrapper[33572]: I1204 22:20:58.758410 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.758817 master-0 kubenswrapper[33572]: I1204 22:20:58.758428 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 
04 22:20:58.758817 master-0 kubenswrapper[33572]: I1204 22:20:58.758448 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.758817 master-0 kubenswrapper[33572]: I1204 22:20:58.758466 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.758817 master-0 kubenswrapper[33572]: I1204 22:20:58.758533 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:20:58.861035 master-0 kubenswrapper[33572]: I1204 22:20:58.860978 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.861164 master-0 kubenswrapper[33572]: I1204 22:20:58.861056 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.861164 master-0 kubenswrapper[33572]: I1204 22:20:58.861102 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.861164 master-0 kubenswrapper[33572]: I1204 22:20:58.861135 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:20:58.861288 master-0 kubenswrapper[33572]: I1204 22:20:58.861172 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.861288 master-0 kubenswrapper[33572]: I1204 22:20:58.861205 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.861288 master-0 kubenswrapper[33572]: I1204 22:20:58.861250 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:20:58.861288 master-0 kubenswrapper[33572]: I1204 22:20:58.861281 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:20:58.861444 master-0 kubenswrapper[33572]: I1204 22:20:58.861420 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:20:58.866629 master-0 kubenswrapper[33572]: I1204 22:20:58.861484 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.866629 master-0 kubenswrapper[33572]: I1204 22:20:58.861585 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.866629 master-0 kubenswrapper[33572]: I1204 22:20:58.861623 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.866629 master-0 kubenswrapper[33572]: I1204 22:20:58.861659 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:20:58.866629 master-0 kubenswrapper[33572]: I1204 22:20:58.861700 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.866629 master-0 kubenswrapper[33572]: I1204 
22:20:58.861735 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:20:58.866629 master-0 kubenswrapper[33572]: I1204 22:20:58.861769 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/33faf0280764958ffeecc2dce44a9bfc-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"33faf0280764958ffeecc2dce44a9bfc\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:20:59.131170 master-0 kubenswrapper[33572]: I1204 22:20:59.130801 33572 generic.go:334] "Generic (PLEG): container finished" podID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" containerID="9b6f3d6390ba26ec726db6df56392bbfde00022e8a480e77e2a9b36e93f4d416" exitCode=0 Dec 04 22:20:59.131170 master-0 kubenswrapper[33572]: I1204 22:20:59.131080 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"d8307846-d1cf-4357-bcc0-b3531d34dc8b","Type":"ContainerDied","Data":"9b6f3d6390ba26ec726db6df56392bbfde00022e8a480e77e2a9b36e93f4d416"} Dec 04 22:20:59.132982 master-0 kubenswrapper[33572]: I1204 22:20:59.132887 33572 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:20:59.134057 master-0 kubenswrapper[33572]: I1204 22:20:59.133987 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:20:59.136131 master-0 kubenswrapper[33572]: I1204 22:20:59.136062 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-check-endpoints/0.log" Dec 04 22:20:59.138882 master-0 kubenswrapper[33572]: I1204 22:20:59.138828 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-cert-syncer/0.log" Dec 04 22:20:59.140142 master-0 kubenswrapper[33572]: I1204 22:20:59.140070 33572 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659" exitCode=0 Dec 04 22:20:59.140142 master-0 kubenswrapper[33572]: I1204 22:20:59.140117 33572 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65" exitCode=0 Dec 04 22:20:59.140142 master-0 kubenswrapper[33572]: I1204 22:20:59.140138 33572 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d" exitCode=0 Dec 04 22:20:59.140444 master-0 kubenswrapper[33572]: I1204 
22:20:59.140161 33572 scope.go:117] "RemoveContainer" containerID="5fcdcfec6584d4b81237f292d9431a4c2eccc32a97cb24f1ad54b8dd23d476bb" Dec 04 22:20:59.140444 master-0 kubenswrapper[33572]: I1204 22:20:59.140174 33572 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321" exitCode=2 Dec 04 22:21:00.153660 master-0 kubenswrapper[33572]: I1204 22:21:00.153605 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-cert-syncer/0.log" Dec 04 22:21:00.670758 master-0 kubenswrapper[33572]: I1204 22:21:00.670670 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:21:00.672735 master-0 kubenswrapper[33572]: I1204 22:21:00.672650 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:00.799643 master-0 kubenswrapper[33572]: I1204 22:21:00.797694 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kube-api-access\") pod \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " Dec 04 22:21:00.799643 master-0 kubenswrapper[33572]: I1204 22:21:00.797802 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kubelet-dir\") pod \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " Dec 04 22:21:00.799643 master-0 kubenswrapper[33572]: I1204 22:21:00.798046 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-var-lock\") pod \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\" (UID: \"d8307846-d1cf-4357-bcc0-b3531d34dc8b\") " Dec 04 22:21:00.801731 master-0 kubenswrapper[33572]: I1204 22:21:00.801666 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d8307846-d1cf-4357-bcc0-b3531d34dc8b" (UID: "d8307846-d1cf-4357-bcc0-b3531d34dc8b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:00.811450 master-0 kubenswrapper[33572]: I1204 22:21:00.807659 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-var-lock" (OuterVolumeSpecName: "var-lock") pod "d8307846-d1cf-4357-bcc0-b3531d34dc8b" (UID: "d8307846-d1cf-4357-bcc0-b3531d34dc8b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:00.811830 master-0 kubenswrapper[33572]: I1204 22:21:00.811577 33572 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:00.811830 master-0 kubenswrapper[33572]: I1204 22:21:00.811667 33572 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:00.829757 master-0 kubenswrapper[33572]: I1204 22:21:00.829676 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d8307846-d1cf-4357-bcc0-b3531d34dc8b" (UID: "d8307846-d1cf-4357-bcc0-b3531d34dc8b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:21:00.914130 master-0 kubenswrapper[33572]: I1204 22:21:00.913988 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8307846-d1cf-4357-bcc0-b3531d34dc8b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:01.089826 master-0 kubenswrapper[33572]: I1204 22:21:01.089762 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-cert-syncer/0.log" Dec 04 22:21:01.090748 master-0 kubenswrapper[33572]: I1204 22:21:01.090709 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:01.092583 master-0 kubenswrapper[33572]: I1204 22:21:01.092491 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:01.093346 master-0 kubenswrapper[33572]: I1204 22:21:01.093282 33572 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:01.117140 master-0 kubenswrapper[33572]: I1204 22:21:01.117034 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"b89698aa356a3bc32694e2b098f9a900\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " Dec 04 22:21:01.117392 master-0 kubenswrapper[33572]: I1204 22:21:01.117153 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b89698aa356a3bc32694e2b098f9a900" (UID: "b89698aa356a3bc32694e2b098f9a900"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:01.117392 master-0 kubenswrapper[33572]: I1204 22:21:01.117175 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"b89698aa356a3bc32694e2b098f9a900\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " Dec 04 22:21:01.117392 master-0 kubenswrapper[33572]: I1204 22:21:01.117226 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"b89698aa356a3bc32694e2b098f9a900\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " Dec 04 22:21:01.117392 master-0 kubenswrapper[33572]: I1204 22:21:01.117279 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b89698aa356a3bc32694e2b098f9a900" (UID: "b89698aa356a3bc32694e2b098f9a900"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:01.117392 master-0 kubenswrapper[33572]: I1204 22:21:01.117369 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "b89698aa356a3bc32694e2b098f9a900" (UID: "b89698aa356a3bc32694e2b098f9a900"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:01.117938 master-0 kubenswrapper[33572]: I1204 22:21:01.117888 33572 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:01.117987 master-0 kubenswrapper[33572]: I1204 22:21:01.117942 33572 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:01.117987 master-0 kubenswrapper[33572]: I1204 22:21:01.117968 33572 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:01.165738 master-0 kubenswrapper[33572]: I1204 22:21:01.165539 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-cert-syncer/0.log" Dec 04 22:21:01.166576 master-0 kubenswrapper[33572]: I1204 22:21:01.166283 33572 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c" exitCode=0 Dec 04 22:21:01.166576 master-0 kubenswrapper[33572]: I1204 22:21:01.166445 33572 scope.go:117] "RemoveContainer" containerID="0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659" Dec 04 22:21:01.166576 master-0 kubenswrapper[33572]: I1204 22:21:01.166531 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:01.169093 master-0 kubenswrapper[33572]: I1204 22:21:01.169006 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"d8307846-d1cf-4357-bcc0-b3531d34dc8b","Type":"ContainerDied","Data":"6a024729afee87c3e33b7e75781752b2a815d9566b6738f78b14f4287c197966"} Dec 04 22:21:01.169335 master-0 kubenswrapper[33572]: I1204 22:21:01.169288 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a024729afee87c3e33b7e75781752b2a815d9566b6738f78b14f4287c197966" Dec 04 22:21:01.169490 master-0 kubenswrapper[33572]: I1204 22:21:01.169142 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 04 22:21:01.201438 master-0 kubenswrapper[33572]: I1204 22:21:01.200898 33572 scope.go:117] "RemoveContainer" containerID="8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65" Dec 04 22:21:01.205526 master-0 kubenswrapper[33572]: I1204 22:21:01.205421 33572 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:01.207451 master-0 kubenswrapper[33572]: I1204 22:21:01.207393 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:01.209750 master-0 kubenswrapper[33572]: I1204 22:21:01.209657 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:01.210855 master-0 kubenswrapper[33572]: I1204 22:21:01.210783 33572 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:01.220079 master-0 kubenswrapper[33572]: I1204 22:21:01.220027 33572 scope.go:117] "RemoveContainer" containerID="e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d" Dec 04 22:21:01.241057 master-0 kubenswrapper[33572]: I1204 22:21:01.240797 33572 scope.go:117] "RemoveContainer" containerID="af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321" Dec 04 22:21:01.263409 master-0 kubenswrapper[33572]: I1204 22:21:01.263262 33572 scope.go:117] "RemoveContainer" containerID="0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c" Dec 04 22:21:01.281888 master-0 kubenswrapper[33572]: I1204 22:21:01.281825 33572 scope.go:117] "RemoveContainer" containerID="84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f" Dec 04 22:21:01.309597 master-0 kubenswrapper[33572]: I1204 
22:21:01.309552 33572 scope.go:117] "RemoveContainer" containerID="0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659" Dec 04 22:21:01.310539 master-0 kubenswrapper[33572]: E1204 22:21:01.310463 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659\": container with ID starting with 0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659 not found: ID does not exist" containerID="0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659" Dec 04 22:21:01.310642 master-0 kubenswrapper[33572]: I1204 22:21:01.310557 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659"} err="failed to get container status \"0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659\": rpc error: code = NotFound desc = could not find container \"0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659\": container with ID starting with 0f4c610878e4f0709a40da0e99bbda4cb6866b10ac77167dac52a3afbc535659 not found: ID does not exist" Dec 04 22:21:01.310642 master-0 kubenswrapper[33572]: I1204 22:21:01.310596 33572 scope.go:117] "RemoveContainer" containerID="8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65" Dec 04 22:21:01.311378 master-0 kubenswrapper[33572]: E1204 22:21:01.311203 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65\": container with ID starting with 8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65 not found: ID does not exist" containerID="8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65" Dec 04 22:21:01.311378 master-0 kubenswrapper[33572]: I1204 22:21:01.311266 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65"} err="failed to get container status \"8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65\": rpc error: code = NotFound desc = could not find container \"8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65\": container with ID starting with 8a3c3daf0226df6999b2b18a2f2b2d6e553cd39d1cff86512faefb3ef7accb65 not found: ID does not exist" Dec 04 22:21:01.311378 master-0 kubenswrapper[33572]: I1204 22:21:01.311310 33572 scope.go:117] "RemoveContainer" containerID="e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d" Dec 04 22:21:01.311842 master-0 kubenswrapper[33572]: E1204 22:21:01.311806 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d\": container with ID starting with e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d not found: ID does not exist" containerID="e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d" Dec 04 22:21:01.311912 master-0 kubenswrapper[33572]: I1204 22:21:01.311839 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d"} err="failed to get container status \"e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d\": rpc error: code = NotFound desc = 
could not find container \"e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d\": container with ID starting with e37504084fc0976c31ec4cd1c197454b5a9200b1ecaca0de60f2b0e9f337d76d not found: ID does not exist" Dec 04 22:21:01.311912 master-0 kubenswrapper[33572]: I1204 22:21:01.311860 33572 scope.go:117] "RemoveContainer" containerID="af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321" Dec 04 22:21:01.312262 master-0 kubenswrapper[33572]: E1204 22:21:01.312228 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321\": container with ID starting with af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321 not found: ID does not exist" containerID="af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321" Dec 04 22:21:01.312329 master-0 kubenswrapper[33572]: I1204 22:21:01.312255 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321"} err="failed to get container status \"af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321\": rpc error: code = NotFound desc = could not find container \"af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321\": container with ID starting with af56cf6442fa394ed495e5e219391341ff7d8183b2c2a9dfcb878c959a91c321 not found: ID does not exist" Dec 04 22:21:01.312329 master-0 kubenswrapper[33572]: I1204 22:21:01.312285 33572 scope.go:117] "RemoveContainer" containerID="0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c" Dec 04 22:21:01.312693 master-0 kubenswrapper[33572]: E1204 22:21:01.312660 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c\": container with ID starting with 0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c not found: ID does not exist" containerID="0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c" Dec 04 22:21:01.312760 master-0 kubenswrapper[33572]: I1204 22:21:01.312696 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c"} err="failed to get container status \"0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c\": rpc error: code = NotFound desc = could not find container \"0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c\": container with ID starting with 0aa79089549695359ad41ba6c12b3f894b2da6ddb8831100a0e37edac1ddc89c not found: ID does not exist" Dec 04 22:21:01.312760 master-0 kubenswrapper[33572]: I1204 22:21:01.312718 33572 scope.go:117] "RemoveContainer" containerID="84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f" Dec 04 22:21:01.313220 master-0 kubenswrapper[33572]: E1204 22:21:01.313169 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f\": container with ID starting with 84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f not found: ID does not exist" containerID="84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f" Dec 04 22:21:01.313220 master-0 kubenswrapper[33572]: I1204 22:21:01.313199 33572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f"} err="failed to get container status \"84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f\": rpc error: code = NotFound desc = could not find container \"84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f\": container with ID starting with 84841ae2789f428e736e2d7fc7b2b8e288c838206f9dfea59bf553e20360160f not found: ID does not exist" Dec 04 22:21:02.534425 master-0 kubenswrapper[33572]: I1204 22:21:02.534350 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89698aa356a3bc32694e2b098f9a900" path="/var/lib/kubelet/pods/b89698aa356a3bc32694e2b098f9a900/volumes" Dec 04 22:21:03.742240 master-0 kubenswrapper[33572]: E1204 22:21:03.742159 33572 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:21:03.742910 master-0 kubenswrapper[33572]: I1204 22:21:03.742867 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:21:03.749623 master-0 kubenswrapper[33572]: E1204 22:21:03.749362 33572 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.187e233f4999bb77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b89698aa356a3bc32694e2b098f9a900,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Killing,Message:Stopping container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:20:58.700536695 +0000 UTC m=+122.428062344,LastTimestamp:2025-12-04 22:20:58.700536695 +0000 UTC m=+122.428062344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:21:03.763740 master-0 kubenswrapper[33572]: W1204 22:21:03.763680 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b5fd88e9b399c73466d52c7a3c61e3f.slice/crio-ada26e00790f42b5f8f68ff9f7f74d75b6e217a3193fe2de76de68f20e8d1377 WatchSource:0}: Error finding container ada26e00790f42b5f8f68ff9f7f74d75b6e217a3193fe2de76de68f20e8d1377: Status 404 returned error can't find the container with id ada26e00790f42b5f8f68ff9f7f74d75b6e217a3193fe2de76de68f20e8d1377 Dec 04 22:21:03.921557 master-0 kubenswrapper[33572]: E1204 22:21:03.921169 33572 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:03.922332 master-0 kubenswrapper[33572]: E1204 22:21:03.922242 33572 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:03.923770 master-0 kubenswrapper[33572]: E1204 22:21:03.923706 33572 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:03.924606 master-0 kubenswrapper[33572]: E1204 22:21:03.924541 33572 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:03.925296 master-0 kubenswrapper[33572]: E1204 22:21:03.925203 33572 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:03.925296 master-0 kubenswrapper[33572]: I1204 22:21:03.925283 33572 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 04 22:21:03.926144 master-0 kubenswrapper[33572]: E1204 22:21:03.926079 33572 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Dec 04 22:21:04.127327 master-0 kubenswrapper[33572]: E1204 22:21:04.127173 33572 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Dec 04 22:21:04.197333 master-0 kubenswrapper[33572]: I1204 22:21:04.197262 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"9b5fd88e9b399c73466d52c7a3c61e3f","Type":"ContainerStarted","Data":"196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5"} Dec 04 22:21:04.197333 master-0 kubenswrapper[33572]: I1204 22:21:04.197332 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"9b5fd88e9b399c73466d52c7a3c61e3f","Type":"ContainerStarted","Data":"ada26e00790f42b5f8f68ff9f7f74d75b6e217a3193fe2de76de68f20e8d1377"} Dec 04 22:21:04.198794 master-0 kubenswrapper[33572]: I1204 22:21:04.198737 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:04.198891 master-0 kubenswrapper[33572]: E1204 22:21:04.198832 33572 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 
04 22:21:04.529754 master-0 kubenswrapper[33572]: E1204 22:21:04.529672 33572 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Dec 04 22:21:04.897094 master-0 kubenswrapper[33572]: E1204 22:21:04.896493 33572 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.187e233f4999bb77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b89698aa356a3bc32694e2b098f9a900,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Killing,Message:Stopping container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-04 22:20:58.700536695 +0000 UTC m=+122.428062344,LastTimestamp:2025-12-04 22:20:58.700536695 +0000 UTC m=+122.428062344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 04 22:21:05.330920 master-0 kubenswrapper[33572]: E1204 22:21:05.330842 33572 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 04 22:21:06.533289 master-0 kubenswrapper[33572]: I1204 22:21:06.533178 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:06.932893 master-0 kubenswrapper[33572]: E1204 22:21:06.932625 33572 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 04 22:21:10.133734 master-0 kubenswrapper[33572]: E1204 22:21:10.133633 33572 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 04 22:21:12.266372 master-0 kubenswrapper[33572]: I1204 22:21:12.266275 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_5859424d8ea4459c5b854f1ae5fd942c/kube-controller-manager/0.log" Dec 04 22:21:12.267316 master-0 kubenswrapper[33572]: I1204 22:21:12.266394 33572 generic.go:334] "Generic (PLEG): container finished" podID="5859424d8ea4459c5b854f1ae5fd942c" containerID="669f49b80171e40aea73e838597bed75920e67751d5f839f6934dbce1fedc710" exitCode=1 Dec 04 22:21:12.267316 master-0 kubenswrapper[33572]: I1204 22:21:12.266643 
33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerDied","Data":"669f49b80171e40aea73e838597bed75920e67751d5f839f6934dbce1fedc710"} Dec 04 22:21:12.267728 master-0 kubenswrapper[33572]: I1204 22:21:12.267660 33572 scope.go:117] "RemoveContainer" containerID="669f49b80171e40aea73e838597bed75920e67751d5f839f6934dbce1fedc710" Dec 04 22:21:12.268815 master-0 kubenswrapper[33572]: I1204 22:21:12.268699 33572 status_manager.go:851] "Failed to get status for pod" podUID="5859424d8ea4459c5b854f1ae5fd942c" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:12.270274 master-0 kubenswrapper[33572]: I1204 22:21:12.269912 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:13.283643 master-0 kubenswrapper[33572]: I1204 22:21:13.283271 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_5859424d8ea4459c5b854f1ae5fd942c/kube-controller-manager/0.log" Dec 04 22:21:13.283643 master-0 kubenswrapper[33572]: I1204 22:21:13.283620 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5859424d8ea4459c5b854f1ae5fd942c","Type":"ContainerStarted","Data":"39aacc773fddb0383604f8a27ba1b199b302e4f4ede41fa8f08e464ed1607b81"} Dec 04 22:21:13.285423 master-0 kubenswrapper[33572]: I1204 22:21:13.285349 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:13.286328 master-0 kubenswrapper[33572]: I1204 22:21:13.286236 33572 status_manager.go:851] "Failed to get status for pod" podUID="5859424d8ea4459c5b854f1ae5fd942c" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:13.525844 master-0 kubenswrapper[33572]: I1204 22:21:13.525575 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:13.527317 master-0 kubenswrapper[33572]: I1204 22:21:13.527208 33572 status_manager.go:851] "Failed to get status for pod" podUID="5859424d8ea4459c5b854f1ae5fd942c" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:13.528666 master-0 kubenswrapper[33572]: I1204 22:21:13.528468 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:13.555374 master-0 kubenswrapper[33572]: I1204 22:21:13.555277 33572 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:13.555374 master-0 kubenswrapper[33572]: I1204 22:21:13.555350 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:13.556817 master-0 kubenswrapper[33572]: E1204 22:21:13.556736 33572 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:13.557424 master-0 kubenswrapper[33572]: I1204 22:21:13.557385 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:13.601149 master-0 kubenswrapper[33572]: W1204 22:21:13.601036 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33faf0280764958ffeecc2dce44a9bfc.slice/crio-73a4aa51c94bd007e6c79951bdc57b8ea17dc30fda88f8949a1f16813299e725 WatchSource:0}: Error finding container 73a4aa51c94bd007e6c79951bdc57b8ea17dc30fda88f8949a1f16813299e725: Status 404 returned error can't find the container with id 73a4aa51c94bd007e6c79951bdc57b8ea17dc30fda88f8949a1f16813299e725 Dec 04 22:21:14.296011 master-0 kubenswrapper[33572]: I1204 22:21:14.295920 33572 generic.go:334] "Generic (PLEG): container finished" podID="33faf0280764958ffeecc2dce44a9bfc" containerID="3d507499ecae4e3dc17d093f004d68735d61d1aa68564a0cf6f88021ae248171" exitCode=0 Dec 04 22:21:14.296931 master-0 kubenswrapper[33572]: I1204 22:21:14.296056 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"33faf0280764958ffeecc2dce44a9bfc","Type":"ContainerDied","Data":"3d507499ecae4e3dc17d093f004d68735d61d1aa68564a0cf6f88021ae248171"} Dec 04 22:21:14.296931 master-0 kubenswrapper[33572]: I1204 22:21:14.296142 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"33faf0280764958ffeecc2dce44a9bfc","Type":"ContainerStarted","Data":"73a4aa51c94bd007e6c79951bdc57b8ea17dc30fda88f8949a1f16813299e725"} Dec 04 22:21:14.296931 master-0 kubenswrapper[33572]: I1204 22:21:14.296704 33572 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:14.296931 master-0 kubenswrapper[33572]: I1204 22:21:14.296738 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:14.297859 master-0 kubenswrapper[33572]: E1204 22:21:14.297794 33572 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:14.297859 master-0 kubenswrapper[33572]: I1204 22:21:14.297805 33572 status_manager.go:851] "Failed to get status for pod" podUID="5859424d8ea4459c5b854f1ae5fd942c" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:14.298534 master-0 kubenswrapper[33572]: I1204 22:21:14.298452 33572 status_manager.go:851] "Failed to get status for pod" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 04 22:21:15.306413 master-0 kubenswrapper[33572]: I1204 22:21:15.305437 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"33faf0280764958ffeecc2dce44a9bfc","Type":"ContainerStarted","Data":"2f2ac46581b0152dc22afd0d312f211bf8b49334fe71e4913d46310eb8802276"} Dec 04 22:21:15.306413 master-0 kubenswrapper[33572]: I1204 22:21:15.305554 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"33faf0280764958ffeecc2dce44a9bfc","Type":"ContainerStarted","Data":"7efdefcf0e3cff8e3972e0b89fb60364e155a572af2aab5284bc7f07c4c66cce"} Dec 04 22:21:16.326107 master-0 kubenswrapper[33572]: I1204 22:21:16.326029 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"33faf0280764958ffeecc2dce44a9bfc","Type":"ContainerStarted","Data":"943feefded84581b1debc130f470e1cc6fbab1337912ed0702214d2fc4dda100"} Dec 04 22:21:16.326107 master-0 kubenswrapper[33572]: I1204 22:21:16.326087 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"33faf0280764958ffeecc2dce44a9bfc","Type":"ContainerStarted","Data":"aca1d1a46584f13f24c0f74b2dd0cb67a1a133f8527adc2a6865e9e0524b87cc"} Dec 04 22:21:16.326107 master-0 kubenswrapper[33572]: I1204 22:21:16.326098 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"33faf0280764958ffeecc2dce44a9bfc","Type":"ContainerStarted","Data":"ae9b290ced902807415e8297ff43bf6ded3e50ee554852356ff02bc96c93f400"} Dec 04 22:21:16.326906 master-0 kubenswrapper[33572]: I1204 22:21:16.326240 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:16.326906 master-0 kubenswrapper[33572]: I1204 22:21:16.326441 33572 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:16.326906 master-0 kubenswrapper[33572]: I1204 22:21:16.326481 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:18.199954 master-0 kubenswrapper[33572]: I1204 22:21:18.199843 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:21:18.199954 master-0 kubenswrapper[33572]: I1204 22:21:18.199956 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:21:18.207168 master-0 kubenswrapper[33572]: I1204 22:21:18.207132 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:21:18.557605 master-0 kubenswrapper[33572]: I1204 22:21:18.557532 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:18.557605 master-0 kubenswrapper[33572]: I1204 22:21:18.557611 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:18.565546 master-0 kubenswrapper[33572]: I1204 22:21:18.565481 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:21.356531 master-0 kubenswrapper[33572]: I1204 22:21:21.349967 33572 kubelet.go:1914] "Deleted mirror pod because it 
is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:21.453927 master-0 kubenswrapper[33572]: I1204 22:21:21.453852 33572 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="33faf0280764958ffeecc2dce44a9bfc" podUID="9ba1b942-cdac-4cfc-b802-903664b5e27c" Dec 04 22:21:22.381652 master-0 kubenswrapper[33572]: I1204 22:21:22.381556 33572 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:22.381652 master-0 kubenswrapper[33572]: I1204 22:21:22.381629 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:22.389041 master-0 kubenswrapper[33572]: I1204 22:21:22.388972 33572 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="33faf0280764958ffeecc2dce44a9bfc" podUID="9ba1b942-cdac-4cfc-b802-903664b5e27c" Dec 04 22:21:22.389636 master-0 kubenswrapper[33572]: I1204 22:21:22.389581 33572 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://7efdefcf0e3cff8e3972e0b89fb60364e155a572af2aab5284bc7f07c4c66cce" Dec 04 22:21:22.389636 master-0 kubenswrapper[33572]: I1204 22:21:22.389623 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:23.392267 master-0 kubenswrapper[33572]: I1204 22:21:23.392136 33572 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:23.392267 master-0 kubenswrapper[33572]: I1204 22:21:23.392197 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:23.399736 master-0 kubenswrapper[33572]: I1204 22:21:23.399673 33572 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="33faf0280764958ffeecc2dce44a9bfc" podUID="9ba1b942-cdac-4cfc-b802-903664b5e27c" Dec 04 22:21:28.206853 master-0 kubenswrapper[33572]: I1204 22:21:28.206708 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:21:31.345898 master-0 kubenswrapper[33572]: I1204 22:21:31.345820 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 22:21:31.512701 master-0 kubenswrapper[33572]: I1204 22:21:31.512653 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 22:21:31.586789 master-0 kubenswrapper[33572]: I1204 22:21:31.586717 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 22:21:31.816032 master-0 kubenswrapper[33572]: I1204 22:21:31.815942 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 22:21:33.067887 master-0 kubenswrapper[33572]: I1204 22:21:33.067797 33572 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 22:21:33.185972 master-0 kubenswrapper[33572]: I1204 22:21:33.185890 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 04 22:21:33.187840 master-0 kubenswrapper[33572]: I1204 22:21:33.187707 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:21:33.286461 master-0 kubenswrapper[33572]: I1204 22:21:33.286405 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 22:21:33.491041 master-0 kubenswrapper[33572]: I1204 22:21:33.490875 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-wc4xq" Dec 04 22:21:33.492969 master-0 kubenswrapper[33572]: I1204 22:21:33.492904 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 22:21:33.531553 master-0 kubenswrapper[33572]: I1204 22:21:33.531412 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 22:21:33.690157 master-0 kubenswrapper[33572]: I1204 22:21:33.690057 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 22:21:33.719237 master-0 kubenswrapper[33572]: I1204 22:21:33.719167 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 22:21:33.991021 master-0 kubenswrapper[33572]: I1204 22:21:33.990921 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sf4xn" Dec 04 22:21:34.107466 master-0 kubenswrapper[33572]: I1204 22:21:34.107377 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 04 22:21:34.122922 master-0 kubenswrapper[33572]: I1204 22:21:34.122853 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-4nxv4" Dec 04 22:21:34.228085 master-0 kubenswrapper[33572]: I1204 22:21:34.228010 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 22:21:34.257652 master-0 kubenswrapper[33572]: I1204 22:21:34.257472 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 04 22:21:34.313318 master-0 kubenswrapper[33572]: I1204 22:21:34.313219 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-tm5gx" Dec 04 22:21:34.388080 master-0 kubenswrapper[33572]: I1204 22:21:34.387946 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 22:21:34.462627 master-0 kubenswrapper[33572]: I1204 22:21:34.462575 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 22:21:34.481915 master-0 kubenswrapper[33572]: I1204 22:21:34.481851 33572 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 22:21:34.583139 master-0 kubenswrapper[33572]: I1204 22:21:34.583040 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 22:21:34.610824 master-0 kubenswrapper[33572]: I1204 22:21:34.610722 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 22:21:34.624986 master-0 kubenswrapper[33572]: I1204 22:21:34.624923 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 04 22:21:34.627458 master-0 kubenswrapper[33572]: I1204 22:21:34.627389 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 22:21:34.689300 master-0 kubenswrapper[33572]: I1204 22:21:34.689235 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 22:21:34.731048 master-0 kubenswrapper[33572]: I1204 22:21:34.730970 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 22:21:34.830896 master-0 kubenswrapper[33572]: I1204 22:21:34.830816 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 22:21:34.896950 master-0 kubenswrapper[33572]: I1204 22:21:34.896869 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 22:21:34.928203 master-0 kubenswrapper[33572]: I1204 22:21:34.927960 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 04 22:21:34.996477 master-0 kubenswrapper[33572]: I1204 22:21:34.996373 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 22:21:35.002654 master-0 kubenswrapper[33572]: I1204 22:21:35.002592 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 04 22:21:35.046285 master-0 kubenswrapper[33572]: I1204 22:21:35.046205 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 04 22:21:35.062523 master-0 kubenswrapper[33572]: I1204 22:21:35.062450 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 04 22:21:35.082739 master-0 kubenswrapper[33572]: I1204 22:21:35.082614 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 22:21:35.092691 master-0 kubenswrapper[33572]: I1204 22:21:35.092619 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 22:21:35.121421 master-0 kubenswrapper[33572]: I1204 22:21:35.121208 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 04 22:21:35.158610 master-0 kubenswrapper[33572]: I1204 22:21:35.158490 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 22:21:35.240613 master-0 kubenswrapper[33572]: I1204 22:21:35.240518 33572 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-n25ns" Dec 04 22:21:35.366737 master-0 kubenswrapper[33572]: I1204 22:21:35.366663 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-fj6qn" Dec 04 22:21:35.372533 master-0 kubenswrapper[33572]: I1204 22:21:35.372428 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 04 22:21:35.432334 master-0 kubenswrapper[33572]: I1204 22:21:35.432247 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 04 22:21:35.458156 master-0 kubenswrapper[33572]: I1204 22:21:35.458082 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 22:21:35.464448 master-0 kubenswrapper[33572]: I1204 22:21:35.464383 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 04 22:21:35.485972 master-0 kubenswrapper[33572]: I1204 22:21:35.485894 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 04 22:21:35.530941 master-0 kubenswrapper[33572]: I1204 22:21:35.530848 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-dtp6d" Dec 04 22:21:35.640869 master-0 kubenswrapper[33572]: I1204 22:21:35.640703 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 22:21:35.654737 master-0 kubenswrapper[33572]: I1204 22:21:35.654635 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 04 22:21:35.897933 master-0 kubenswrapper[33572]: I1204 22:21:35.897684 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 04 22:21:35.903728 master-0 kubenswrapper[33572]: I1204 22:21:35.903692 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-496f9" Dec 04 22:21:35.978665 master-0 kubenswrapper[33572]: I1204 22:21:35.978596 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:21:36.023062 master-0 kubenswrapper[33572]: I1204 22:21:36.022981 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 04 22:21:36.042536 master-0 kubenswrapper[33572]: I1204 22:21:36.042426 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 22:21:36.113725 master-0 kubenswrapper[33572]: I1204 22:21:36.113626 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zpcfd" Dec 04 22:21:36.155767 master-0 kubenswrapper[33572]: I1204 22:21:36.155585 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 22:21:36.175701 master-0 kubenswrapper[33572]: I1204 22:21:36.175579 33572 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 22:21:36.242789 master-0 kubenswrapper[33572]: I1204 22:21:36.242694 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 22:21:36.319207 master-0 kubenswrapper[33572]: I1204 22:21:36.319113 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 22:21:36.418860 master-0 kubenswrapper[33572]: I1204 22:21:36.418585 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 22:21:36.420763 master-0 kubenswrapper[33572]: I1204 22:21:36.420682 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 22:21:36.463648 master-0 kubenswrapper[33572]: I1204 22:21:36.463562 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 22:21:36.475124 master-0 kubenswrapper[33572]: I1204 22:21:36.475055 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 22:21:36.562074 master-0 kubenswrapper[33572]: I1204 22:21:36.561945 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 22:21:36.607120 master-0 kubenswrapper[33572]: I1204 22:21:36.607020 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 22:21:36.617822 master-0 kubenswrapper[33572]: I1204 22:21:36.617755 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 22:21:36.625223 master-0 kubenswrapper[33572]: I1204 22:21:36.625163 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 04 22:21:36.681996 master-0 kubenswrapper[33572]: I1204 22:21:36.681845 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 04 22:21:36.715896 master-0 kubenswrapper[33572]: I1204 22:21:36.715835 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 22:21:36.718663 master-0 kubenswrapper[33572]: I1204 22:21:36.718608 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 22:21:36.724264 master-0 kubenswrapper[33572]: I1204 22:21:36.724187 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 04 22:21:36.731687 master-0 kubenswrapper[33572]: I1204 22:21:36.731628 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-fvwtt" Dec 04 22:21:36.902083 master-0 kubenswrapper[33572]: I1204 22:21:36.901999 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 22:21:36.906392 master-0 kubenswrapper[33572]: I1204 22:21:36.906353 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 22:21:37.003248 master-0 
kubenswrapper[33572]: I1204 22:21:37.003077 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 22:21:37.063815 master-0 kubenswrapper[33572]: I1204 22:21:37.063727 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-h8rp7" Dec 04 22:21:37.075287 master-0 kubenswrapper[33572]: I1204 22:21:37.075220 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 22:21:37.203480 master-0 kubenswrapper[33572]: I1204 22:21:37.203397 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 22:21:37.278323 master-0 kubenswrapper[33572]: I1204 22:21:37.278188 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:21:37.350075 master-0 kubenswrapper[33572]: I1204 22:21:37.349987 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 22:21:37.356173 master-0 kubenswrapper[33572]: I1204 22:21:37.356129 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 04 22:21:37.414097 master-0 kubenswrapper[33572]: I1204 22:21:37.414020 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 22:21:37.501756 master-0 kubenswrapper[33572]: I1204 22:21:37.501639 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 22:21:37.519234 master-0 kubenswrapper[33572]: I1204 22:21:37.518440 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-shf98" Dec 04 22:21:37.648670 master-0 kubenswrapper[33572]: I1204 22:21:37.648337 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 22:21:37.650907 master-0 kubenswrapper[33572]: I1204 22:21:37.650826 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 04 22:21:37.655637 master-0 kubenswrapper[33572]: I1204 22:21:37.655589 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 22:21:37.698011 master-0 kubenswrapper[33572]: I1204 22:21:37.697930 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 22:21:37.700444 master-0 kubenswrapper[33572]: I1204 22:21:37.700393 33572 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 22:21:37.714923 master-0 kubenswrapper[33572]: I1204 22:21:37.714873 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 22:21:37.753095 master-0 kubenswrapper[33572]: I1204 22:21:37.753005 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 22:21:37.861527 master-0 kubenswrapper[33572]: I1204 22:21:37.861297 33572 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 22:21:37.958261 master-0 kubenswrapper[33572]: I1204 22:21:37.957891 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 22:21:37.971640 master-0 kubenswrapper[33572]: I1204 22:21:37.971462 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 22:21:37.986681 master-0 kubenswrapper[33572]: I1204 22:21:37.986610 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 22:21:38.081671 master-0 kubenswrapper[33572]: I1204 22:21:38.081557 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 04 22:21:38.120979 master-0 kubenswrapper[33572]: I1204 22:21:38.120889 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 22:21:38.162671 master-0 kubenswrapper[33572]: I1204 22:21:38.162568 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 22:21:38.246601 master-0 kubenswrapper[33572]: I1204 22:21:38.246390 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 22:21:38.278997 master-0 kubenswrapper[33572]: I1204 22:21:38.278894 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 22:21:38.305417 master-0 kubenswrapper[33572]: I1204 22:21:38.305364 33572 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 22:21:38.323549 master-0 kubenswrapper[33572]: I1204 22:21:38.323448 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-69625" Dec 04 22:21:38.337259 master-0 kubenswrapper[33572]: I1204 22:21:38.337208 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 22:21:38.385314 master-0 kubenswrapper[33572]: I1204 22:21:38.385258 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 22:21:38.516004 master-0 kubenswrapper[33572]: I1204 22:21:38.515953 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 22:21:38.529015 master-0 kubenswrapper[33572]: I1204 22:21:38.528944 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 22:21:38.625124 master-0 kubenswrapper[33572]: I1204 22:21:38.625074 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 22:21:38.685345 master-0 kubenswrapper[33572]: I1204 22:21:38.685293 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 22:21:38.703956 master-0 kubenswrapper[33572]: I1204 22:21:38.703916 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 04 22:21:38.776060 
master-0 kubenswrapper[33572]: I1204 22:21:38.775949 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 04 22:21:38.791461 master-0 kubenswrapper[33572]: I1204 22:21:38.791422 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 22:21:38.833089 master-0 kubenswrapper[33572]: I1204 22:21:38.833032 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 22:21:38.851657 master-0 kubenswrapper[33572]: I1204 22:21:38.851613 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 22:21:38.879891 master-0 kubenswrapper[33572]: I1204 22:21:38.879833 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 22:21:38.890117 master-0 kubenswrapper[33572]: I1204 22:21:38.890056 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 22:21:39.053049 master-0 kubenswrapper[33572]: I1204 22:21:39.052706 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-pjht7" Dec 04 22:21:39.130578 master-0 kubenswrapper[33572]: I1204 22:21:39.130494 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 22:21:39.155108 master-0 kubenswrapper[33572]: I1204 22:21:39.155065 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 22:21:39.244375 master-0 kubenswrapper[33572]: I1204 22:21:39.244318 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 22:21:39.317273 master-0 kubenswrapper[33572]: I1204 22:21:39.317139 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-5kj7n" Dec 04 22:21:39.371246 master-0 kubenswrapper[33572]: I1204 22:21:39.371175 33572 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 22:21:39.380478 master-0 kubenswrapper[33572]: I1204 22:21:39.380395 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 04 22:21:39.380478 master-0 kubenswrapper[33572]: I1204 22:21:39.380482 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 04 22:21:39.381391 master-0 kubenswrapper[33572]: I1204 22:21:39.381332 33572 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:39.381464 master-0 kubenswrapper[33572]: I1204 22:21:39.381399 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="28e08278-06d0-4733-96af-0abd2722360e" Dec 04 22:21:39.392421 master-0 kubenswrapper[33572]: I1204 22:21:39.392364 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 04 22:21:39.418356 master-0 kubenswrapper[33572]: I1204 22:21:39.418293 33572 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 22:21:39.432608 master-0 kubenswrapper[33572]: I1204 22:21:39.431962 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=18.431931052 podStartE2EDuration="18.431931052s" podCreationTimestamp="2025-12-04 22:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:21:39.420061741 +0000 UTC m=+163.147587390" watchObservedRunningTime="2025-12-04 22:21:39.431931052 +0000 UTC m=+163.159456731" Dec 04 22:21:39.439074 master-0 kubenswrapper[33572]: I1204 22:21:39.439034 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 04 22:21:39.455225 master-0 kubenswrapper[33572]: I1204 22:21:39.455165 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 22:21:39.469494 master-0 kubenswrapper[33572]: I1204 22:21:39.469322 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 22:21:39.475875 master-0 kubenswrapper[33572]: I1204 22:21:39.475845 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 04 22:21:39.484542 master-0 kubenswrapper[33572]: I1204 22:21:39.484487 33572 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 22:21:39.494599 master-0 kubenswrapper[33572]: I1204 22:21:39.494523 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-h7rbd" Dec 04 22:21:39.523572 master-0 kubenswrapper[33572]: I1204 22:21:39.523476 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 22:21:39.653588 master-0 kubenswrapper[33572]: I1204 22:21:39.653405 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 22:21:39.717832 master-0 kubenswrapper[33572]: I1204 22:21:39.717766 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 22:21:39.718850 master-0 kubenswrapper[33572]: I1204 22:21:39.718791 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 04 22:21:39.745961 master-0 kubenswrapper[33572]: I1204 22:21:39.745895 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 04 22:21:39.753545 master-0 kubenswrapper[33572]: I1204 22:21:39.753453 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 22:21:39.834653 master-0 kubenswrapper[33572]: I1204 22:21:39.834568 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 22:21:39.840339 master-0 kubenswrapper[33572]: I1204 22:21:39.840301 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 22:21:39.885328 master-0 kubenswrapper[33572]: I1204 22:21:39.885212 33572 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 22:21:39.951619 master-0 kubenswrapper[33572]: I1204 22:21:39.951352 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 04 22:21:39.996661 master-0 kubenswrapper[33572]: I1204 22:21:39.996598 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 22:21:40.006601 master-0 kubenswrapper[33572]: I1204 22:21:40.006556 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 22:21:40.007983 master-0 kubenswrapper[33572]: I1204 22:21:40.007932 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 22:21:40.022070 master-0 kubenswrapper[33572]: I1204 22:21:40.019950 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 04 22:21:40.062678 master-0 kubenswrapper[33572]: I1204 22:21:40.062615 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 22:21:40.093072 master-0 kubenswrapper[33572]: I1204 22:21:40.092987 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 22:21:40.161449 master-0 kubenswrapper[33572]: I1204 22:21:40.161373 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 22:21:40.178009 master-0 kubenswrapper[33572]: I1204 22:21:40.177947 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 04 22:21:40.203894 master-0 kubenswrapper[33572]: I1204 22:21:40.203711 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 04 22:21:40.250780 master-0 kubenswrapper[33572]: I1204 22:21:40.250705 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 22:21:40.270293 master-0 kubenswrapper[33572]: I1204 22:21:40.270199 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-psdkj" Dec 04 22:21:40.309970 master-0 kubenswrapper[33572]: I1204 22:21:40.309871 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 22:21:40.342753 master-0 kubenswrapper[33572]: I1204 22:21:40.342646 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 04 22:21:40.430026 master-0 kubenswrapper[33572]: I1204 22:21:40.429932 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 22:21:40.472890 master-0 kubenswrapper[33572]: I1204 22:21:40.472685 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 04 22:21:40.474175 master-0 kubenswrapper[33572]: I1204 22:21:40.474097 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 22:21:40.549202 master-0 kubenswrapper[33572]: I1204 
22:21:40.549076 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 04 22:21:40.656790 master-0 kubenswrapper[33572]: I1204 22:21:40.656703 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 22:21:40.684622 master-0 kubenswrapper[33572]: I1204 22:21:40.684564 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-cmdvd" Dec 04 22:21:40.787876 master-0 kubenswrapper[33572]: I1204 22:21:40.787796 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 04 22:21:40.841384 master-0 kubenswrapper[33572]: I1204 22:21:40.841292 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 22:21:40.913018 master-0 kubenswrapper[33572]: I1204 22:21:40.912937 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 22:21:40.966799 master-0 kubenswrapper[33572]: I1204 22:21:40.966725 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 22:21:40.981904 master-0 kubenswrapper[33572]: I1204 22:21:40.981769 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-qjgpp" Dec 04 22:21:41.010080 master-0 kubenswrapper[33572]: I1204 22:21:41.010015 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 04 22:21:41.067654 master-0 kubenswrapper[33572]: I1204 22:21:41.066527 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 22:21:41.099410 master-0 kubenswrapper[33572]: I1204 22:21:41.099319 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 22:21:41.147417 master-0 kubenswrapper[33572]: I1204 22:21:41.147330 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 04 22:21:41.151026 master-0 kubenswrapper[33572]: I1204 22:21:41.150958 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 22:21:41.232914 master-0 kubenswrapper[33572]: I1204 22:21:41.232837 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 04 22:21:41.242855 master-0 kubenswrapper[33572]: I1204 22:21:41.242777 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 22:21:41.295045 master-0 kubenswrapper[33572]: I1204 22:21:41.294920 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 22:21:41.385636 master-0 kubenswrapper[33572]: I1204 22:21:41.385427 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 22:21:41.520143 master-0 kubenswrapper[33572]: I1204 22:21:41.520075 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 04 
22:21:41.540764 master-0 kubenswrapper[33572]: I1204 22:21:41.540707 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 04 22:21:41.546284 master-0 kubenswrapper[33572]: I1204 22:21:41.546224 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-x7b78" Dec 04 22:21:41.552906 master-0 kubenswrapper[33572]: I1204 22:21:41.552863 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 04 22:21:41.563577 master-0 kubenswrapper[33572]: I1204 22:21:41.563247 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 04 22:21:41.564798 master-0 kubenswrapper[33572]: I1204 22:21:41.564743 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 22:21:41.629082 master-0 kubenswrapper[33572]: I1204 22:21:41.628989 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 04 22:21:41.897637 master-0 kubenswrapper[33572]: I1204 22:21:41.897569 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 04 22:21:41.948578 master-0 kubenswrapper[33572]: I1204 22:21:41.947971 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 22:21:41.950439 master-0 kubenswrapper[33572]: I1204 22:21:41.950390 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 22:21:41.981850 master-0 kubenswrapper[33572]: I1204 22:21:41.981756 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 22:21:42.045427 master-0 kubenswrapper[33572]: I1204 22:21:42.045338 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 04 22:21:42.122282 master-0 kubenswrapper[33572]: I1204 22:21:42.122200 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 22:21:42.136497 master-0 kubenswrapper[33572]: I1204 22:21:42.136409 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 04 22:21:42.148605 master-0 kubenswrapper[33572]: I1204 22:21:42.148420 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 22:21:42.158691 master-0 kubenswrapper[33572]: I1204 22:21:42.158610 33572 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 22:21:42.169813 master-0 kubenswrapper[33572]: I1204 22:21:42.169734 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 04 22:21:42.177153 master-0 kubenswrapper[33572]: I1204 22:21:42.177085 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8rl6s" Dec 04 22:21:42.222155 master-0 kubenswrapper[33572]: I1204 22:21:42.222078 33572 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 22:21:42.296075 master-0 kubenswrapper[33572]: I1204 22:21:42.295984 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 22:21:42.467104 master-0 kubenswrapper[33572]: I1204 22:21:42.466899 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 22:21:42.468135 master-0 kubenswrapper[33572]: I1204 22:21:42.467386 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 22:21:42.485824 master-0 kubenswrapper[33572]: I1204 22:21:42.485703 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 22:21:42.562422 master-0 kubenswrapper[33572]: I1204 22:21:42.562319 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 04 22:21:42.616549 master-0 kubenswrapper[33572]: I1204 22:21:42.616412 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 04 22:21:42.617927 master-0 kubenswrapper[33572]: I1204 22:21:42.617867 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 22:21:42.686296 master-0 kubenswrapper[33572]: I1204 22:21:42.686192 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 22:21:42.732630 master-0 kubenswrapper[33572]: I1204 22:21:42.732399 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 22:21:42.801647 master-0 kubenswrapper[33572]: I1204 22:21:42.801472 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 22:21:42.820450 master-0 kubenswrapper[33572]: I1204 22:21:42.820326 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 22:21:42.897711 master-0 kubenswrapper[33572]: I1204 22:21:42.897646 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 22:21:42.980466 master-0 kubenswrapper[33572]: I1204 22:21:42.980369 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 22:21:43.074980 master-0 kubenswrapper[33572]: I1204 22:21:43.074267 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 22:21:43.153722 master-0 kubenswrapper[33572]: I1204 22:21:43.153628 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 22:21:43.193972 master-0 kubenswrapper[33572]: I1204 22:21:43.193874 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 22:21:43.194327 master-0 kubenswrapper[33572]: I1204 22:21:43.194244 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-n8qln" Dec 04 22:21:43.212804 master-0 
kubenswrapper[33572]: I1204 22:21:43.212728 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 22:21:43.232645 master-0 kubenswrapper[33572]: I1204 22:21:43.232576 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 04 22:21:43.264536 master-0 kubenswrapper[33572]: I1204 22:21:43.264432 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 04 22:21:43.266894 master-0 kubenswrapper[33572]: I1204 22:21:43.266832 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 22:21:43.318248 master-0 kubenswrapper[33572]: I1204 22:21:43.317975 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 22:21:43.424856 master-0 kubenswrapper[33572]: I1204 22:21:43.424638 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-bvrgs" Dec 04 22:21:43.498829 master-0 kubenswrapper[33572]: I1204 22:21:43.498761 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 22:21:43.518883 master-0 kubenswrapper[33572]: I1204 22:21:43.518833 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-nvnbb" Dec 04 22:21:43.570011 master-0 kubenswrapper[33572]: I1204 22:21:43.569899 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 22:21:43.572793 master-0 kubenswrapper[33572]: I1204 22:21:43.572734 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 22:21:43.729677 master-0 kubenswrapper[33572]: I1204 22:21:43.729538 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 04 22:21:43.732003 master-0 kubenswrapper[33572]: I1204 22:21:43.731964 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 22:21:43.760842 master-0 kubenswrapper[33572]: I1204 22:21:43.760755 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-bktr2" Dec 04 22:21:43.893355 master-0 kubenswrapper[33572]: I1204 22:21:43.893273 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 22:21:43.975821 master-0 kubenswrapper[33572]: I1204 22:21:43.975718 33572 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 04 22:21:43.976260 master-0 kubenswrapper[33572]: I1204 22:21:43.976144 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="9b5fd88e9b399c73466d52c7a3c61e3f" containerName="startup-monitor" containerID="cri-o://196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5" gracePeriod=5 Dec 04 22:21:44.060931 master-0 kubenswrapper[33572]: I1204 22:21:44.060862 33572 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 22:21:44.196377 master-0 kubenswrapper[33572]: I1204 22:21:44.196189 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-sdd6h" Dec 04 22:21:44.224646 master-0 kubenswrapper[33572]: I1204 22:21:44.224553 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 22:21:44.255452 master-0 kubenswrapper[33572]: I1204 22:21:44.255343 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 22:21:44.265217 master-0 kubenswrapper[33572]: I1204 22:21:44.265148 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 22:21:44.271970 master-0 kubenswrapper[33572]: I1204 22:21:44.271923 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 22:21:44.274220 master-0 kubenswrapper[33572]: I1204 22:21:44.274167 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 22:21:44.288579 master-0 kubenswrapper[33572]: I1204 22:21:44.288486 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 22:21:44.399919 master-0 kubenswrapper[33572]: I1204 22:21:44.399741 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mw665" Dec 04 22:21:44.451320 master-0 kubenswrapper[33572]: I1204 22:21:44.451217 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 22:21:44.575630 master-0 kubenswrapper[33572]: I1204 22:21:44.575549 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 22:21:44.586998 master-0 kubenswrapper[33572]: I1204 22:21:44.586905 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 22:21:44.699846 master-0 kubenswrapper[33572]: I1204 22:21:44.699652 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 22:21:44.789053 master-0 kubenswrapper[33572]: I1204 22:21:44.788962 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 22:21:44.822639 master-0 kubenswrapper[33572]: I1204 22:21:44.822491 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 22:21:44.838434 master-0 kubenswrapper[33572]: I1204 22:21:44.838359 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 22:21:44.900988 master-0 kubenswrapper[33572]: I1204 22:21:44.900864 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 22:21:45.022450 master-0 kubenswrapper[33572]: I1204 22:21:45.022368 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 22:21:45.214553 master-0 kubenswrapper[33572]: I1204 22:21:45.214460 33572 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 22:21:45.345969 master-0 kubenswrapper[33572]: I1204 22:21:45.345790 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 22:21:45.346217 master-0 kubenswrapper[33572]: I1204 22:21:45.346126 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 04 22:21:45.359954 master-0 kubenswrapper[33572]: I1204 22:21:45.359883 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 22:21:45.493406 master-0 kubenswrapper[33572]: I1204 22:21:45.493320 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 04 22:21:45.540541 master-0 kubenswrapper[33572]: I1204 22:21:45.540447 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 22:21:45.628882 master-0 kubenswrapper[33572]: I1204 22:21:45.628287 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 22:21:45.638985 master-0 kubenswrapper[33572]: I1204 22:21:45.638930 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 22:21:45.764912 master-0 kubenswrapper[33572]: I1204 22:21:45.764818 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 22:21:46.014233 master-0 kubenswrapper[33572]: I1204 22:21:46.014049 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 22:21:46.053607 master-0 kubenswrapper[33572]: I1204 22:21:46.053487 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 22:21:46.064302 master-0 kubenswrapper[33572]: I1204 22:21:46.064237 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 22:21:46.124822 master-0 kubenswrapper[33572]: I1204 22:21:46.124762 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 22:21:46.392351 master-0 kubenswrapper[33572]: I1204 22:21:46.392283 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 22:21:46.447700 master-0 kubenswrapper[33572]: I1204 22:21:46.447245 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 22:21:46.785216 master-0 kubenswrapper[33572]: I1204 22:21:46.785154 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 04 22:21:46.898601 master-0 kubenswrapper[33572]: I1204 22:21:46.898530 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 04 22:21:46.938132 master-0 kubenswrapper[33572]: I1204 22:21:46.937575 33572 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 22:21:46.959275 master-0 kubenswrapper[33572]: I1204 22:21:46.959194 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 22:21:47.183025 master-0 kubenswrapper[33572]: I1204 22:21:47.182585 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 22:21:47.377636 master-0 kubenswrapper[33572]: I1204 22:21:47.377539 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-3h94rftr47kot" Dec 04 22:21:47.530603 master-0 kubenswrapper[33572]: I1204 22:21:47.530492 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 22:21:47.601213 master-0 kubenswrapper[33572]: I1204 22:21:47.601144 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 22:21:47.627551 master-0 kubenswrapper[33572]: I1204 22:21:47.627477 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 22:21:49.579706 master-0 kubenswrapper[33572]: I1204 22:21:49.579644 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_9b5fd88e9b399c73466d52c7a3c61e3f/startup-monitor/0.log" Dec 04 22:21:49.580264 master-0 kubenswrapper[33572]: I1204 22:21:49.579789 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:21:49.623120 master-0 kubenswrapper[33572]: I1204 22:21:49.623064 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_9b5fd88e9b399c73466d52c7a3c61e3f/startup-monitor/0.log" Dec 04 22:21:49.623358 master-0 kubenswrapper[33572]: I1204 22:21:49.623132 33572 generic.go:334] "Generic (PLEG): container finished" podID="9b5fd88e9b399c73466d52c7a3c61e3f" containerID="196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5" exitCode=137 Dec 04 22:21:49.623358 master-0 kubenswrapper[33572]: I1204 22:21:49.623187 33572 scope.go:117] "RemoveContainer" containerID="196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5" Dec 04 22:21:49.623358 master-0 kubenswrapper[33572]: I1204 22:21:49.623201 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 04 22:21:49.646454 master-0 kubenswrapper[33572]: I1204 22:21:49.646402 33572 scope.go:117] "RemoveContainer" containerID="196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5" Dec 04 22:21:49.647098 master-0 kubenswrapper[33572]: E1204 22:21:49.647048 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5\": container with ID starting with 196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5 not found: ID does not exist" containerID="196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5" Dec 04 22:21:49.647180 master-0 kubenswrapper[33572]: I1204 22:21:49.647120 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5"} err="failed to get container status \"196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5\": rpc error: code = NotFound desc = could not find container \"196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5\": container with ID starting with 196fac175bf340eec3996b2dee4b2505982fe435d055d1ad0e559a2d9a9238c5 not found: ID does not exist" Dec 04 22:21:49.672642 master-0 kubenswrapper[33572]: I1204 22:21:49.672598 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-lock\") pod \"9b5fd88e9b399c73466d52c7a3c61e3f\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " Dec 04 22:21:49.672642 master-0 kubenswrapper[33572]: I1204 22:21:49.672642 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-pod-resource-dir\") pod \"9b5fd88e9b399c73466d52c7a3c61e3f\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " Dec 04 22:21:49.672896 master-0 kubenswrapper[33572]: I1204 22:21:49.672687 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-log\") pod \"9b5fd88e9b399c73466d52c7a3c61e3f\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " Dec 04 22:21:49.672896 master-0 kubenswrapper[33572]: I1204 22:21:49.672727 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-resource-dir\") pod \"9b5fd88e9b399c73466d52c7a3c61e3f\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " Dec 04 22:21:49.672896 master-0 kubenswrapper[33572]: I1204 22:21:49.672718 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-lock" (OuterVolumeSpecName: "var-lock") pod "9b5fd88e9b399c73466d52c7a3c61e3f" (UID: "9b5fd88e9b399c73466d52c7a3c61e3f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:49.672896 master-0 kubenswrapper[33572]: I1204 22:21:49.672744 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-manifests\") pod \"9b5fd88e9b399c73466d52c7a3c61e3f\" (UID: \"9b5fd88e9b399c73466d52c7a3c61e3f\") " Dec 04 22:21:49.672896 master-0 kubenswrapper[33572]: I1204 22:21:49.672785 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-log" (OuterVolumeSpecName: "var-log") pod "9b5fd88e9b399c73466d52c7a3c61e3f" (UID: "9b5fd88e9b399c73466d52c7a3c61e3f"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:49.672896 master-0 kubenswrapper[33572]: I1204 22:21:49.672813 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "9b5fd88e9b399c73466d52c7a3c61e3f" (UID: "9b5fd88e9b399c73466d52c7a3c61e3f"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:49.672896 master-0 kubenswrapper[33572]: I1204 22:21:49.672824 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-manifests" (OuterVolumeSpecName: "manifests") pod "9b5fd88e9b399c73466d52c7a3c61e3f" (UID: "9b5fd88e9b399c73466d52c7a3c61e3f"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:49.673158 master-0 kubenswrapper[33572]: I1204 22:21:49.672949 33572 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-log\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:49.673158 master-0 kubenswrapper[33572]: I1204 22:21:49.672962 33572 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:49.673158 master-0 kubenswrapper[33572]: I1204 22:21:49.672971 33572 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-manifests\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:49.673158 master-0 kubenswrapper[33572]: I1204 22:21:49.672979 33572 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:49.682898 master-0 kubenswrapper[33572]: I1204 22:21:49.682826 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "9b5fd88e9b399c73466d52c7a3c61e3f" (UID: "9b5fd88e9b399c73466d52c7a3c61e3f"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:21:49.773965 master-0 kubenswrapper[33572]: I1204 22:21:49.773893 33572 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/9b5fd88e9b399c73466d52c7a3c61e3f-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:21:50.534658 master-0 kubenswrapper[33572]: I1204 22:21:50.534569 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5fd88e9b399c73466d52c7a3c61e3f" path="/var/lib/kubelet/pods/9b5fd88e9b399c73466d52c7a3c61e3f/volumes" Dec 04 22:21:53.136603 master-0 kubenswrapper[33572]: I1204 22:21:53.136477 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6c8647588d-8b8m8"] Dec 04 22:21:53.137461 master-0 kubenswrapper[33572]: E1204 22:21:53.136806 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5fd88e9b399c73466d52c7a3c61e3f" containerName="startup-monitor" Dec 04 22:21:53.137461 master-0 kubenswrapper[33572]: I1204 22:21:53.136826 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5fd88e9b399c73466d52c7a3c61e3f" containerName="startup-monitor" Dec 04 22:21:53.137461 master-0 kubenswrapper[33572]: E1204 22:21:53.136855 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" containerName="installer" Dec 04 22:21:53.137461 master-0 kubenswrapper[33572]: I1204 22:21:53.136865 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" containerName="installer" Dec 04 22:21:53.137461 master-0 kubenswrapper[33572]: I1204 22:21:53.137052 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8307846-d1cf-4357-bcc0-b3531d34dc8b" containerName="installer" Dec 04 22:21:53.137461 master-0 kubenswrapper[33572]: I1204 22:21:53.137073 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5fd88e9b399c73466d52c7a3c61e3f" containerName="startup-monitor" Dec 04 22:21:53.139066 master-0 kubenswrapper[33572]: I1204 22:21:53.139030 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.141852 master-0 kubenswrapper[33572]: I1204 22:21:53.141805 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 04 22:21:53.143052 master-0 kubenswrapper[33572]: I1204 22:21:53.143019 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 04 22:21:53.143449 master-0 kubenswrapper[33572]: I1204 22:21:53.143405 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 04 22:21:53.143747 master-0 kubenswrapper[33572]: I1204 22:21:53.143699 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-cr637c5do8ln7" Dec 04 22:21:53.143824 master-0 kubenswrapper[33572]: I1204 22:21:53.143763 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 04 22:21:53.145703 master-0 kubenswrapper[33572]: I1204 22:21:53.145634 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 04 22:21:53.194609 master-0 kubenswrapper[33572]: I1204 22:21:53.167566 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6c8647588d-8b8m8"] Dec 04 22:21:53.227628 master-0 kubenswrapper[33572]: I1204 22:21:53.222452 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.227628 master-0 kubenswrapper[33572]: I1204 22:21:53.222586 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-tls\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.227628 master-0 kubenswrapper[33572]: I1204 22:21:53.222649 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-grpc-tls\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.227628 master-0 kubenswrapper[33572]: I1204 22:21:53.222834 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30c16431-1b9d-4c3c-a570-b208d3ec0e95-metrics-client-ca\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.227628 master-0 kubenswrapper[33572]: I1204 22:21:53.223018 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" 
(UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.227628 master-0 kubenswrapper[33572]: I1204 22:21:53.223087 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.227628 master-0 kubenswrapper[33572]: I1204 22:21:53.223233 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwvr\" (UniqueName: \"kubernetes.io/projected/30c16431-1b9d-4c3c-a570-b208d3ec0e95-kube-api-access-8mwvr\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.227628 master-0 kubenswrapper[33572]: I1204 22:21:53.223382 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.324452 master-0 kubenswrapper[33572]: I1204 22:21:53.324368 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-tls\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.324816 master-0 kubenswrapper[33572]: I1204 22:21:53.324731 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-grpc-tls\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.324943 master-0 kubenswrapper[33572]: I1204 22:21:53.324885 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30c16431-1b9d-4c3c-a570-b208d3ec0e95-metrics-client-ca\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.325022 master-0 kubenswrapper[33572]: I1204 22:21:53.325000 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.325238 master-0 kubenswrapper[33572]: I1204 22:21:53.325194 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.325469 master-0 kubenswrapper[33572]: I1204 22:21:53.325248 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mwvr\" (UniqueName: \"kubernetes.io/projected/30c16431-1b9d-4c3c-a570-b208d3ec0e95-kube-api-access-8mwvr\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.325643 master-0 kubenswrapper[33572]: I1204 22:21:53.325572 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.325781 master-0 kubenswrapper[33572]: I1204 22:21:53.325701 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.327431 master-0 kubenswrapper[33572]: I1204 22:21:53.327364 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30c16431-1b9d-4c3c-a570-b208d3ec0e95-metrics-client-ca\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.329676 master-0 kubenswrapper[33572]: I1204 22:21:53.329631 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.329839 master-0 kubenswrapper[33572]: I1204 22:21:53.329806 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-grpc-tls\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.329935 master-0 kubenswrapper[33572]: I1204 22:21:53.329800 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-tls\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.331412 master-0 kubenswrapper[33572]: 
I1204 22:21:53.331355 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.331635 master-0 kubenswrapper[33572]: I1204 22:21:53.331576 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.331737 master-0 kubenswrapper[33572]: I1204 22:21:53.331691 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30c16431-1b9d-4c3c-a570-b208d3ec0e95-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.353228 master-0 kubenswrapper[33572]: I1204 22:21:53.353154 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mwvr\" (UniqueName: \"kubernetes.io/projected/30c16431-1b9d-4c3c-a570-b208d3ec0e95-kube-api-access-8mwvr\") pod \"thanos-querier-6c8647588d-8b8m8\" (UID: \"30c16431-1b9d-4c3c-a570-b208d3ec0e95\") " pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:53.498160 master-0 kubenswrapper[33572]: I1204 22:21:53.497924 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:54.004578 master-0 kubenswrapper[33572]: W1204 22:21:54.004483 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c16431_1b9d_4c3c_a570_b208d3ec0e95.slice/crio-d6afad8de7d970eda8bc651345218059fd7290eee2959244f83a0fcddff38bf7 WatchSource:0}: Error finding container d6afad8de7d970eda8bc651345218059fd7290eee2959244f83a0fcddff38bf7: Status 404 returned error can't find the container with id d6afad8de7d970eda8bc651345218059fd7290eee2959244f83a0fcddff38bf7 Dec 04 22:21:54.009078 master-0 kubenswrapper[33572]: I1204 22:21:54.009035 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6c8647588d-8b8m8"] Dec 04 22:21:54.674527 master-0 kubenswrapper[33572]: I1204 22:21:54.674414 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" event={"ID":"30c16431-1b9d-4c3c-a570-b208d3ec0e95","Type":"ContainerStarted","Data":"d6afad8de7d970eda8bc651345218059fd7290eee2959244f83a0fcddff38bf7"} Dec 04 22:21:55.823977 master-0 kubenswrapper[33572]: I1204 22:21:55.823927 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-65f77db9b4-9s9lq"] Dec 04 22:21:55.832065 master-0 kubenswrapper[33572]: I1204 22:21:55.824890 33572 util.go:30] "No sandbox for pod can be found. 
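Up to this point the kubelet's volume reconciler has walked every volume declared by thanos-querier-6c8647588d-8b8m8: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded for each secret, configmap and projected volume, after which a new pod sandbox is created. As a rough cross-check of those volume names, a small client-go program can print what the pod spec actually references; this is only a sketch, assuming an admin kubeconfig at ~/.kube/config, with the namespace and pod name taken from the log.

```go
// Sketch: list the volumes a pod declares, to cross-check against the
// kubelet's VerifyControllerAttachedVolume / MountVolume.SetUp log lines.
// Assumes an admin kubeconfig at ~/.kube/config; names come from the log.
package main

import (
    "context"
    "fmt"
    "os"
    "path/filepath"

    metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    "k8s.io/client-go/kubernetes"
    "k8s.io/client-go/tools/clientcmd"
)

func main() {
    kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
    cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
    if err != nil {
        panic(err)
    }
    clientset := kubernetes.NewForConfigOrDie(cfg)

    pod, err := clientset.CoreV1().Pods("openshift-monitoring").Get(
        context.TODO(), "thanos-querier-6c8647588d-8b8m8", metav1.GetOptions{})
    if err != nil {
        panic(err)
    }
    for _, v := range pod.Spec.Volumes {
        switch {
        case v.Secret != nil:
            fmt.Printf("%s -> secret %s\n", v.Name, v.Secret.SecretName)
        case v.ConfigMap != nil:
            fmt.Printf("%s -> configmap %s\n", v.Name, v.ConfigMap.Name)
        case v.Projected != nil:
            fmt.Printf("%s -> projected (service account token, etc.)\n", v.Name)
        default:
            fmt.Printf("%s -> other volume type\n", v.Name)
        }
    }
}
```

The names it prints (secret-thanos-querier-tls, metrics-client-ca, kube-api-access-8mwvr, and so on) should line up one-for-one with the MountVolume.SetUp entries above.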
Need to start a new one" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:55.846153 master-0 kubenswrapper[33572]: I1204 22:21:55.836249 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-aplglc3867qgp" Dec 04 22:21:55.846153 master-0 kubenswrapper[33572]: I1204 22:21:55.842635 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-55c77559c8-g74sm"] Dec 04 22:21:55.846153 master-0 kubenswrapper[33572]: I1204 22:21:55.842930 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" podUID="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" containerName="metrics-server" containerID="cri-o://8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c" gracePeriod=170 Dec 04 22:21:55.855779 master-0 kubenswrapper[33572]: I1204 22:21:55.855711 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-65f77db9b4-9s9lq"] Dec 04 22:21:55.965756 master-0 kubenswrapper[33572]: I1204 22:21:55.965681 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-client-ca-bundle\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:55.967671 master-0 kubenswrapper[33572]: I1204 22:21:55.965784 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49753afa-912e-44bc-ad0b-8d5f61ab7300-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:55.967671 master-0 kubenswrapper[33572]: I1204 22:21:55.965825 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-secret-metrics-server-tls\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:55.967671 master-0 kubenswrapper[33572]: I1204 22:21:55.965883 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5wmw\" (UniqueName: \"kubernetes.io/projected/49753afa-912e-44bc-ad0b-8d5f61ab7300-kube-api-access-q5wmw\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:55.967671 master-0 kubenswrapper[33572]: I1204 22:21:55.967342 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/49753afa-912e-44bc-ad0b-8d5f61ab7300-metrics-server-audit-profiles\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:55.967671 master-0 kubenswrapper[33572]: I1204 22:21:55.967587 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-secret-metrics-client-certs\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:55.967913 master-0 kubenswrapper[33572]: I1204 22:21:55.967797 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/49753afa-912e-44bc-ad0b-8d5f61ab7300-audit-log\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.069256 master-0 kubenswrapper[33572]: I1204 22:21:56.069035 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5wmw\" (UniqueName: \"kubernetes.io/projected/49753afa-912e-44bc-ad0b-8d5f61ab7300-kube-api-access-q5wmw\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.069256 master-0 kubenswrapper[33572]: I1204 22:21:56.069116 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/49753afa-912e-44bc-ad0b-8d5f61ab7300-metrics-server-audit-profiles\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.070202 master-0 kubenswrapper[33572]: I1204 22:21:56.070121 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-secret-metrics-client-certs\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.070795 master-0 kubenswrapper[33572]: I1204 22:21:56.070758 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/49753afa-912e-44bc-ad0b-8d5f61ab7300-audit-log\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.071635 master-0 kubenswrapper[33572]: I1204 22:21:56.071030 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-client-ca-bundle\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.071635 master-0 kubenswrapper[33572]: I1204 22:21:56.071226 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/49753afa-912e-44bc-ad0b-8d5f61ab7300-metrics-server-audit-profiles\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.072109 master-0 kubenswrapper[33572]: I1204 22:21:56.071769 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49753afa-912e-44bc-ad0b-8d5f61ab7300-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.072109 master-0 kubenswrapper[33572]: I1204 22:21:56.071848 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-secret-metrics-server-tls\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.072529 master-0 kubenswrapper[33572]: I1204 22:21:56.072369 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/49753afa-912e-44bc-ad0b-8d5f61ab7300-audit-log\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.073120 master-0 kubenswrapper[33572]: I1204 22:21:56.073069 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49753afa-912e-44bc-ad0b-8d5f61ab7300-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.073627 master-0 kubenswrapper[33572]: I1204 22:21:56.073585 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-secret-metrics-client-certs\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.074711 master-0 kubenswrapper[33572]: I1204 22:21:56.074627 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-client-ca-bundle\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.075303 master-0 kubenswrapper[33572]: I1204 22:21:56.075264 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/49753afa-912e-44bc-ad0b-8d5f61ab7300-secret-metrics-server-tls\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.090383 master-0 kubenswrapper[33572]: I1204 22:21:56.090337 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5wmw\" (UniqueName: \"kubernetes.io/projected/49753afa-912e-44bc-ad0b-8d5f61ab7300-kube-api-access-q5wmw\") pod \"metrics-server-65f77db9b4-9s9lq\" (UID: \"49753afa-912e-44bc-ad0b-8d5f61ab7300\") " pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.168980 master-0 kubenswrapper[33572]: I1204 22:21:56.168921 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-79f5646748-zd47k"] Dec 04 22:21:56.171152 master-0 kubenswrapper[33572]: I1204 22:21:56.170534 33572 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.174696 master-0 kubenswrapper[33572]: I1204 22:21:56.174570 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Dec 04 22:21:56.174696 master-0 kubenswrapper[33572]: I1204 22:21:56.174642 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Dec 04 22:21:56.174903 master-0 kubenswrapper[33572]: I1204 22:21:56.174821 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.174903 master-0 kubenswrapper[33572]: I1204 22:21:56.174868 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Dec 04 22:21:56.175054 master-0 kubenswrapper[33572]: I1204 22:21:56.174938 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-serving-certs-ca-bundle\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.175054 master-0 kubenswrapper[33572]: I1204 22:21:56.174994 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-metrics-client-ca\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.175279 master-0 kubenswrapper[33572]: I1204 22:21:56.175249 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-secret-telemeter-client\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.175334 master-0 kubenswrapper[33572]: I1204 22:21:56.175281 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Dec 04 22:21:56.175334 master-0 kubenswrapper[33572]: I1204 22:21:56.175290 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.175729 master-0 kubenswrapper[33572]: I1204 22:21:56.175399 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.175729 master-0 kubenswrapper[33572]: I1204 22:21:56.175589 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-federate-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.175729 master-0 kubenswrapper[33572]: I1204 22:21:56.175660 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8sh7\" (UniqueName: \"kubernetes.io/projected/0153e881-4d2d-4ff6-9e70-d6163a62970c-kube-api-access-d8sh7\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.178750 master-0 kubenswrapper[33572]: I1204 22:21:56.178153 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:21:56.179188 master-0 kubenswrapper[33572]: I1204 22:21:56.179073 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Dec 04 22:21:56.185297 master-0 kubenswrapper[33572]: I1204 22:21:56.183750 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-56c9b9fa8d9gs" Dec 04 22:21:56.187040 master-0 kubenswrapper[33572]: I1204 22:21:56.187000 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79f5646748-zd47k"] Dec 04 22:21:56.277488 master-0 kubenswrapper[33572]: I1204 22:21:56.277442 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-secret-telemeter-client\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.277488 master-0 kubenswrapper[33572]: I1204 22:21:56.277488 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.277683 master-0 kubenswrapper[33572]: I1204 22:21:56.277528 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.277683 master-0 kubenswrapper[33572]: I1204 22:21:56.277552 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-federate-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: 
\"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.277683 master-0 kubenswrapper[33572]: I1204 22:21:56.277577 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8sh7\" (UniqueName: \"kubernetes.io/projected/0153e881-4d2d-4ff6-9e70-d6163a62970c-kube-api-access-d8sh7\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.277683 master-0 kubenswrapper[33572]: I1204 22:21:56.277607 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.277683 master-0 kubenswrapper[33572]: I1204 22:21:56.277632 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-serving-certs-ca-bundle\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.277683 master-0 kubenswrapper[33572]: I1204 22:21:56.277654 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-metrics-client-ca\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.278534 master-0 kubenswrapper[33572]: I1204 22:21:56.278480 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-metrics-client-ca\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.279397 master-0 kubenswrapper[33572]: I1204 22:21:56.279365 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-serving-certs-ca-bundle\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.279474 master-0 kubenswrapper[33572]: E1204 22:21:56.279434 33572 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 04 22:21:56.279558 master-0 kubenswrapper[33572]: E1204 22:21:56.279479 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls podName:0153e881-4d2d-4ff6-9e70-d6163a62970c nodeName:}" failed. No retries permitted until 2025-12-04 22:21:56.779463571 +0000 UTC m=+180.506989220 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls") pod "telemeter-client-79f5646748-zd47k" (UID: "0153e881-4d2d-4ff6-9e70-d6163a62970c") : secret "telemeter-client-tls" not found Dec 04 22:21:56.280747 master-0 kubenswrapper[33572]: I1204 22:21:56.280706 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.286681 master-0 kubenswrapper[33572]: I1204 22:21:56.286581 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-federate-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.287907 master-0 kubenswrapper[33572]: I1204 22:21:56.287846 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.288944 master-0 kubenswrapper[33572]: I1204 22:21:56.288896 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-secret-telemeter-client\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.300401 master-0 kubenswrapper[33572]: I1204 22:21:56.300357 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8sh7\" (UniqueName: \"kubernetes.io/projected/0153e881-4d2d-4ff6-9e70-d6163a62970c-kube-api-access-d8sh7\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.610446 master-0 kubenswrapper[33572]: I1204 22:21:56.610375 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-65f77db9b4-9s9lq"] Dec 04 22:21:56.616041 master-0 kubenswrapper[33572]: W1204 22:21:56.615998 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49753afa_912e_44bc_ad0b_8d5f61ab7300.slice/crio-4d741f7704db4830f8d6d51ecad380175497f06f46a43a4cbec357a76addfb7e WatchSource:0}: Error finding container 4d741f7704db4830f8d6d51ecad380175497f06f46a43a4cbec357a76addfb7e: Status 404 returned error can't find the container with id 4d741f7704db4830f8d6d51ecad380175497f06f46a43a4cbec357a76addfb7e Dec 04 22:21:56.691031 master-0 kubenswrapper[33572]: I1204 22:21:56.690879 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" 
event={"ID":"30c16431-1b9d-4c3c-a570-b208d3ec0e95","Type":"ContainerStarted","Data":"7188ebeba39ddf1e5f37e77f171a2097e523fde2d6be5dd445b40bbaaa4f70cf"} Dec 04 22:21:56.691031 master-0 kubenswrapper[33572]: I1204 22:21:56.690981 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" event={"ID":"30c16431-1b9d-4c3c-a570-b208d3ec0e95","Type":"ContainerStarted","Data":"74cc3aa54c4f1d35fb3f963d60b78a16da69eb8ee7fe76c96dc9002f95d30081"} Dec 04 22:21:56.691031 master-0 kubenswrapper[33572]: I1204 22:21:56.690995 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" event={"ID":"30c16431-1b9d-4c3c-a570-b208d3ec0e95","Type":"ContainerStarted","Data":"9af8364c9e54451159cef1f58077fbe30cd629db9085a69d30fcb52ec5667dc1"} Dec 04 22:21:56.692662 master-0 kubenswrapper[33572]: I1204 22:21:56.692624 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" event={"ID":"49753afa-912e-44bc-ad0b-8d5f61ab7300","Type":"ContainerStarted","Data":"4d741f7704db4830f8d6d51ecad380175497f06f46a43a4cbec357a76addfb7e"} Dec 04 22:21:56.784125 master-0 kubenswrapper[33572]: I1204 22:21:56.784072 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:56.784651 master-0 kubenswrapper[33572]: E1204 22:21:56.784633 33572 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 04 22:21:56.784782 master-0 kubenswrapper[33572]: E1204 22:21:56.784769 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls podName:0153e881-4d2d-4ff6-9e70-d6163a62970c nodeName:}" failed. No retries permitted until 2025-12-04 22:21:57.784747496 +0000 UTC m=+181.512273145 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls") pod "telemeter-client-79f5646748-zd47k" (UID: "0153e881-4d2d-4ff6-9e70-d6163a62970c") : secret "telemeter-client-tls" not found Dec 04 22:21:57.703677 master-0 kubenswrapper[33572]: I1204 22:21:57.703622 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" event={"ID":"49753afa-912e-44bc-ad0b-8d5f61ab7300","Type":"ContainerStarted","Data":"f3f8142bcee720a82fdd9e22b0d8022f02332c2acb870847cf38c712b3f54e10"} Dec 04 22:21:57.733419 master-0 kubenswrapper[33572]: I1204 22:21:57.733331 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" podStartSLOduration=2.733308237 podStartE2EDuration="2.733308237s" podCreationTimestamp="2025-12-04 22:21:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:21:57.732477704 +0000 UTC m=+181.460003363" watchObservedRunningTime="2025-12-04 22:21:57.733308237 +0000 UTC m=+181.460833886" Dec 04 22:21:57.802897 master-0 kubenswrapper[33572]: I1204 22:21:57.802813 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:57.803201 master-0 kubenswrapper[33572]: E1204 22:21:57.803128 33572 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 04 22:21:57.803287 master-0 kubenswrapper[33572]: E1204 22:21:57.803263 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls podName:0153e881-4d2d-4ff6-9e70-d6163a62970c nodeName:}" failed. No retries permitted until 2025-12-04 22:21:59.803233866 +0000 UTC m=+183.530759525 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls") pod "telemeter-client-79f5646748-zd47k" (UID: "0153e881-4d2d-4ff6-9e70-d6163a62970c") : secret "telemeter-client-tls" not found Dec 04 22:21:58.718086 master-0 kubenswrapper[33572]: I1204 22:21:58.718008 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" event={"ID":"30c16431-1b9d-4c3c-a570-b208d3ec0e95","Type":"ContainerStarted","Data":"00395c2498781c9c41ae5f91a2b52a2bf57d8419e00b813887f7a5721a2dfc3a"} Dec 04 22:21:58.719597 master-0 kubenswrapper[33572]: I1204 22:21:58.719563 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" event={"ID":"30c16431-1b9d-4c3c-a570-b208d3ec0e95","Type":"ContainerStarted","Data":"64109ca13c2c682718720f6ecab4c864d0b2b3c59b9e9f0e5937c2a8f85e5395"} Dec 04 22:21:58.719741 master-0 kubenswrapper[33572]: I1204 22:21:58.719721 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" event={"ID":"30c16431-1b9d-4c3c-a570-b208d3ec0e95","Type":"ContainerStarted","Data":"e70b2b5ccf28d3fdcd7afad5733cf220d25e0fe87e401f079d7d116614bb8313"} Dec 04 22:21:58.761773 master-0 kubenswrapper[33572]: I1204 22:21:58.761686 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" podStartSLOduration=2.100916698 podStartE2EDuration="5.761661942s" podCreationTimestamp="2025-12-04 22:21:53 +0000 UTC" firstStartedPulling="2025-12-04 22:21:54.007116389 +0000 UTC m=+177.734642078" lastFinishedPulling="2025-12-04 22:21:57.667861673 +0000 UTC m=+181.395387322" observedRunningTime="2025-12-04 22:21:58.757710503 +0000 UTC m=+182.485236192" watchObservedRunningTime="2025-12-04 22:21:58.761661942 +0000 UTC m=+182.489187601" Dec 04 22:21:59.726558 master-0 kubenswrapper[33572]: I1204 22:21:59.726446 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:21:59.843402 master-0 kubenswrapper[33572]: I1204 22:21:59.843289 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:21:59.843949 master-0 kubenswrapper[33572]: E1204 22:21:59.843633 33572 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 04 22:21:59.843949 master-0 kubenswrapper[33572]: E1204 22:21:59.843705 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls podName:0153e881-4d2d-4ff6-9e70-d6163a62970c nodeName:}" failed. No retries permitted until 2025-12-04 22:22:03.843685984 +0000 UTC m=+187.571211633 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls") pod "telemeter-client-79f5646748-zd47k" (UID: "0153e881-4d2d-4ff6-9e70-d6163a62970c") : secret "telemeter-client-tls" not found Dec 04 22:22:03.510895 master-0 kubenswrapper[33572]: I1204 22:22:03.510804 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6c8647588d-8b8m8" Dec 04 22:22:03.749161 master-0 kubenswrapper[33572]: I1204 22:22:03.749040 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 22:22:03.919267 master-0 kubenswrapper[33572]: I1204 22:22:03.918931 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:22:03.919663 master-0 kubenswrapper[33572]: E1204 22:22:03.919411 33572 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 04 22:22:03.919663 master-0 kubenswrapper[33572]: E1204 22:22:03.919483 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls podName:0153e881-4d2d-4ff6-9e70-d6163a62970c nodeName:}" failed. No retries permitted until 2025-12-04 22:22:11.919461876 +0000 UTC m=+195.646987525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls") pod "telemeter-client-79f5646748-zd47k" (UID: "0153e881-4d2d-4ff6-9e70-d6163a62970c") : secret "telemeter-client-tls" not found Dec 04 22:22:11.958962 master-0 kubenswrapper[33572]: I1204 22:22:11.958837 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:22:11.960346 master-0 kubenswrapper[33572]: E1204 22:22:11.959232 33572 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 04 22:22:11.960346 master-0 kubenswrapper[33572]: E1204 22:22:11.959423 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls podName:0153e881-4d2d-4ff6-9e70-d6163a62970c nodeName:}" failed. No retries permitted until 2025-12-04 22:22:27.959380078 +0000 UTC m=+211.686905767 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls") pod "telemeter-client-79f5646748-zd47k" (UID: "0153e881-4d2d-4ff6-9e70-d6163a62970c") : secret "telemeter-client-tls" not found Dec 04 22:22:16.179214 master-0 kubenswrapper[33572]: I1204 22:22:16.179141 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:22:16.180130 master-0 kubenswrapper[33572]: I1204 22:22:16.179235 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:22:16.189781 master-0 kubenswrapper[33572]: I1204 22:22:16.189684 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:22:16.889949 master-0 kubenswrapper[33572]: I1204 22:22:16.889804 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-65f77db9b4-9s9lq" Dec 04 22:22:18.180166 master-0 kubenswrapper[33572]: I1204 22:22:18.180085 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 22:22:21.114653 master-0 kubenswrapper[33572]: I1204 22:22:21.114594 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 22:22:28.030124 master-0 kubenswrapper[33572]: I1204 22:22:28.030048 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:22:28.030807 master-0 kubenswrapper[33572]: E1204 22:22:28.030261 33572 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 04 22:22:28.030807 master-0 kubenswrapper[33572]: E1204 22:22:28.030355 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls podName:0153e881-4d2d-4ff6-9e70-d6163a62970c nodeName:}" failed. No retries permitted until 2025-12-04 22:23:00.030335488 +0000 UTC m=+243.757861137 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls") pod "telemeter-client-79f5646748-zd47k" (UID: "0153e881-4d2d-4ff6-9e70-d6163a62970c") : secret "telemeter-client-tls" not found Dec 04 22:22:28.036981 master-0 kubenswrapper[33572]: I1204 22:22:28.036909 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:22:28.039593 master-0 kubenswrapper[33572]: I1204 22:22:28.039552 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.043966 master-0 kubenswrapper[33572]: I1204 22:22:28.043910 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 04 22:22:28.044080 master-0 kubenswrapper[33572]: I1204 22:22:28.044048 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 04 22:22:28.044156 master-0 kubenswrapper[33572]: I1204 22:22:28.044134 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 04 22:22:28.044586 master-0 kubenswrapper[33572]: I1204 22:22:28.044566 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 04 22:22:28.044820 master-0 kubenswrapper[33572]: I1204 22:22:28.044792 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 04 22:22:28.044968 master-0 kubenswrapper[33572]: I1204 22:22:28.044949 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 04 22:22:28.046534 master-0 kubenswrapper[33572]: I1204 22:22:28.046510 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 04 22:22:28.065084 master-0 kubenswrapper[33572]: I1204 22:22:28.065036 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 04 22:22:28.091186 master-0 kubenswrapper[33572]: I1204 22:22:28.091139 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:22:28.131922 master-0 kubenswrapper[33572]: I1204 22:22:28.131885 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-volume\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132103 master-0 kubenswrapper[33572]: I1204 22:22:28.132085 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-kube-api-access-qlfqv\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132193 master-0 kubenswrapper[33572]: I1204 22:22:28.132178 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132295 master-0 kubenswrapper[33572]: I1204 22:22:28.132283 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-out\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132378 master-0 kubenswrapper[33572]: 
I1204 22:22:28.132366 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132482 master-0 kubenswrapper[33572]: I1204 22:22:28.132468 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132589 master-0 kubenswrapper[33572]: I1204 22:22:28.132566 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132670 master-0 kubenswrapper[33572]: I1204 22:22:28.132657 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132746 master-0 kubenswrapper[33572]: I1204 22:22:28.132734 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132818 master-0 kubenswrapper[33572]: I1204 22:22:28.132805 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.132944 master-0 kubenswrapper[33572]: I1204 22:22:28.132915 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-web-config\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.133166 master-0 kubenswrapper[33572]: I1204 22:22:28.133149 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.234533 master-0 kubenswrapper[33572]: I1204 22:22:28.234431 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-volume\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.234828 master-0 kubenswrapper[33572]: I1204 22:22:28.234748 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-kube-api-access-qlfqv\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.234924 master-0 kubenswrapper[33572]: I1204 22:22:28.234888 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235005 master-0 kubenswrapper[33572]: I1204 22:22:28.234974 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-out\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235060 master-0 kubenswrapper[33572]: I1204 22:22:28.235011 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235125 master-0 kubenswrapper[33572]: I1204 22:22:28.235109 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235173 master-0 kubenswrapper[33572]: I1204 22:22:28.235150 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235220 master-0 kubenswrapper[33572]: I1204 22:22:28.235188 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235220 master-0 kubenswrapper[33572]: I1204 22:22:28.235216 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235306 master-0 kubenswrapper[33572]: I1204 22:22:28.235243 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235362 master-0 kubenswrapper[33572]: I1204 22:22:28.235321 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-web-config\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235362 master-0 kubenswrapper[33572]: I1204 22:22:28.235350 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.235627 master-0 kubenswrapper[33572]: E1204 22:22:28.235591 33572 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 04 22:22:28.235697 master-0 kubenswrapper[33572]: E1204 22:22:28.235683 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls podName:2221d215-2187-4dd6-a2de-f0fc6ec54027 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:28.735659962 +0000 UTC m=+212.463185661 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027") : secret "alertmanager-main-tls" not found Dec 04 22:22:28.236975 master-0 kubenswrapper[33572]: I1204 22:22:28.236410 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.236975 master-0 kubenswrapper[33572]: I1204 22:22:28.236764 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.237454 master-0 kubenswrapper[33572]: I1204 22:22:28.237416 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.240366 master-0 kubenswrapper[33572]: I1204 22:22:28.238117 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-volume\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.241089 master-0 kubenswrapper[33572]: I1204 22:22:28.241048 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-web-config\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.241218 master-0 kubenswrapper[33572]: I1204 22:22:28.241140 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-out\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.241575 master-0 kubenswrapper[33572]: I1204 22:22:28.241542 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.242712 master-0 kubenswrapper[33572]: I1204 22:22:28.242044 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.242712 master-0 
kubenswrapper[33572]: I1204 22:22:28.242435 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.253805 master-0 kubenswrapper[33572]: I1204 22:22:28.253752 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.258539 master-0 kubenswrapper[33572]: I1204 22:22:28.258464 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-kube-api-access-qlfqv\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.744071 master-0 kubenswrapper[33572]: I1204 22:22:28.743940 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:28.744375 master-0 kubenswrapper[33572]: E1204 22:22:28.744142 33572 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 04 22:22:28.744375 master-0 kubenswrapper[33572]: E1204 22:22:28.744225 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls podName:2221d215-2187-4dd6-a2de-f0fc6ec54027 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:29.744204388 +0000 UTC m=+213.471730037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027") : secret "alertmanager-main-tls" not found Dec 04 22:22:29.760748 master-0 kubenswrapper[33572]: I1204 22:22:29.760626 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:29.761681 master-0 kubenswrapper[33572]: E1204 22:22:29.760888 33572 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 04 22:22:29.761681 master-0 kubenswrapper[33572]: E1204 22:22:29.761016 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls podName:2221d215-2187-4dd6-a2de-f0fc6ec54027 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:31.760989251 +0000 UTC m=+215.488514910 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027") : secret "alertmanager-main-tls" not found Dec 04 22:22:31.799797 master-0 kubenswrapper[33572]: I1204 22:22:31.799653 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:31.800631 master-0 kubenswrapper[33572]: E1204 22:22:31.800017 33572 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 04 22:22:31.800631 master-0 kubenswrapper[33572]: E1204 22:22:31.800167 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls podName:2221d215-2187-4dd6-a2de-f0fc6ec54027 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:35.800128213 +0000 UTC m=+219.527653892 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027") : secret "alertmanager-main-tls" not found Dec 04 22:22:35.872154 master-0 kubenswrapper[33572]: I1204 22:22:35.872053 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:35.872828 master-0 kubenswrapper[33572]: E1204 22:22:35.872261 33572 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 04 22:22:35.872828 master-0 kubenswrapper[33572]: E1204 22:22:35.872365 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls podName:2221d215-2187-4dd6-a2de-f0fc6ec54027 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:43.872341937 +0000 UTC m=+227.599867596 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027") : secret "alertmanager-main-tls" not found Dec 04 22:22:37.318436 master-0 kubenswrapper[33572]: I1204 22:22:37.318365 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:22:37.320595 master-0 kubenswrapper[33572]: I1204 22:22:37.320564 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.322993 master-0 kubenswrapper[33572]: I1204 22:22:37.322934 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 04 22:22:37.323181 master-0 kubenswrapper[33572]: I1204 22:22:37.323163 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 04 22:22:37.323400 master-0 kubenswrapper[33572]: I1204 22:22:37.323368 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 04 22:22:37.323600 master-0 kubenswrapper[33572]: I1204 22:22:37.323561 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 04 22:22:37.323786 master-0 kubenswrapper[33572]: I1204 22:22:37.323754 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-98i4jt5uspsnd" Dec 04 22:22:37.323963 master-0 kubenswrapper[33572]: I1204 22:22:37.323929 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 04 22:22:37.324129 master-0 kubenswrapper[33572]: I1204 22:22:37.324103 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 04 22:22:37.325195 master-0 kubenswrapper[33572]: I1204 22:22:37.325121 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 04 22:22:37.325706 master-0 kubenswrapper[33572]: I1204 22:22:37.325683 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 04 22:22:37.329525 master-0 kubenswrapper[33572]: I1204 22:22:37.329473 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 04 22:22:37.335677 master-0 kubenswrapper[33572]: I1204 22:22:37.335633 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 04 22:22:37.343917 master-0 kubenswrapper[33572]: I1204 22:22:37.343858 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 04 22:22:37.346134 master-0 kubenswrapper[33572]: I1204 22:22:37.346081 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:22:37.403822 master-0 kubenswrapper[33572]: I1204 22:22:37.403714 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.403822 master-0 kubenswrapper[33572]: I1204 22:22:37.403804 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.403851 33572 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.403886 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-config\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.403913 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.403947 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.403972 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.404003 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-web-config\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.404025 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.404056 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdz8b\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-kube-api-access-bdz8b\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.404087 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.404118 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.404141 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404198 master-0 kubenswrapper[33572]: I1204 22:22:37.404207 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404603 master-0 kubenswrapper[33572]: I1204 22:22:37.404235 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404603 master-0 kubenswrapper[33572]: I1204 22:22:37.404270 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-config-out\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404603 master-0 kubenswrapper[33572]: I1204 22:22:37.404296 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.404603 master-0 kubenswrapper[33572]: I1204 22:22:37.404338 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.506492 master-0 kubenswrapper[33572]: I1204 22:22:37.506357 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-config-out\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.506492 master-0 kubenswrapper[33572]: I1204 22:22:37.506498 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.506986 master-0 kubenswrapper[33572]: I1204 22:22:37.506603 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.506986 master-0 kubenswrapper[33572]: I1204 22:22:37.506658 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507057 master-0 kubenswrapper[33572]: I1204 22:22:37.507004 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507211 master-0 kubenswrapper[33572]: I1204 22:22:37.507172 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507323 master-0 kubenswrapper[33572]: I1204 22:22:37.507287 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-config\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507535 master-0 kubenswrapper[33572]: I1204 22:22:37.507478 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507617 master-0 kubenswrapper[33572]: I1204 22:22:37.507577 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507617 master-0 kubenswrapper[33572]: I1204 22:22:37.507613 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507692 master-0 kubenswrapper[33572]: I1204 22:22:37.507672 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-web-config\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507692 master-0 kubenswrapper[33572]: I1204 22:22:37.507690 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507767 master-0 kubenswrapper[33572]: I1204 22:22:37.507735 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdz8b\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-kube-api-access-bdz8b\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507802 master-0 kubenswrapper[33572]: I1204 22:22:37.507791 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507858 master-0 kubenswrapper[33572]: I1204 22:22:37.507837 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507899 master-0 kubenswrapper[33572]: I1204 22:22:37.507866 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.507937 master-0 kubenswrapper[33572]: I1204 22:22:37.507907 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.508090 master-0 kubenswrapper[33572]: I1204 22:22:37.508051 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") 
" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.508132 master-0 kubenswrapper[33572]: I1204 22:22:37.508095 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.508293 master-0 kubenswrapper[33572]: E1204 22:22:37.508250 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:37.508350 master-0 kubenswrapper[33572]: E1204 22:22:37.508326 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:38.0083061 +0000 UTC m=+221.735831749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:37.508844 master-0 kubenswrapper[33572]: I1204 22:22:37.508798 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.509120 master-0 kubenswrapper[33572]: E1204 22:22:37.509065 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 04 22:22:37.509293 master-0 kubenswrapper[33572]: E1204 22:22:37.509275 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:38.009241266 +0000 UTC m=+221.736766915 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-tls" not found Dec 04 22:22:37.509376 master-0 kubenswrapper[33572]: I1204 22:22:37.509085 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.510223 master-0 kubenswrapper[33572]: I1204 22:22:37.510178 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-config-out\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.510662 master-0 kubenswrapper[33572]: I1204 22:22:37.510613 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.510898 master-0 kubenswrapper[33572]: I1204 22:22:37.510859 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.510942 master-0 kubenswrapper[33572]: I1204 22:22:37.510902 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-config\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.510942 master-0 kubenswrapper[33572]: I1204 22:22:37.510912 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.511815 master-0 kubenswrapper[33572]: I1204 22:22:37.511730 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.512402 master-0 kubenswrapper[33572]: I1204 22:22:37.512348 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.512455 master-0 kubenswrapper[33572]: I1204 22:22:37.512356 33572 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.513155 master-0 kubenswrapper[33572]: I1204 22:22:37.513084 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.513451 master-0 kubenswrapper[33572]: I1204 22:22:37.513227 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-web-config\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.515940 master-0 kubenswrapper[33572]: I1204 22:22:37.515892 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.527015 master-0 kubenswrapper[33572]: I1204 22:22:37.526966 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:37.530822 master-0 kubenswrapper[33572]: I1204 22:22:37.530793 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdz8b\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-kube-api-access-bdz8b\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:38.016900 master-0 kubenswrapper[33572]: I1204 22:22:38.016783 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:38.017315 master-0 kubenswrapper[33572]: E1204 22:22:38.017111 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 04 22:22:38.017315 master-0 kubenswrapper[33572]: E1204 22:22:38.017273 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:39.017242666 +0000 UTC m=+222.744768355 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-tls" not found Dec 04 22:22:38.017415 master-0 kubenswrapper[33572]: I1204 22:22:38.017140 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:38.017415 master-0 kubenswrapper[33572]: E1204 22:22:38.017327 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:38.017490 master-0 kubenswrapper[33572]: E1204 22:22:38.017435 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:39.017404211 +0000 UTC m=+222.744929950 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:39.034173 master-0 kubenswrapper[33572]: I1204 22:22:39.034076 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:39.035159 master-0 kubenswrapper[33572]: I1204 22:22:39.034225 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:39.035159 master-0 kubenswrapper[33572]: E1204 22:22:39.034349 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 04 22:22:39.035159 master-0 kubenswrapper[33572]: E1204 22:22:39.034490 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:41.034455852 +0000 UTC m=+224.761981541 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-tls" not found Dec 04 22:22:39.035159 master-0 kubenswrapper[33572]: E1204 22:22:39.034582 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:39.035159 master-0 kubenswrapper[33572]: E1204 22:22:39.034725 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:41.034683918 +0000 UTC m=+224.762209597 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:41.069120 master-0 kubenswrapper[33572]: I1204 22:22:41.069018 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:41.070043 master-0 kubenswrapper[33572]: I1204 22:22:41.069149 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:41.070043 master-0 kubenswrapper[33572]: E1204 22:22:41.069301 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 04 22:22:41.070043 master-0 kubenswrapper[33572]: E1204 22:22:41.069410 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:45.069383725 +0000 UTC m=+228.796909414 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-tls" not found Dec 04 22:22:41.070043 master-0 kubenswrapper[33572]: E1204 22:22:41.069454 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:41.070043 master-0 kubenswrapper[33572]: E1204 22:22:41.069617 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. 
No retries permitted until 2025-12-04 22:22:45.06957983 +0000 UTC m=+228.797105539 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:43.925082 master-0 kubenswrapper[33572]: I1204 22:22:43.924984 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:43.926036 master-0 kubenswrapper[33572]: E1204 22:22:43.925338 33572 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 04 22:22:43.926036 master-0 kubenswrapper[33572]: E1204 22:22:43.925460 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls podName:2221d215-2187-4dd6-a2de-f0fc6ec54027 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:59.925430369 +0000 UTC m=+243.652956048 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027") : secret "alertmanager-main-tls" not found Dec 04 22:22:45.150026 master-0 kubenswrapper[33572]: I1204 22:22:45.149854 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:45.150690 master-0 kubenswrapper[33572]: I1204 22:22:45.150133 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:45.150742 master-0 kubenswrapper[33572]: E1204 22:22:45.150704 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:45.150803 master-0 kubenswrapper[33572]: E1204 22:22:45.150785 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:53.150763216 +0000 UTC m=+236.878288865 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 04 22:22:45.151782 master-0 kubenswrapper[33572]: E1204 22:22:45.151762 33572 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 04 22:22:45.151857 master-0 kubenswrapper[33572]: E1204 22:22:45.151809 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls podName:c1b7c3b1-3850-4a37-9a36-e84537557071 nodeName:}" failed. No retries permitted until 2025-12-04 22:22:53.151797155 +0000 UTC m=+236.879322804 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071") : secret "prometheus-k8s-tls" not found Dec 04 22:22:53.204407 master-0 kubenswrapper[33572]: I1204 22:22:53.204250 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:53.204407 master-0 kubenswrapper[33572]: I1204 22:22:53.204369 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:53.212274 master-0 kubenswrapper[33572]: I1204 22:22:53.211184 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:53.223014 master-0 kubenswrapper[33572]: I1204 22:22:53.222938 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:53.288984 master-0 kubenswrapper[33572]: I1204 22:22:53.288871 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:22:53.770799 master-0 kubenswrapper[33572]: I1204 22:22:53.770715 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:22:53.775494 master-0 kubenswrapper[33572]: W1204 22:22:53.775424 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1b7c3b1_3850_4a37_9a36_e84537557071.slice/crio-5a7c26bd0a3ca94fcc0a15c70a4aff69934c912a02e82c8167d934151047373d WatchSource:0}: Error finding container 5a7c26bd0a3ca94fcc0a15c70a4aff69934c912a02e82c8167d934151047373d: Status 404 returned error can't find the container with id 5a7c26bd0a3ca94fcc0a15c70a4aff69934c912a02e82c8167d934151047373d Dec 04 22:22:54.233657 master-0 kubenswrapper[33572]: I1204 22:22:54.233448 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerStarted","Data":"5a7c26bd0a3ca94fcc0a15c70a4aff69934c912a02e82c8167d934151047373d"} Dec 04 22:22:56.259146 master-0 kubenswrapper[33572]: I1204 22:22:56.258952 33572 generic.go:334] "Generic (PLEG): container finished" podID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" exitCode=0 Dec 04 22:22:56.259146 master-0 kubenswrapper[33572]: I1204 22:22:56.259043 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerDied","Data":"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8"} Dec 04 22:22:59.053320 master-0 kubenswrapper[33572]: I1204 22:22:59.053245 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5c4bw"] Dec 04 22:22:59.054400 master-0 kubenswrapper[33572]: I1204 22:22:59.054367 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.056546 master-0 kubenswrapper[33572]: I1204 22:22:59.056461 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-qh5vm" Dec 04 22:22:59.056658 master-0 kubenswrapper[33572]: I1204 22:22:59.056549 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 22:22:59.151107 master-0 kubenswrapper[33572]: I1204 22:22:59.151014 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/924b2123-1f68-49a0-9c4f-ae978be03e40-host\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.151376 master-0 kubenswrapper[33572]: I1204 22:22:59.151139 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/924b2123-1f68-49a0-9c4f-ae978be03e40-serviceca\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.151376 master-0 kubenswrapper[33572]: I1204 22:22:59.151214 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gp6m\" (UniqueName: \"kubernetes.io/projected/924b2123-1f68-49a0-9c4f-ae978be03e40-kube-api-access-7gp6m\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.253237 master-0 kubenswrapper[33572]: I1204 22:22:59.253139 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/924b2123-1f68-49a0-9c4f-ae978be03e40-host\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.253427 master-0 kubenswrapper[33572]: I1204 22:22:59.253334 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/924b2123-1f68-49a0-9c4f-ae978be03e40-host\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.253534 master-0 kubenswrapper[33572]: I1204 22:22:59.253452 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/924b2123-1f68-49a0-9c4f-ae978be03e40-serviceca\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.253798 master-0 kubenswrapper[33572]: I1204 22:22:59.253750 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gp6m\" (UniqueName: \"kubernetes.io/projected/924b2123-1f68-49a0-9c4f-ae978be03e40-kube-api-access-7gp6m\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.255831 master-0 kubenswrapper[33572]: I1204 22:22:59.254148 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/924b2123-1f68-49a0-9c4f-ae978be03e40-serviceca\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" 
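The secret mount failures above for alertmanager-main-tls, prometheus-k8s-tls, and prometheus-k8s-thanos-sidecar-tls follow the kubelet's per-volume retry backoff: each nestedpendingoperations.go error doubles durationBeforeRetry (500ms, 1s, 2s, 4s, 8s, and 16s for the alertmanager secret) until the secret exists, at which point the next MountVolume.SetUp attempt succeeds. What follows is a minimal Go sketch of that doubling schedule, inferred from the durationBeforeRetry values printed in this log rather than taken from the kubelet source; the cap used below is an assumption for illustration only.

    // Sketch only: reproduces the doubling retry delays reported by the
    // nestedpendingoperations lines above (500ms, 1s, 2s, 4s, 8s, 16s, ...).
    // The 500ms start and the doubling factor are read off the log lines;
    // the cap is an assumed illustration value, not confirmed by this log.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const (
            initialDelay = 500 * time.Millisecond // first durationBeforeRetry seen in the log
            assumedCap   = 2 * time.Minute        // assumption, not shown in this log
        )
        delay := initialDelay
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: no retries permitted for %s\n", attempt, delay)
            if delay < assumedCap {
                delay *= 2
                if delay > assumedCap {
                    delay = assumedCap
                }
            }
        }
    }

Once the missing secrets are created, the pending operations clear on their next retry window, as the SetUp succeeded entries show (22:22:53.211 and 22:22:53.222 above for the Prometheus secrets, 22:22:59.985 below for alertmanager-main-tls), and sandbox creation for the affected pods proceeds.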
Dec 04 22:22:59.274909 master-0 kubenswrapper[33572]: I1204 22:22:59.274854 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gp6m\" (UniqueName: \"kubernetes.io/projected/924b2123-1f68-49a0-9c4f-ae978be03e40-kube-api-access-7gp6m\") pod \"node-ca-5c4bw\" (UID: \"924b2123-1f68-49a0-9c4f-ae978be03e40\") " pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.382548 master-0 kubenswrapper[33572]: I1204 22:22:59.382326 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5c4bw" Dec 04 22:22:59.602037 master-0 kubenswrapper[33572]: I1204 22:22:59.599836 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-54dbc87ccb-bgbjl"] Dec 04 22:22:59.602037 master-0 kubenswrapper[33572]: I1204 22:22:59.601285 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.610652 master-0 kubenswrapper[33572]: I1204 22:22:59.607611 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-lvtzq" Dec 04 22:22:59.610652 master-0 kubenswrapper[33572]: I1204 22:22:59.607911 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 22:22:59.610652 master-0 kubenswrapper[33572]: I1204 22:22:59.610396 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 22:22:59.611038 master-0 kubenswrapper[33572]: I1204 22:22:59.610923 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 22:22:59.613236 master-0 kubenswrapper[33572]: I1204 22:22:59.611816 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 22:22:59.613645 master-0 kubenswrapper[33572]: I1204 22:22:59.613354 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 22:22:59.636625 master-0 kubenswrapper[33572]: I1204 22:22:59.635383 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-54dbc87ccb-bgbjl"] Dec 04 22:22:59.674533 master-0 kubenswrapper[33572]: I1204 22:22:59.673738 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd4hh\" (UniqueName: \"kubernetes.io/projected/7500e099-8073-4485-9cc4-f4ad90556339-kube-api-access-sd4hh\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.674533 master-0 kubenswrapper[33572]: I1204 22:22:59.673798 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7500e099-8073-4485-9cc4-f4ad90556339-trusted-ca\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.674533 master-0 kubenswrapper[33572]: I1204 22:22:59.673894 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7500e099-8073-4485-9cc4-f4ad90556339-serving-cert\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.674533 master-0 kubenswrapper[33572]: I1204 22:22:59.673941 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7500e099-8073-4485-9cc4-f4ad90556339-config\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.776054 master-0 kubenswrapper[33572]: I1204 22:22:59.775984 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd4hh\" (UniqueName: \"kubernetes.io/projected/7500e099-8073-4485-9cc4-f4ad90556339-kube-api-access-sd4hh\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.776054 master-0 kubenswrapper[33572]: I1204 22:22:59.776048 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7500e099-8073-4485-9cc4-f4ad90556339-trusted-ca\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.776454 master-0 kubenswrapper[33572]: I1204 22:22:59.776308 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7500e099-8073-4485-9cc4-f4ad90556339-serving-cert\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.779186 master-0 kubenswrapper[33572]: I1204 22:22:59.776539 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7500e099-8073-4485-9cc4-f4ad90556339-config\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.779186 master-0 kubenswrapper[33572]: I1204 22:22:59.777728 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7500e099-8073-4485-9cc4-f4ad90556339-config\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.779844 master-0 kubenswrapper[33572]: I1204 22:22:59.779786 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7500e099-8073-4485-9cc4-f4ad90556339-trusted-ca\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.792607 master-0 kubenswrapper[33572]: I1204 22:22:59.792529 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd4hh\" (UniqueName: \"kubernetes.io/projected/7500e099-8073-4485-9cc4-f4ad90556339-kube-api-access-sd4hh\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: 
\"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.793288 master-0 kubenswrapper[33572]: I1204 22:22:59.793229 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7500e099-8073-4485-9cc4-f4ad90556339-serving-cert\") pod \"console-operator-54dbc87ccb-bgbjl\" (UID: \"7500e099-8073-4485-9cc4-f4ad90556339\") " pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:22:59.979864 master-0 kubenswrapper[33572]: I1204 22:22:59.979689 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:59.985533 master-0 kubenswrapper[33572]: I1204 22:22:59.985470 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:22:59.998941 master-0 kubenswrapper[33572]: I1204 22:22:59.998883 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:23:00.081431 master-0 kubenswrapper[33572]: I1204 22:23:00.081348 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:23:00.086326 master-0 kubenswrapper[33572]: I1204 22:23:00.086242 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/0153e881-4d2d-4ff6-9e70-d6163a62970c-telemeter-client-tls\") pod \"telemeter-client-79f5646748-zd47k\" (UID: \"0153e881-4d2d-4ff6-9e70-d6163a62970c\") " pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:23:00.109954 master-0 kubenswrapper[33572]: I1204 22:23:00.109848 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" Dec 04 22:23:00.158566 master-0 kubenswrapper[33572]: I1204 22:23:00.158433 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:23:00.166970 master-0 kubenswrapper[33572]: W1204 22:23:00.166912 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod924b2123_1f68_49a0_9c4f_ae978be03e40.slice/crio-271a1c71cfe0846def3bd8dc39333c4b410a475b6df51c332b030f8515d10e7e WatchSource:0}: Error finding container 271a1c71cfe0846def3bd8dc39333c4b410a475b6df51c332b030f8515d10e7e: Status 404 returned error can't find the container with id 271a1c71cfe0846def3bd8dc39333c4b410a475b6df51c332b030f8515d10e7e Dec 04 22:23:00.301548 master-0 kubenswrapper[33572]: I1204 22:23:00.300960 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5c4bw" event={"ID":"924b2123-1f68-49a0-9c4f-ae978be03e40","Type":"ContainerStarted","Data":"271a1c71cfe0846def3bd8dc39333c4b410a475b6df51c332b030f8515d10e7e"} Dec 04 22:23:00.601738 master-0 kubenswrapper[33572]: I1204 22:23:00.601668 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-54dbc87ccb-bgbjl"] Dec 04 22:23:00.605998 master-0 kubenswrapper[33572]: W1204 22:23:00.605898 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7500e099_8073_4485_9cc4_f4ad90556339.slice/crio-8f239029e270354cef5970d3580787c8f80ec31c061ce0e9ebab0bef879c892b WatchSource:0}: Error finding container 8f239029e270354cef5970d3580787c8f80ec31c061ce0e9ebab0bef879c892b: Status 404 returned error can't find the container with id 8f239029e270354cef5970d3580787c8f80ec31c061ce0e9ebab0bef879c892b Dec 04 22:23:00.680413 master-0 kubenswrapper[33572]: I1204 22:23:00.680352 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79f5646748-zd47k"] Dec 04 22:23:00.774934 master-0 kubenswrapper[33572]: I1204 22:23:00.774858 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:23:01.335247 master-0 kubenswrapper[33572]: I1204 22:23:01.335162 33572 generic.go:334] "Generic (PLEG): container finished" podID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerID="374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf" exitCode=0 Dec 04 22:23:01.336738 master-0 kubenswrapper[33572]: I1204 22:23:01.336695 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerDied","Data":"374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf"} Dec 04 22:23:01.336834 master-0 kubenswrapper[33572]: I1204 22:23:01.336740 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerStarted","Data":"2fb06c238ba297f72251dba86e44e92933bfc5c6b7c2e7001636729770ba92c5"} Dec 04 22:23:01.339212 master-0 kubenswrapper[33572]: I1204 22:23:01.339139 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" event={"ID":"0153e881-4d2d-4ff6-9e70-d6163a62970c","Type":"ContainerStarted","Data":"0952462eaeb7e8f9e847975e77c64b778d212a43ba7aba44ab823b41dab5a2ef"} Dec 04 22:23:01.343740 master-0 kubenswrapper[33572]: I1204 22:23:01.343645 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerStarted","Data":"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e"} Dec 04 22:23:01.343740 master-0 kubenswrapper[33572]: I1204 22:23:01.343733 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerStarted","Data":"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48"} Dec 04 22:23:01.344077 master-0 kubenswrapper[33572]: I1204 22:23:01.343748 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerStarted","Data":"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f"} Dec 04 22:23:01.344077 master-0 kubenswrapper[33572]: I1204 22:23:01.343763 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerStarted","Data":"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724"} Dec 04 22:23:01.344077 master-0 kubenswrapper[33572]: I1204 22:23:01.343776 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerStarted","Data":"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426"} Dec 04 22:23:01.345087 master-0 kubenswrapper[33572]: I1204 22:23:01.345055 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" event={"ID":"7500e099-8073-4485-9cc4-f4ad90556339","Type":"ContainerStarted","Data":"8f239029e270354cef5970d3580787c8f80ec31c061ce0e9ebab0bef879c892b"} Dec 04 22:23:01.420456 master-0 kubenswrapper[33572]: I1204 22:23:01.420363 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=18.033608933 podStartE2EDuration="24.420340351s" podCreationTimestamp="2025-12-04 22:22:37 +0000 UTC" firstStartedPulling="2025-12-04 22:22:53.778541093 +0000 UTC m=+237.506066742" lastFinishedPulling="2025-12-04 22:23:00.165272511 +0000 UTC m=+243.892798160" observedRunningTime="2025-12-04 22:23:01.413547893 +0000 UTC m=+245.141073562" watchObservedRunningTime="2025-12-04 22:23:01.420340351 +0000 UTC m=+245.147866000" Dec 04 22:23:02.357766 master-0 kubenswrapper[33572]: I1204 22:23:02.357691 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerStarted","Data":"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4"} Dec 04 22:23:03.289592 master-0 kubenswrapper[33572]: I1204 22:23:03.289533 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:23:03.812739 master-0 kubenswrapper[33572]: I1204 22:23:03.812674 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6559dcc668-87vwg"] Dec 04 22:23:03.814063 master-0 kubenswrapper[33572]: I1204 22:23:03.814033 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" Dec 04 22:23:03.816707 master-0 kubenswrapper[33572]: I1204 22:23:03.816664 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-h6fn4" Dec 04 22:23:03.817087 master-0 kubenswrapper[33572]: I1204 22:23:03.817019 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 04 22:23:03.824561 master-0 kubenswrapper[33572]: I1204 22:23:03.824484 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6559dcc668-87vwg"] Dec 04 22:23:03.861372 master-0 kubenswrapper[33572]: I1204 22:23:03.861283 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9414862e-1a68-4ed3-8229-f25e6a2ba0fa-monitoring-plugin-cert\") pod \"monitoring-plugin-6559dcc668-87vwg\" (UID: \"9414862e-1a68-4ed3-8229-f25e6a2ba0fa\") " pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" Dec 04 22:23:03.962547 master-0 kubenswrapper[33572]: I1204 22:23:03.962461 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9414862e-1a68-4ed3-8229-f25e6a2ba0fa-monitoring-plugin-cert\") pod \"monitoring-plugin-6559dcc668-87vwg\" (UID: \"9414862e-1a68-4ed3-8229-f25e6a2ba0fa\") " pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" Dec 04 22:23:03.981709 master-0 kubenswrapper[33572]: I1204 22:23:03.981645 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9414862e-1a68-4ed3-8229-f25e6a2ba0fa-monitoring-plugin-cert\") pod \"monitoring-plugin-6559dcc668-87vwg\" (UID: \"9414862e-1a68-4ed3-8229-f25e6a2ba0fa\") " pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" Dec 04 22:23:04.148963 master-0 kubenswrapper[33572]: I1204 22:23:04.148651 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" Dec 04 22:23:04.639588 master-0 kubenswrapper[33572]: I1204 22:23:04.639474 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6559dcc668-87vwg"] Dec 04 22:23:04.649492 master-0 kubenswrapper[33572]: W1204 22:23:04.649438 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9414862e_1a68_4ed3_8229_f25e6a2ba0fa.slice/crio-b12b5b8acee8827094ad3078d3b07dc04d490a7ee556b96ef3efdbba423bd6d9 WatchSource:0}: Error finding container b12b5b8acee8827094ad3078d3b07dc04d490a7ee556b96ef3efdbba423bd6d9: Status 404 returned error can't find the container with id b12b5b8acee8827094ad3078d3b07dc04d490a7ee556b96ef3efdbba423bd6d9 Dec 04 22:23:05.302307 master-0 kubenswrapper[33572]: I1204 22:23:05.302232 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-69cd4c69bf-b4qng"] Dec 04 22:23:05.303174 master-0 kubenswrapper[33572]: I1204 22:23:05.303147 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-69cd4c69bf-b4qng" Dec 04 22:23:05.310983 master-0 kubenswrapper[33572]: I1204 22:23:05.310926 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-phx28" Dec 04 22:23:05.311199 master-0 kubenswrapper[33572]: I1204 22:23:05.311042 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 22:23:05.311467 master-0 kubenswrapper[33572]: I1204 22:23:05.311414 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 22:23:05.322383 master-0 kubenswrapper[33572]: I1204 22:23:05.322303 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-69cd4c69bf-b4qng"] Dec 04 22:23:05.392933 master-0 kubenswrapper[33572]: I1204 22:23:05.392613 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5c4bw" event={"ID":"924b2123-1f68-49a0-9c4f-ae978be03e40","Type":"ContainerStarted","Data":"4e5874366e02c05148cbe5cd946f19f2a6250c1a174a03d0895df420b1ca4b9e"} Dec 04 22:23:05.395218 master-0 kubenswrapper[33572]: I1204 22:23:05.395166 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" event={"ID":"7500e099-8073-4485-9cc4-f4ad90556339","Type":"ContainerStarted","Data":"525e12d1312cd7536122629a10805811a699ec1de4c56a57dc105e21c1b323c9"} Dec 04 22:23:05.395451 master-0 kubenswrapper[33572]: I1204 22:23:05.395397 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:23:05.395872 master-0 kubenswrapper[33572]: I1204 22:23:05.395836 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m87cl\" (UniqueName: \"kubernetes.io/projected/4c1b60d6-ca5c-40b5-a567-a48ef956163b-kube-api-access-m87cl\") pod \"downloads-69cd4c69bf-b4qng\" (UID: \"4c1b60d6-ca5c-40b5-a567-a48ef956163b\") " pod="openshift-console/downloads-69cd4c69bf-b4qng" Dec 04 22:23:05.403310 master-0 kubenswrapper[33572]: I1204 22:23:05.403231 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" Dec 04 22:23:05.409732 master-0 kubenswrapper[33572]: I1204 22:23:05.409286 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerStarted","Data":"d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180"} Dec 04 22:23:05.409732 master-0 kubenswrapper[33572]: I1204 22:23:05.409438 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerStarted","Data":"324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce"} Dec 04 22:23:05.409732 master-0 kubenswrapper[33572]: I1204 22:23:05.409450 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerStarted","Data":"fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac"} Dec 04 22:23:05.409732 master-0 kubenswrapper[33572]: I1204 22:23:05.409470 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerStarted","Data":"2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd"} Dec 04 22:23:05.409732 master-0 kubenswrapper[33572]: I1204 22:23:05.409483 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerStarted","Data":"56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2"} Dec 04 22:23:05.413794 master-0 kubenswrapper[33572]: I1204 22:23:05.413749 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79f5646748-zd47k_0153e881-4d2d-4ff6-9e70-d6163a62970c/telemeter-client/0.log" Dec 04 22:23:05.413875 master-0 kubenswrapper[33572]: I1204 22:23:05.413824 33572 generic.go:334] "Generic (PLEG): container finished" podID="0153e881-4d2d-4ff6-9e70-d6163a62970c" containerID="e2a63d079e5ee11958ec39f8f4a8f847c989ad7e18e50f889fb7e7fc75f4ff71" exitCode=1 Dec 04 22:23:05.413942 master-0 kubenswrapper[33572]: I1204 22:23:05.413899 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" event={"ID":"0153e881-4d2d-4ff6-9e70-d6163a62970c","Type":"ContainerStarted","Data":"d72b55e9ea78ffbcf8282934c892e2c916012a7e67b0f1a49e01e690cdc6ad9e"} Dec 04 22:23:05.413942 master-0 kubenswrapper[33572]: I1204 22:23:05.413938 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" event={"ID":"0153e881-4d2d-4ff6-9e70-d6163a62970c","Type":"ContainerStarted","Data":"17da0ab850254a2a063d8712b7f6bf1daeb7320e69b76aa8d6aa2533dd59d636"} Dec 04 22:23:05.414038 master-0 kubenswrapper[33572]: I1204 22:23:05.413949 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" event={"ID":"0153e881-4d2d-4ff6-9e70-d6163a62970c","Type":"ContainerDied","Data":"e2a63d079e5ee11958ec39f8f4a8f847c989ad7e18e50f889fb7e7fc75f4ff71"} Dec 04 22:23:05.414692 master-0 kubenswrapper[33572]: I1204 22:23:05.414660 33572 scope.go:117] "RemoveContainer" containerID="e2a63d079e5ee11958ec39f8f4a8f847c989ad7e18e50f889fb7e7fc75f4ff71" Dec 04 22:23:05.415172 master-0 kubenswrapper[33572]: I1204 22:23:05.415087 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5c4bw" podStartSLOduration=2.364879515 podStartE2EDuration="6.415033688s" podCreationTimestamp="2025-12-04 22:22:59 +0000 UTC" firstStartedPulling="2025-12-04 22:23:00.179760892 +0000 UTC m=+243.907286541" lastFinishedPulling="2025-12-04 22:23:04.229915065 +0000 UTC m=+247.957440714" observedRunningTime="2025-12-04 22:23:05.41075006 +0000 UTC m=+249.138275709" watchObservedRunningTime="2025-12-04 22:23:05.415033688 +0000 UTC m=+249.142559337" Dec 04 22:23:05.419075 master-0 kubenswrapper[33572]: I1204 22:23:05.417353 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" event={"ID":"9414862e-1a68-4ed3-8229-f25e6a2ba0fa","Type":"ContainerStarted","Data":"b12b5b8acee8827094ad3078d3b07dc04d490a7ee556b96ef3efdbba423bd6d9"} Dec 04 22:23:05.437571 master-0 kubenswrapper[33572]: I1204 22:23:05.437164 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-54dbc87ccb-bgbjl" podStartSLOduration=2.789183187 podStartE2EDuration="6.43713986s" podCreationTimestamp="2025-12-04 22:22:59 +0000 UTC" 
firstStartedPulling="2025-12-04 22:23:00.608138527 +0000 UTC m=+244.335664176" lastFinishedPulling="2025-12-04 22:23:04.2560952 +0000 UTC m=+247.983620849" observedRunningTime="2025-12-04 22:23:05.432149273 +0000 UTC m=+249.159674922" watchObservedRunningTime="2025-12-04 22:23:05.43713986 +0000 UTC m=+249.164665509" Dec 04 22:23:05.498576 master-0 kubenswrapper[33572]: I1204 22:23:05.498358 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m87cl\" (UniqueName: \"kubernetes.io/projected/4c1b60d6-ca5c-40b5-a567-a48ef956163b-kube-api-access-m87cl\") pod \"downloads-69cd4c69bf-b4qng\" (UID: \"4c1b60d6-ca5c-40b5-a567-a48ef956163b\") " pod="openshift-console/downloads-69cd4c69bf-b4qng" Dec 04 22:23:05.518374 master-0 kubenswrapper[33572]: I1204 22:23:05.518038 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m87cl\" (UniqueName: \"kubernetes.io/projected/4c1b60d6-ca5c-40b5-a567-a48ef956163b-kube-api-access-m87cl\") pod \"downloads-69cd4c69bf-b4qng\" (UID: \"4c1b60d6-ca5c-40b5-a567-a48ef956163b\") " pod="openshift-console/downloads-69cd4c69bf-b4qng" Dec 04 22:23:05.671582 master-0 kubenswrapper[33572]: I1204 22:23:05.670588 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-69cd4c69bf-b4qng" Dec 04 22:23:06.135536 master-0 kubenswrapper[33572]: I1204 22:23:06.135163 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-69cd4c69bf-b4qng"] Dec 04 22:23:06.140077 master-0 kubenswrapper[33572]: W1204 22:23:06.139965 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c1b60d6_ca5c_40b5_a567_a48ef956163b.slice/crio-79b4b2db29a287da6bae9e3e6457efde13f6ab031cf2acf5760b0696a7008a09 WatchSource:0}: Error finding container 79b4b2db29a287da6bae9e3e6457efde13f6ab031cf2acf5760b0696a7008a09: Status 404 returned error can't find the container with id 79b4b2db29a287da6bae9e3e6457efde13f6ab031cf2acf5760b0696a7008a09 Dec 04 22:23:06.432015 master-0 kubenswrapper[33572]: I1204 22:23:06.431868 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-69cd4c69bf-b4qng" event={"ID":"4c1b60d6-ca5c-40b5-a567-a48ef956163b","Type":"ContainerStarted","Data":"79b4b2db29a287da6bae9e3e6457efde13f6ab031cf2acf5760b0696a7008a09"} Dec 04 22:23:06.437925 master-0 kubenswrapper[33572]: I1204 22:23:06.437864 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerStarted","Data":"969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f"} Dec 04 22:23:06.441355 master-0 kubenswrapper[33572]: I1204 22:23:06.440729 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79f5646748-zd47k_0153e881-4d2d-4ff6-9e70-d6163a62970c/telemeter-client/0.log" Dec 04 22:23:06.441355 master-0 kubenswrapper[33572]: I1204 22:23:06.441077 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" event={"ID":"0153e881-4d2d-4ff6-9e70-d6163a62970c","Type":"ContainerStarted","Data":"cc4a2835a4e8f84e9bb0ba6a8f75fdbdd29ec77815f9371e8d44d23d85695393"} Dec 04 22:23:06.481873 master-0 kubenswrapper[33572]: I1204 22:23:06.481673 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=36.554748265 podStartE2EDuration="39.481641169s" podCreationTimestamp="2025-12-04 22:22:27 +0000 UTC" firstStartedPulling="2025-12-04 22:23:01.337338972 +0000 UTC m=+245.064864621" lastFinishedPulling="2025-12-04 22:23:04.264231876 +0000 UTC m=+247.991757525" observedRunningTime="2025-12-04 22:23:06.471206961 +0000 UTC m=+250.198732640" watchObservedRunningTime="2025-12-04 22:23:06.481641169 +0000 UTC m=+250.209166828" Dec 04 22:23:06.507663 master-0 kubenswrapper[33572]: I1204 22:23:06.507563 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-79f5646748-zd47k" podStartSLOduration=66.939609058 podStartE2EDuration="1m10.507533266s" podCreationTimestamp="2025-12-04 22:21:56 +0000 UTC" firstStartedPulling="2025-12-04 22:23:00.688707267 +0000 UTC m=+244.416232916" lastFinishedPulling="2025-12-04 22:23:04.256631465 +0000 UTC m=+247.984157124" observedRunningTime="2025-12-04 22:23:06.503612488 +0000 UTC m=+250.231138137" watchObservedRunningTime="2025-12-04 22:23:06.507533266 +0000 UTC m=+250.235058935" Dec 04 22:23:07.480532 master-0 kubenswrapper[33572]: I1204 22:23:07.479655 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" event={"ID":"9414862e-1a68-4ed3-8229-f25e6a2ba0fa","Type":"ContainerStarted","Data":"fcb25fb59a1136d88601738ccd770f9a51aea29254969d60a7139e980d0e703b"} Dec 04 22:23:07.481301 master-0 kubenswrapper[33572]: I1204 22:23:07.481252 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" Dec 04 22:23:07.501159 master-0 kubenswrapper[33572]: I1204 22:23:07.501104 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" Dec 04 22:23:07.520920 master-0 kubenswrapper[33572]: I1204 22:23:07.520835 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6559dcc668-87vwg" podStartSLOduration=2.4065832240000002 podStartE2EDuration="4.52081333s" podCreationTimestamp="2025-12-04 22:23:03 +0000 UTC" firstStartedPulling="2025-12-04 22:23:04.655102431 +0000 UTC m=+248.382628080" lastFinishedPulling="2025-12-04 22:23:06.769332517 +0000 UTC m=+250.496858186" observedRunningTime="2025-12-04 22:23:07.517233401 +0000 UTC m=+251.244759050" watchObservedRunningTime="2025-12-04 22:23:07.52081333 +0000 UTC m=+251.248338979" Dec 04 22:23:08.955533 master-0 kubenswrapper[33572]: I1204 22:23:08.952932 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-66cdb6df67-9rjf8"] Dec 04 22:23:08.955533 master-0 kubenswrapper[33572]: I1204 22:23:08.954309 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:08.959555 master-0 kubenswrapper[33572]: I1204 22:23:08.958469 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 22:23:08.959555 master-0 kubenswrapper[33572]: I1204 22:23:08.958782 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-d9fhh" Dec 04 22:23:08.959555 master-0 kubenswrapper[33572]: I1204 22:23:08.958940 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 22:23:08.959555 master-0 kubenswrapper[33572]: I1204 22:23:08.959093 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 22:23:08.959555 master-0 kubenswrapper[33572]: I1204 22:23:08.959209 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 22:23:08.959555 master-0 kubenswrapper[33572]: I1204 22:23:08.959330 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 22:23:08.974633 master-0 kubenswrapper[33572]: I1204 22:23:08.973802 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66cdb6df67-9rjf8"] Dec 04 22:23:09.063169 master-0 kubenswrapper[33572]: I1204 22:23:09.063039 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-service-ca\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.063169 master-0 kubenswrapper[33572]: I1204 22:23:09.063174 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-serving-cert\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.063587 master-0 kubenswrapper[33572]: I1204 22:23:09.063376 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-oauth-serving-cert\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.063640 master-0 kubenswrapper[33572]: I1204 22:23:09.063602 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-config\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.063831 master-0 kubenswrapper[33572]: I1204 22:23:09.063803 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-oauth-config\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.064108 master-0 kubenswrapper[33572]: 
I1204 22:23:09.064039 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxkd\" (UniqueName: \"kubernetes.io/projected/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-kube-api-access-2mxkd\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.166437 master-0 kubenswrapper[33572]: I1204 22:23:09.166364 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-oauth-serving-cert\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.166437 master-0 kubenswrapper[33572]: I1204 22:23:09.166458 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-config\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.167797 master-0 kubenswrapper[33572]: I1204 22:23:09.167766 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-config\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.167903 master-0 kubenswrapper[33572]: I1204 22:23:09.167829 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-oauth-config\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.167903 master-0 kubenswrapper[33572]: I1204 22:23:09.167897 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxkd\" (UniqueName: \"kubernetes.io/projected/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-kube-api-access-2mxkd\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.168837 master-0 kubenswrapper[33572]: I1204 22:23:09.167927 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-oauth-serving-cert\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.168837 master-0 kubenswrapper[33572]: I1204 22:23:09.167954 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-service-ca\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.168837 master-0 kubenswrapper[33572]: I1204 22:23:09.168416 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-serving-cert\") pod \"console-66cdb6df67-9rjf8\" (UID: 
\"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.168837 master-0 kubenswrapper[33572]: I1204 22:23:09.168741 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-service-ca\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.173460 master-0 kubenswrapper[33572]: I1204 22:23:09.173405 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-serving-cert\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.177526 master-0 kubenswrapper[33572]: I1204 22:23:09.177459 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-oauth-config\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.194868 master-0 kubenswrapper[33572]: I1204 22:23:09.194797 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxkd\" (UniqueName: \"kubernetes.io/projected/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-kube-api-access-2mxkd\") pod \"console-66cdb6df67-9rjf8\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.282850 master-0 kubenswrapper[33572]: I1204 22:23:09.282658 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:09.759214 master-0 kubenswrapper[33572]: I1204 22:23:09.759147 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66cdb6df67-9rjf8"] Dec 04 22:23:09.760615 master-0 kubenswrapper[33572]: W1204 22:23:09.760480 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c8931a_5d8a_4aec_b2e2_2bc109a2a506.slice/crio-caeebab2094dcd481fbd4af57e5180bc30a1ca4d0aee70966dd06fcaf65d8eef WatchSource:0}: Error finding container caeebab2094dcd481fbd4af57e5180bc30a1ca4d0aee70966dd06fcaf65d8eef: Status 404 returned error can't find the container with id caeebab2094dcd481fbd4af57e5180bc30a1ca4d0aee70966dd06fcaf65d8eef Dec 04 22:23:10.507869 master-0 kubenswrapper[33572]: I1204 22:23:10.507776 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cdb6df67-9rjf8" event={"ID":"68c8931a-5d8a-4aec-b2e2-2bc109a2a506","Type":"ContainerStarted","Data":"caeebab2094dcd481fbd4af57e5180bc30a1ca4d0aee70966dd06fcaf65d8eef"} Dec 04 22:23:11.972489 master-0 kubenswrapper[33572]: I1204 22:23:11.972413 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-744594955b-qspk5"] Dec 04 22:23:11.977944 master-0 kubenswrapper[33572]: I1204 22:23:11.974052 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:11.986867 master-0 kubenswrapper[33572]: I1204 22:23:11.986817 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 22:23:11.990445 master-0 kubenswrapper[33572]: I1204 22:23:11.990399 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-744594955b-qspk5"] Dec 04 22:23:12.042994 master-0 kubenswrapper[33572]: I1204 22:23:12.042897 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7njp\" (UniqueName: \"kubernetes.io/projected/502c032f-d279-41ec-ac38-16bf4b9b1950-kube-api-access-l7njp\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.043352 master-0 kubenswrapper[33572]: I1204 22:23:12.043181 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-service-ca\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.043352 master-0 kubenswrapper[33572]: I1204 22:23:12.043295 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-trusted-ca-bundle\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.043352 master-0 kubenswrapper[33572]: I1204 22:23:12.043326 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-oauth-serving-cert\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.043517 master-0 kubenswrapper[33572]: I1204 22:23:12.043455 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-serving-cert\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.043654 master-0 kubenswrapper[33572]: I1204 22:23:12.043618 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-oauth-config\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.043710 master-0 kubenswrapper[33572]: I1204 22:23:12.043689 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-console-config\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.148415 master-0 kubenswrapper[33572]: I1204 22:23:12.148372 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-serving-cert\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.148755 master-0 kubenswrapper[33572]: I1204 22:23:12.148711 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-oauth-config\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.148847 master-0 kubenswrapper[33572]: I1204 22:23:12.148831 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-console-config\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.149020 master-0 kubenswrapper[33572]: I1204 22:23:12.148996 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7njp\" (UniqueName: \"kubernetes.io/projected/502c032f-d279-41ec-ac38-16bf4b9b1950-kube-api-access-l7njp\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.149098 master-0 kubenswrapper[33572]: I1204 22:23:12.149082 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-service-ca\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.149176 master-0 kubenswrapper[33572]: I1204 22:23:12.149158 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-trusted-ca-bundle\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.149294 master-0 kubenswrapper[33572]: I1204 22:23:12.149198 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-oauth-serving-cert\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.150004 master-0 kubenswrapper[33572]: I1204 22:23:12.149943 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-console-config\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.150172 master-0 kubenswrapper[33572]: I1204 22:23:12.150133 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-service-ca\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 
22:23:12.150813 master-0 kubenswrapper[33572]: I1204 22:23:12.150784 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-oauth-serving-cert\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.151319 master-0 kubenswrapper[33572]: I1204 22:23:12.151276 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-trusted-ca-bundle\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.153050 master-0 kubenswrapper[33572]: I1204 22:23:12.152992 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-serving-cert\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.153115 master-0 kubenswrapper[33572]: I1204 22:23:12.153002 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-oauth-config\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.168452 master-0 kubenswrapper[33572]: I1204 22:23:12.168418 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7njp\" (UniqueName: \"kubernetes.io/projected/502c032f-d279-41ec-ac38-16bf4b9b1950-kube-api-access-l7njp\") pod \"console-744594955b-qspk5\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:12.313003 master-0 kubenswrapper[33572]: I1204 22:23:12.312907 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:17.577018 master-0 kubenswrapper[33572]: I1204 22:23:17.576891 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-744594955b-qspk5"] Dec 04 22:23:18.606383 master-0 kubenswrapper[33572]: I1204 22:23:18.606294 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cdb6df67-9rjf8" event={"ID":"68c8931a-5d8a-4aec-b2e2-2bc109a2a506","Type":"ContainerStarted","Data":"76ba06ad81dad534ad3579c774e0c7eb59c406d88f07c3fb93358c1f334cacfb"} Dec 04 22:23:18.609944 master-0 kubenswrapper[33572]: I1204 22:23:18.609902 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744594955b-qspk5" event={"ID":"502c032f-d279-41ec-ac38-16bf4b9b1950","Type":"ContainerStarted","Data":"960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1"} Dec 04 22:23:18.609944 master-0 kubenswrapper[33572]: I1204 22:23:18.609946 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744594955b-qspk5" event={"ID":"502c032f-d279-41ec-ac38-16bf4b9b1950","Type":"ContainerStarted","Data":"32914af44db102fd411cf24611da49ac9c5321501bc0160bc563aa53daf0132f"} Dec 04 22:23:18.632885 master-0 kubenswrapper[33572]: I1204 22:23:18.632785 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66cdb6df67-9rjf8" podStartSLOduration=2.726152215 podStartE2EDuration="10.632755619s" podCreationTimestamp="2025-12-04 22:23:08 +0000 UTC" firstStartedPulling="2025-12-04 22:23:09.762959659 +0000 UTC m=+253.490485308" lastFinishedPulling="2025-12-04 22:23:17.669563063 +0000 UTC m=+261.397088712" observedRunningTime="2025-12-04 22:23:18.628711397 +0000 UTC m=+262.356237056" watchObservedRunningTime="2025-12-04 22:23:18.632755619 +0000 UTC m=+262.360281288" Dec 04 22:23:18.656232 master-0 kubenswrapper[33572]: I1204 22:23:18.656124 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-744594955b-qspk5" podStartSLOduration=7.656101546 podStartE2EDuration="7.656101546s" podCreationTimestamp="2025-12-04 22:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:23:18.654459971 +0000 UTC m=+262.381985630" watchObservedRunningTime="2025-12-04 22:23:18.656101546 +0000 UTC m=+262.383627195" Dec 04 22:23:19.283004 master-0 kubenswrapper[33572]: I1204 22:23:19.282921 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:19.283293 master-0 kubenswrapper[33572]: I1204 22:23:19.283073 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:19.290272 master-0 kubenswrapper[33572]: I1204 22:23:19.289937 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:19.620021 master-0 kubenswrapper[33572]: I1204 22:23:19.619917 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:22.314209 master-0 kubenswrapper[33572]: I1204 22:23:22.314138 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:22.314209 master-0 kubenswrapper[33572]: I1204 22:23:22.314210 
33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:22.319137 master-0 kubenswrapper[33572]: I1204 22:23:22.319082 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:22.652160 master-0 kubenswrapper[33572]: I1204 22:23:22.652065 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-744594955b-qspk5" Dec 04 22:23:22.727902 master-0 kubenswrapper[33572]: I1204 22:23:22.727703 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66cdb6df67-9rjf8"] Dec 04 22:23:29.125555 master-0 kubenswrapper[33572]: I1204 22:23:29.125460 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5dd7b479dd-5z246"] Dec 04 22:23:42.842435 master-0 kubenswrapper[33572]: I1204 22:23:42.842364 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-69cd4c69bf-b4qng" event={"ID":"4c1b60d6-ca5c-40b5-a567-a48ef956163b","Type":"ContainerStarted","Data":"25c3c44268394477a042b4a48d34210cb3113a8f47f458e95d6a5812d670aabb"} Dec 04 22:23:42.846669 master-0 kubenswrapper[33572]: I1204 22:23:42.846046 33572 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-b4qng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" start-of-body= Dec 04 22:23:42.846769 master-0 kubenswrapper[33572]: I1204 22:23:42.846620 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-69cd4c69bf-b4qng" Dec 04 22:23:42.846822 master-0 kubenswrapper[33572]: I1204 22:23:42.846726 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-69cd4c69bf-b4qng" podUID="4c1b60d6-ca5c-40b5-a567-a48ef956163b" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" Dec 04 22:23:43.851534 master-0 kubenswrapper[33572]: I1204 22:23:43.851463 33572 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-b4qng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" start-of-body= Dec 04 22:23:43.852150 master-0 kubenswrapper[33572]: I1204 22:23:43.851541 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-69cd4c69bf-b4qng" podUID="4c1b60d6-ca5c-40b5-a567-a48ef956163b" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" Dec 04 22:23:44.645608 master-0 kubenswrapper[33572]: I1204 22:23:44.645429 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-69cd4c69bf-b4qng" podStartSLOduration=3.897382535 podStartE2EDuration="39.645347535s" podCreationTimestamp="2025-12-04 22:23:05 +0000 UTC" firstStartedPulling="2025-12-04 22:23:06.143408061 +0000 UTC m=+249.870933710" lastFinishedPulling="2025-12-04 22:23:41.891373061 +0000 UTC m=+285.618898710" observedRunningTime="2025-12-04 22:23:44.165924157 +0000 UTC m=+287.893449846" watchObservedRunningTime="2025-12-04 22:23:44.645347535 +0000 UTC m=+288.372873214" Dec 04 
22:23:44.869777 master-0 kubenswrapper[33572]: I1204 22:23:44.869691 33572 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-b4qng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" start-of-body= Dec 04 22:23:44.870450 master-0 kubenswrapper[33572]: I1204 22:23:44.869797 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-69cd4c69bf-b4qng" podUID="4c1b60d6-ca5c-40b5-a567-a48ef956163b" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" Dec 04 22:23:45.673263 master-0 kubenswrapper[33572]: I1204 22:23:45.673183 33572 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-b4qng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" start-of-body= Dec 04 22:23:45.673263 master-0 kubenswrapper[33572]: I1204 22:23:45.673234 33572 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-b4qng container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" start-of-body= Dec 04 22:23:45.673616 master-0 kubenswrapper[33572]: I1204 22:23:45.673270 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-69cd4c69bf-b4qng" podUID="4c1b60d6-ca5c-40b5-a567-a48ef956163b" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" Dec 04 22:23:45.673616 master-0 kubenswrapper[33572]: I1204 22:23:45.673316 33572 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-69cd4c69bf-b4qng" podUID="4c1b60d6-ca5c-40b5-a567-a48ef956163b" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.96:8080/\": dial tcp 10.128.0.96:8080: connect: connection refused" Dec 04 22:23:47.785752 master-0 kubenswrapper[33572]: I1204 22:23:47.785656 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-66cdb6df67-9rjf8" podUID="68c8931a-5d8a-4aec-b2e2-2bc109a2a506" containerName="console" containerID="cri-o://76ba06ad81dad534ad3579c774e0c7eb59c406d88f07c3fb93358c1f334cacfb" gracePeriod=15 Dec 04 22:23:48.904243 master-0 kubenswrapper[33572]: I1204 22:23:48.904159 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66cdb6df67-9rjf8_68c8931a-5d8a-4aec-b2e2-2bc109a2a506/console/0.log" Dec 04 22:23:48.904243 master-0 kubenswrapper[33572]: I1204 22:23:48.904234 33572 generic.go:334] "Generic (PLEG): container finished" podID="68c8931a-5d8a-4aec-b2e2-2bc109a2a506" containerID="76ba06ad81dad534ad3579c774e0c7eb59c406d88f07c3fb93358c1f334cacfb" exitCode=2 Dec 04 22:23:48.905191 master-0 kubenswrapper[33572]: I1204 22:23:48.904277 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cdb6df67-9rjf8" event={"ID":"68c8931a-5d8a-4aec-b2e2-2bc109a2a506","Type":"ContainerDied","Data":"76ba06ad81dad534ad3579c774e0c7eb59c406d88f07c3fb93358c1f334cacfb"} Dec 04 22:23:49.284630 master-0 kubenswrapper[33572]: I1204 22:23:49.284490 33572 patch_prober.go:28] interesting pod/console-66cdb6df67-9rjf8 container/console 
namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Dec 04 22:23:49.284973 master-0 kubenswrapper[33572]: I1204 22:23:49.284664 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-66cdb6df67-9rjf8" podUID="68c8931a-5d8a-4aec-b2e2-2bc109a2a506" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Dec 04 22:23:51.133287 master-0 kubenswrapper[33572]: I1204 22:23:51.133226 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66cdb6df67-9rjf8_68c8931a-5d8a-4aec-b2e2-2bc109a2a506/console/0.log" Dec 04 22:23:51.133875 master-0 kubenswrapper[33572]: I1204 22:23:51.133330 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:51.189379 master-0 kubenswrapper[33572]: I1204 22:23:51.188327 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-oauth-serving-cert\") pod \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " Dec 04 22:23:51.189379 master-0 kubenswrapper[33572]: I1204 22:23:51.188455 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mxkd\" (UniqueName: \"kubernetes.io/projected/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-kube-api-access-2mxkd\") pod \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " Dec 04 22:23:51.189379 master-0 kubenswrapper[33572]: I1204 22:23:51.188574 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-config\") pod \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " Dec 04 22:23:51.189379 master-0 kubenswrapper[33572]: I1204 22:23:51.188681 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-service-ca\") pod \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " Dec 04 22:23:51.189379 master-0 kubenswrapper[33572]: I1204 22:23:51.188825 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-serving-cert\") pod \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " Dec 04 22:23:51.189379 master-0 kubenswrapper[33572]: I1204 22:23:51.188873 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-oauth-config\") pod \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\" (UID: \"68c8931a-5d8a-4aec-b2e2-2bc109a2a506\") " Dec 04 22:23:51.189379 master-0 kubenswrapper[33572]: I1204 22:23:51.189141 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"68c8931a-5d8a-4aec-b2e2-2bc109a2a506" (UID: "68c8931a-5d8a-4aec-b2e2-2bc109a2a506"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:23:51.189887 master-0 kubenswrapper[33572]: I1204 22:23:51.189697 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-service-ca" (OuterVolumeSpecName: "service-ca") pod "68c8931a-5d8a-4aec-b2e2-2bc109a2a506" (UID: "68c8931a-5d8a-4aec-b2e2-2bc109a2a506"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:23:51.189937 master-0 kubenswrapper[33572]: I1204 22:23:51.189865 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-config" (OuterVolumeSpecName: "console-config") pod "68c8931a-5d8a-4aec-b2e2-2bc109a2a506" (UID: "68c8931a-5d8a-4aec-b2e2-2bc109a2a506"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:23:51.190429 master-0 kubenswrapper[33572]: I1204 22:23:51.190387 33572 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:51.190429 master-0 kubenswrapper[33572]: I1204 22:23:51.190418 33572 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:51.190580 master-0 kubenswrapper[33572]: I1204 22:23:51.190435 33572 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:51.192738 master-0 kubenswrapper[33572]: I1204 22:23:51.192696 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-kube-api-access-2mxkd" (OuterVolumeSpecName: "kube-api-access-2mxkd") pod "68c8931a-5d8a-4aec-b2e2-2bc109a2a506" (UID: "68c8931a-5d8a-4aec-b2e2-2bc109a2a506"). InnerVolumeSpecName "kube-api-access-2mxkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:23:51.192835 master-0 kubenswrapper[33572]: I1204 22:23:51.192797 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "68c8931a-5d8a-4aec-b2e2-2bc109a2a506" (UID: "68c8931a-5d8a-4aec-b2e2-2bc109a2a506"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:51.194610 master-0 kubenswrapper[33572]: I1204 22:23:51.194492 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "68c8931a-5d8a-4aec-b2e2-2bc109a2a506" (UID: "68c8931a-5d8a-4aec-b2e2-2bc109a2a506"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:51.291761 master-0 kubenswrapper[33572]: I1204 22:23:51.291588 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mxkd\" (UniqueName: \"kubernetes.io/projected/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-kube-api-access-2mxkd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:51.291761 master-0 kubenswrapper[33572]: I1204 22:23:51.291641 33572 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:51.291761 master-0 kubenswrapper[33572]: I1204 22:23:51.291653 33572 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/68c8931a-5d8a-4aec-b2e2-2bc109a2a506-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:51.939184 master-0 kubenswrapper[33572]: I1204 22:23:51.939095 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66cdb6df67-9rjf8_68c8931a-5d8a-4aec-b2e2-2bc109a2a506/console/0.log" Dec 04 22:23:51.939643 master-0 kubenswrapper[33572]: I1204 22:23:51.939317 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66cdb6df67-9rjf8" event={"ID":"68c8931a-5d8a-4aec-b2e2-2bc109a2a506","Type":"ContainerDied","Data":"caeebab2094dcd481fbd4af57e5180bc30a1ca4d0aee70966dd06fcaf65d8eef"} Dec 04 22:23:51.939643 master-0 kubenswrapper[33572]: I1204 22:23:51.939381 33572 scope.go:117] "RemoveContainer" containerID="76ba06ad81dad534ad3579c774e0c7eb59c406d88f07c3fb93358c1f334cacfb" Dec 04 22:23:51.939643 master-0 kubenswrapper[33572]: I1204 22:23:51.939537 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66cdb6df67-9rjf8" Dec 04 22:23:53.289472 master-0 kubenswrapper[33572]: I1204 22:23:53.289371 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:23:53.334015 master-0 kubenswrapper[33572]: I1204 22:23:53.333935 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:23:53.998965 master-0 kubenswrapper[33572]: I1204 22:23:53.998881 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:23:54.163467 master-0 kubenswrapper[33572]: I1204 22:23:54.163285 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" podUID="d685460a-9a74-411d-983a-79af235b2cc0" containerName="oauth-openshift" containerID="cri-o://9496356901a9f29e38cb4b8ee6ac1abc544307a1b801d9b2ead4cd1140645187" gracePeriod=15 Dec 04 22:23:55.691843 master-0 kubenswrapper[33572]: I1204 22:23:55.691775 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-69cd4c69bf-b4qng" Dec 04 22:23:56.379360 master-0 kubenswrapper[33572]: I1204 22:23:56.379216 33572 patch_prober.go:28] interesting pod/oauth-openshift-5dd7b479dd-5z246 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.87:6443/healthz\": dial tcp 10.128.0.87:6443: connect: connection refused" start-of-body= Dec 04 22:23:56.379657 master-0 kubenswrapper[33572]: I1204 22:23:56.379371 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" podUID="d685460a-9a74-411d-983a-79af235b2cc0" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.87:6443/healthz\": dial tcp 10.128.0.87:6443: connect: connection refused" Dec 04 22:23:56.492622 master-0 kubenswrapper[33572]: I1204 22:23:56.492229 33572 kubelet.go:1505] "Image garbage collection succeeded" Dec 04 22:23:57.011351 master-0 kubenswrapper[33572]: I1204 22:23:57.011130 33572 generic.go:334] "Generic (PLEG): container finished" podID="d685460a-9a74-411d-983a-79af235b2cc0" containerID="9496356901a9f29e38cb4b8ee6ac1abc544307a1b801d9b2ead4cd1140645187" exitCode=0 Dec 04 22:23:57.011351 master-0 kubenswrapper[33572]: I1204 22:23:57.011222 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" event={"ID":"d685460a-9a74-411d-983a-79af235b2cc0","Type":"ContainerDied","Data":"9496356901a9f29e38cb4b8ee6ac1abc544307a1b801d9b2ead4cd1140645187"} Dec 04 22:23:57.211436 master-0 kubenswrapper[33572]: I1204 22:23:57.211332 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66cdb6df67-9rjf8"] Dec 04 22:23:57.252002 master-0 kubenswrapper[33572]: I1204 22:23:57.251887 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66cdb6df67-9rjf8"] Dec 04 22:23:57.470061 master-0 kubenswrapper[33572]: I1204 22:23:57.469982 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:23:57.518401 master-0 kubenswrapper[33572]: I1204 22:23:57.518297 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-router-certs\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.518727 master-0 kubenswrapper[33572]: I1204 22:23:57.518430 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-cliconfig\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.518727 master-0 kubenswrapper[33572]: I1204 22:23:57.518469 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-login\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.518727 master-0 kubenswrapper[33572]: I1204 22:23:57.518523 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-serving-cert\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.518727 master-0 kubenswrapper[33572]: I1204 22:23:57.518556 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8s2j\" (UniqueName: \"kubernetes.io/projected/d685460a-9a74-411d-983a-79af235b2cc0-kube-api-access-l8s2j\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.518727 master-0 kubenswrapper[33572]: I1204 22:23:57.518603 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-trusted-ca-bundle\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.518990 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-error\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519039 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-provider-selection\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519067 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/d685460a-9a74-411d-983a-79af235b2cc0-audit-dir\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519108 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-audit-policies\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519143 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-ocp-branding-template\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519199 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d685460a-9a74-411d-983a-79af235b2cc0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519227 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-session\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519233 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519277 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-service-ca\") pod \"d685460a-9a74-411d-983a-79af235b2cc0\" (UID: \"d685460a-9a74-411d-983a-79af235b2cc0\") " Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519364 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519713 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519730 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519741 33572 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d685460a-9a74-411d-983a-79af235b2cc0-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.519785 master-0 kubenswrapper[33572]: I1204 22:23:57.519739 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:23:57.520478 master-0 kubenswrapper[33572]: I1204 22:23:57.520390 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:23:57.522962 master-0 kubenswrapper[33572]: I1204 22:23:57.522898 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:57.523149 master-0 kubenswrapper[33572]: I1204 22:23:57.523116 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:57.523366 master-0 kubenswrapper[33572]: I1204 22:23:57.523314 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:57.523417 master-0 kubenswrapper[33572]: I1204 22:23:57.523293 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d685460a-9a74-411d-983a-79af235b2cc0-kube-api-access-l8s2j" (OuterVolumeSpecName: "kube-api-access-l8s2j") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "kube-api-access-l8s2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:23:57.523682 master-0 kubenswrapper[33572]: I1204 22:23:57.523594 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:57.523752 master-0 kubenswrapper[33572]: I1204 22:23:57.523715 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:57.524425 master-0 kubenswrapper[33572]: I1204 22:23:57.524330 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:57.525194 master-0 kubenswrapper[33572]: I1204 22:23:57.525150 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d685460a-9a74-411d-983a-79af235b2cc0" (UID: "d685460a-9a74-411d-983a-79af235b2cc0"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:23:57.621140 master-0 kubenswrapper[33572]: I1204 22:23:57.621050 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621140 master-0 kubenswrapper[33572]: I1204 22:23:57.621104 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621140 master-0 kubenswrapper[33572]: I1204 22:23:57.621115 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8s2j\" (UniqueName: \"kubernetes.io/projected/d685460a-9a74-411d-983a-79af235b2cc0-kube-api-access-l8s2j\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621140 master-0 kubenswrapper[33572]: I1204 22:23:57.621126 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621739 master-0 kubenswrapper[33572]: I1204 22:23:57.621180 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621739 master-0 kubenswrapper[33572]: I1204 22:23:57.621241 33572 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-audit-policies\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621739 master-0 kubenswrapper[33572]: I1204 22:23:57.621265 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621739 master-0 kubenswrapper[33572]: I1204 22:23:57.621289 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621739 master-0 kubenswrapper[33572]: I1204 22:23:57.621311 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:57.621739 master-0 kubenswrapper[33572]: I1204 22:23:57.621331 33572 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d685460a-9a74-411d-983a-79af235b2cc0-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:23:58.021863 master-0 kubenswrapper[33572]: I1204 22:23:58.021780 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" 
event={"ID":"d685460a-9a74-411d-983a-79af235b2cc0","Type":"ContainerDied","Data":"8c6f9f0ddb838fa2eccda391b140ed7bd936b5b53875b4276b92e13799a7b546"} Dec 04 22:23:58.021863 master-0 kubenswrapper[33572]: I1204 22:23:58.021859 33572 scope.go:117] "RemoveContainer" containerID="9496356901a9f29e38cb4b8ee6ac1abc544307a1b801d9b2ead4cd1140645187" Dec 04 22:23:58.022440 master-0 kubenswrapper[33572]: I1204 22:23:58.021874 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5dd7b479dd-5z246" Dec 04 22:23:58.541761 master-0 kubenswrapper[33572]: I1204 22:23:58.541657 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c8931a-5d8a-4aec-b2e2-2bc109a2a506" path="/var/lib/kubelet/pods/68c8931a-5d8a-4aec-b2e2-2bc109a2a506/volumes" Dec 04 22:23:58.818132 master-0 kubenswrapper[33572]: I1204 22:23:58.817954 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6cfff4b945-wlg4k"] Dec 04 22:23:58.818482 master-0 kubenswrapper[33572]: E1204 22:23:58.818418 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d685460a-9a74-411d-983a-79af235b2cc0" containerName="oauth-openshift" Dec 04 22:23:58.818482 master-0 kubenswrapper[33572]: I1204 22:23:58.818447 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d685460a-9a74-411d-983a-79af235b2cc0" containerName="oauth-openshift" Dec 04 22:23:58.818740 master-0 kubenswrapper[33572]: E1204 22:23:58.818494 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c8931a-5d8a-4aec-b2e2-2bc109a2a506" containerName="console" Dec 04 22:23:58.818740 master-0 kubenswrapper[33572]: I1204 22:23:58.818532 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c8931a-5d8a-4aec-b2e2-2bc109a2a506" containerName="console" Dec 04 22:23:58.818884 master-0 kubenswrapper[33572]: I1204 22:23:58.818758 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="d685460a-9a74-411d-983a-79af235b2cc0" containerName="oauth-openshift" Dec 04 22:23:58.818884 master-0 kubenswrapper[33572]: I1204 22:23:58.818833 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c8931a-5d8a-4aec-b2e2-2bc109a2a506" containerName="console" Dec 04 22:23:58.819672 master-0 kubenswrapper[33572]: I1204 22:23:58.819620 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.824538 master-0 kubenswrapper[33572]: I1204 22:23:58.823004 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 22:23:58.824538 master-0 kubenswrapper[33572]: I1204 22:23:58.823287 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 22:23:58.828528 master-0 kubenswrapper[33572]: I1204 22:23:58.827804 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 22:23:58.828528 master-0 kubenswrapper[33572]: I1204 22:23:58.828054 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 22:23:58.828528 master-0 kubenswrapper[33572]: I1204 22:23:58.828333 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 22:23:58.828528 master-0 kubenswrapper[33572]: I1204 22:23:58.828464 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 22:23:58.828800 master-0 kubenswrapper[33572]: I1204 22:23:58.828582 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 22:23:58.828800 master-0 kubenswrapper[33572]: I1204 22:23:58.828610 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 22:23:58.828800 master-0 kubenswrapper[33572]: I1204 22:23:58.828704 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 22:23:58.831492 master-0 kubenswrapper[33572]: I1204 22:23:58.831423 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-qjgpp" Dec 04 22:23:58.831767 master-0 kubenswrapper[33572]: I1204 22:23:58.831706 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 22:23:58.833842 master-0 kubenswrapper[33572]: I1204 22:23:58.833736 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 22:23:58.835560 master-0 kubenswrapper[33572]: I1204 22:23:58.835434 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5dd7b479dd-5z246"] Dec 04 22:23:58.844086 master-0 kubenswrapper[33572]: I1204 22:23:58.844021 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 22:23:58.845485 master-0 kubenswrapper[33572]: I1204 22:23:58.845431 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-5dd7b479dd-5z246"] Dec 04 22:23:58.846494 master-0 kubenswrapper[33572]: I1204 22:23:58.846452 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 22:23:58.851036 master-0 kubenswrapper[33572]: I1204 22:23:58.850993 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cfff4b945-wlg4k"] Dec 04 
22:23:58.951368 master-0 kubenswrapper[33572]: I1204 22:23:58.951285 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f9552e4-fda9-4207-a4ed-a0486fd1018e-audit-dir\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.951643 master-0 kubenswrapper[33572]: I1204 22:23:58.951458 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.951687 master-0 kubenswrapper[33572]: I1204 22:23:58.951652 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-error\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.951724 master-0 kubenswrapper[33572]: I1204 22:23:58.951712 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.951871 master-0 kubenswrapper[33572]: I1204 22:23:58.951768 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-audit-policies\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.951871 master-0 kubenswrapper[33572]: I1204 22:23:58.951853 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-session\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.952274 master-0 kubenswrapper[33572]: I1204 22:23:58.952196 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cl24\" (UniqueName: \"kubernetes.io/projected/1f9552e4-fda9-4207-a4ed-a0486fd1018e-kube-api-access-6cl24\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.952365 master-0 kubenswrapper[33572]: I1204 22:23:58.952330 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.952470 master-0 kubenswrapper[33572]: I1204 22:23:58.952406 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-login\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.952470 master-0 kubenswrapper[33572]: I1204 22:23:58.952459 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.952638 master-0 kubenswrapper[33572]: I1204 22:23:58.952596 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.952705 master-0 kubenswrapper[33572]: I1204 22:23:58.952687 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:58.952905 master-0 kubenswrapper[33572]: I1204 22:23:58.952867 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.054731 master-0 kubenswrapper[33572]: I1204 22:23:59.054610 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cl24\" (UniqueName: \"kubernetes.io/projected/1f9552e4-fda9-4207-a4ed-a0486fd1018e-kube-api-access-6cl24\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.054731 master-0 kubenswrapper[33572]: I1204 22:23:59.054706 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " 
pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.055611 master-0 kubenswrapper[33572]: I1204 22:23:59.055093 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-login\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.055611 master-0 kubenswrapper[33572]: I1204 22:23:59.055470 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.055714 master-0 kubenswrapper[33572]: I1204 22:23:59.055643 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.055822 master-0 kubenswrapper[33572]: I1204 22:23:59.055775 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.056068 master-0 kubenswrapper[33572]: I1204 22:23:59.056032 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.056191 master-0 kubenswrapper[33572]: I1204 22:23:59.056157 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.056306 master-0 kubenswrapper[33572]: I1204 22:23:59.056195 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-error\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.057384 master-0 kubenswrapper[33572]: I1204 22:23:59.056449 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f9552e4-fda9-4207-a4ed-a0486fd1018e-audit-dir\") pod 
\"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.057384 master-0 kubenswrapper[33572]: I1204 22:23:59.056567 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.057384 master-0 kubenswrapper[33572]: I1204 22:23:59.056626 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-audit-policies\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.057384 master-0 kubenswrapper[33572]: I1204 22:23:59.056692 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-session\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.057384 master-0 kubenswrapper[33572]: I1204 22:23:59.056870 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-service-ca\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.057384 master-0 kubenswrapper[33572]: I1204 22:23:59.057199 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.057384 master-0 kubenswrapper[33572]: I1204 22:23:59.057311 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f9552e4-fda9-4207-a4ed-a0486fd1018e-audit-dir\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.057997 master-0 kubenswrapper[33572]: I1204 22:23:59.057667 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.058078 master-0 kubenswrapper[33572]: I1204 22:23:59.058000 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f9552e4-fda9-4207-a4ed-a0486fd1018e-audit-policies\") pod 
\"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.059995 master-0 kubenswrapper[33572]: I1204 22:23:59.059955 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-login\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.061094 master-0 kubenswrapper[33572]: I1204 22:23:59.061040 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-session\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.061094 master-0 kubenswrapper[33572]: I1204 22:23:59.061082 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-error\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.061268 master-0 kubenswrapper[33572]: I1204 22:23:59.061222 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.061709 master-0 kubenswrapper[33572]: I1204 22:23:59.061675 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-router-certs\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.062275 master-0 kubenswrapper[33572]: I1204 22:23:59.062233 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:23:59.062930 master-0 kubenswrapper[33572]: I1204 22:23:59.062907 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f9552e4-fda9-4207-a4ed-a0486fd1018e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:24:00.538135 master-0 kubenswrapper[33572]: I1204 22:24:00.538049 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d685460a-9a74-411d-983a-79af235b2cc0" 
path="/var/lib/kubelet/pods/d685460a-9a74-411d-983a-79af235b2cc0/volumes" Dec 04 22:24:04.766710 master-0 kubenswrapper[33572]: I1204 22:24:04.766621 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cl24\" (UniqueName: \"kubernetes.io/projected/1f9552e4-fda9-4207-a4ed-a0486fd1018e-kube-api-access-6cl24\") pod \"oauth-openshift-6cfff4b945-wlg4k\" (UID: \"1f9552e4-fda9-4207-a4ed-a0486fd1018e\") " pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:24:04.852648 master-0 kubenswrapper[33572]: I1204 22:24:04.852543 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:24:08.691561 master-0 kubenswrapper[33572]: W1204 22:24:08.690810 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f9552e4_fda9_4207_a4ed_a0486fd1018e.slice/crio-4120e86e83a0dd6c0120ee8ff3b4c77ecee83eb759478be6e8557260b9564b3e WatchSource:0}: Error finding container 4120e86e83a0dd6c0120ee8ff3b4c77ecee83eb759478be6e8557260b9564b3e: Status 404 returned error can't find the container with id 4120e86e83a0dd6c0120ee8ff3b4c77ecee83eb759478be6e8557260b9564b3e Dec 04 22:24:09.137188 master-0 kubenswrapper[33572]: I1204 22:24:09.137089 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" event={"ID":"1f9552e4-fda9-4207-a4ed-a0486fd1018e","Type":"ContainerStarted","Data":"4120e86e83a0dd6c0120ee8ff3b4c77ecee83eb759478be6e8557260b9564b3e"} Dec 04 22:24:09.494600 master-0 kubenswrapper[33572]: I1204 22:24:09.493690 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6cfff4b945-wlg4k"] Dec 04 22:24:11.159120 master-0 kubenswrapper[33572]: I1204 22:24:11.159002 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" event={"ID":"1f9552e4-fda9-4207-a4ed-a0486fd1018e","Type":"ContainerStarted","Data":"d16c11a94fcf4c951f28f70fa0ea0db5adec1f12b608a48e71c4d6bbcc1b0361"} Dec 04 22:24:12.179493 master-0 kubenswrapper[33572]: I1204 22:24:12.179389 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:24:12.608072 master-0 kubenswrapper[33572]: I1204 22:24:12.607973 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" podStartSLOduration=43.607952975 podStartE2EDuration="43.607952975s" podCreationTimestamp="2025-12-04 22:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:24:12.599329716 +0000 UTC m=+316.326855375" watchObservedRunningTime="2025-12-04 22:24:12.607952975 +0000 UTC m=+316.335478634" Dec 04 22:24:12.659373 master-0 kubenswrapper[33572]: I1204 22:24:12.659323 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6cfff4b945-wlg4k" Dec 04 22:24:12.892436 master-0 kubenswrapper[33572]: I1204 22:24:12.891636 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86785576d9-t7jrz"] Dec 04 22:24:12.892436 master-0 kubenswrapper[33572]: I1204 22:24:12.891865 33572 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" containerID="cri-o://4c72e9186536692b281fe4e714a6c8b9d1a2250b3c26e8d8330699b4c2ec401d" gracePeriod=30 Dec 04 22:24:12.907339 master-0 kubenswrapper[33572]: I1204 22:24:12.907200 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg"] Dec 04 22:24:12.907878 master-0 kubenswrapper[33572]: I1204 22:24:12.907831 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" containerID="cri-o://f60afa3e3fc7300e00d2058f8d9e1e5f1ef0102a3ee02b211373a7e4b06ab2d2" gracePeriod=30 Dec 04 22:24:13.207100 master-0 kubenswrapper[33572]: I1204 22:24:13.206941 33572 generic.go:334] "Generic (PLEG): container finished" podID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerID="4c72e9186536692b281fe4e714a6c8b9d1a2250b3c26e8d8330699b4c2ec401d" exitCode=0 Dec 04 22:24:13.207100 master-0 kubenswrapper[33572]: I1204 22:24:13.207018 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerDied","Data":"4c72e9186536692b281fe4e714a6c8b9d1a2250b3c26e8d8330699b4c2ec401d"} Dec 04 22:24:13.207100 master-0 kubenswrapper[33572]: I1204 22:24:13.207065 33572 scope.go:117] "RemoveContainer" containerID="1bd03b6d56dba3556ff5faee83fe97db0ea1194b6ba6e4b1aac4ae2f4b0e67d8" Dec 04 22:24:13.228681 master-0 kubenswrapper[33572]: I1204 22:24:13.228620 33572 generic.go:334] "Generic (PLEG): container finished" podID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerID="f60afa3e3fc7300e00d2058f8d9e1e5f1ef0102a3ee02b211373a7e4b06ab2d2" exitCode=0 Dec 04 22:24:13.229316 master-0 kubenswrapper[33572]: I1204 22:24:13.229268 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerDied","Data":"f60afa3e3fc7300e00d2058f8d9e1e5f1ef0102a3ee02b211373a7e4b06ab2d2"} Dec 04 22:24:13.245164 master-0 kubenswrapper[33572]: I1204 22:24:13.245120 33572 scope.go:117] "RemoveContainer" containerID="577801a549fb8e8b80c730f6cb1e1c0076264ab71677f9e7afd6abe1e4f77036" Dec 04 22:24:13.272774 master-0 kubenswrapper[33572]: I1204 22:24:13.272254 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-55894b577f-c58wv"] Dec 04 22:24:13.278068 master-0 kubenswrapper[33572]: I1204 22:24:13.276972 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.311299 master-0 kubenswrapper[33572]: I1204 22:24:13.310144 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55894b577f-c58wv"] Dec 04 22:24:13.340418 master-0 kubenswrapper[33572]: I1204 22:24:13.340367 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-oauth-serving-cert\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.340553 master-0 kubenswrapper[33572]: I1204 22:24:13.340442 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8d4f\" (UniqueName: \"kubernetes.io/projected/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-kube-api-access-g8d4f\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.340553 master-0 kubenswrapper[33572]: I1204 22:24:13.340472 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-config\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.340633 master-0 kubenswrapper[33572]: I1204 22:24:13.340574 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-service-ca\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.340787 master-0 kubenswrapper[33572]: I1204 22:24:13.340762 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-serving-cert\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.340841 master-0 kubenswrapper[33572]: I1204 22:24:13.340823 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-trusted-ca-bundle\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.340976 master-0 kubenswrapper[33572]: I1204 22:24:13.340951 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-oauth-config\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.394561 master-0 kubenswrapper[33572]: I1204 22:24:13.393430 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55894b577f-c58wv"] Dec 04 22:24:13.394561 master-0 kubenswrapper[33572]: E1204 22:24:13.394007 33572 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[console-config console-oauth-config console-serving-cert kube-api-access-g8d4f oauth-serving-cert service-ca trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-console/console-55894b577f-c58wv" podUID="2f72f4a3-a913-4cc5-91bf-2296201cbc2d" Dec 04 22:24:13.407895 master-0 kubenswrapper[33572]: I1204 22:24:13.407828 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d6857f96b-g7j6m"] Dec 04 22:24:13.408737 master-0 kubenswrapper[33572]: I1204 22:24:13.408707 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.432105 master-0 kubenswrapper[33572]: I1204 22:24:13.432062 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d6857f96b-g7j6m"] Dec 04 22:24:13.444594 master-0 kubenswrapper[33572]: I1204 22:24:13.444532 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-trusted-ca-bundle\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.445017 master-0 kubenswrapper[33572]: I1204 22:24:13.444992 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-config\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.445111 master-0 kubenswrapper[33572]: I1204 22:24:13.445021 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk8vm\" (UniqueName: \"kubernetes.io/projected/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-kube-api-access-vk8vm\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.445111 master-0 kubenswrapper[33572]: I1204 22:24:13.445058 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-oauth-serving-cert\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.445111 master-0 kubenswrapper[33572]: I1204 22:24:13.445094 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-oauth-config\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.445111 master-0 kubenswrapper[33572]: I1204 22:24:13.445111 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-serving-cert\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.445300 master-0 kubenswrapper[33572]: I1204 22:24:13.445130 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-oauth-serving-cert\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.445300 master-0 kubenswrapper[33572]: I1204 22:24:13.445151 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8d4f\" (UniqueName: \"kubernetes.io/projected/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-kube-api-access-g8d4f\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.445300 master-0 kubenswrapper[33572]: I1204 22:24:13.445168 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-config\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.445300 master-0 kubenswrapper[33572]: I1204 22:24:13.445185 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-service-ca\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.445300 master-0 kubenswrapper[33572]: I1204 22:24:13.445211 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-service-ca\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.445300 master-0 kubenswrapper[33572]: I1204 22:24:13.445252 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-trusted-ca-bundle\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.445300 master-0 kubenswrapper[33572]: I1204 22:24:13.445281 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-oauth-config\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.445300 master-0 kubenswrapper[33572]: I1204 22:24:13.445299 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-serving-cert\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.449266 master-0 kubenswrapper[33572]: I1204 22:24:13.449204 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-trusted-ca-bundle\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " 
pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.450043 master-0 kubenswrapper[33572]: I1204 22:24:13.450004 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-service-ca\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.450947 master-0 kubenswrapper[33572]: I1204 22:24:13.450883 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-serving-cert\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.452457 master-0 kubenswrapper[33572]: I1204 22:24:13.452402 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-config\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.456577 master-0 kubenswrapper[33572]: I1204 22:24:13.456320 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-oauth-serving-cert\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.463511 master-0 kubenswrapper[33572]: I1204 22:24:13.463371 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-oauth-config\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.471937 master-0 kubenswrapper[33572]: I1204 22:24:13.471887 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8d4f\" (UniqueName: \"kubernetes.io/projected/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-kube-api-access-g8d4f\") pod \"console-55894b577f-c58wv\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:13.491707 master-0 kubenswrapper[33572]: I1204 22:24:13.491635 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:24:13.498008 master-0 kubenswrapper[33572]: I1204 22:24:13.497960 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:24:13.547087 master-0 kubenswrapper[33572]: I1204 22:24:13.547026 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles\") pod \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " Dec 04 22:24:13.547087 master-0 kubenswrapper[33572]: I1204 22:24:13.547085 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc4z5\" (UniqueName: \"kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5\") pod \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " Dec 04 22:24:13.547340 master-0 kubenswrapper[33572]: I1204 22:24:13.547157 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config\") pod \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " Dec 04 22:24:13.547340 master-0 kubenswrapper[33572]: I1204 22:24:13.547176 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca\") pod \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " Dec 04 22:24:13.547340 master-0 kubenswrapper[33572]: I1204 22:24:13.547200 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca\") pod \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " Dec 04 22:24:13.547340 master-0 kubenswrapper[33572]: I1204 22:24:13.547218 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert\") pod \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " Dec 04 22:24:13.547340 master-0 kubenswrapper[33572]: I1204 22:24:13.547253 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert\") pod \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\" (UID: \"c3863c74-8f22-4c67-bef5-2d0d39df4abd\") " Dec 04 22:24:13.547340 master-0 kubenswrapper[33572]: I1204 22:24:13.547278 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vpxd\" (UniqueName: \"kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd\") pod \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " Dec 04 22:24:13.547340 master-0 kubenswrapper[33572]: I1204 22:24:13.547294 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config\") pod \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\" (UID: \"b86ff0e8-2c72-4dc6-ac55-3c21940d044f\") " Dec 04 22:24:13.548002 master-0 kubenswrapper[33572]: I1204 22:24:13.547963 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c3863c74-8f22-4c67-bef5-2d0d39df4abd" (UID: "c3863c74-8f22-4c67-bef5-2d0d39df4abd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:13.548119 master-0 kubenswrapper[33572]: I1204 22:24:13.548056 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b86ff0e8-2c72-4dc6-ac55-3c21940d044f" (UID: "b86ff0e8-2c72-4dc6-ac55-3c21940d044f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:13.548415 master-0 kubenswrapper[33572]: I1204 22:24:13.548365 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config" (OuterVolumeSpecName: "config") pod "c3863c74-8f22-4c67-bef5-2d0d39df4abd" (UID: "c3863c74-8f22-4c67-bef5-2d0d39df4abd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:13.548876 master-0 kubenswrapper[33572]: I1204 22:24:13.548819 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3863c74-8f22-4c67-bef5-2d0d39df4abd" (UID: "c3863c74-8f22-4c67-bef5-2d0d39df4abd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:13.548949 master-0 kubenswrapper[33572]: I1204 22:24:13.548891 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config" (OuterVolumeSpecName: "config") pod "b86ff0e8-2c72-4dc6-ac55-3c21940d044f" (UID: "b86ff0e8-2c72-4dc6-ac55-3c21940d044f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:13.548949 master-0 kubenswrapper[33572]: I1204 22:24:13.548890 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-oauth-config\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.549049 master-0 kubenswrapper[33572]: I1204 22:24:13.549023 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk8vm\" (UniqueName: \"kubernetes.io/projected/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-kube-api-access-vk8vm\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.549104 master-0 kubenswrapper[33572]: I1204 22:24:13.549060 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-config\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.549151 master-0 kubenswrapper[33572]: I1204 22:24:13.549123 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-oauth-serving-cert\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.549232 master-0 kubenswrapper[33572]: I1204 22:24:13.549207 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-serving-cert\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.549351 master-0 kubenswrapper[33572]: I1204 22:24:13.549326 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-service-ca\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.549489 master-0 kubenswrapper[33572]: I1204 22:24:13.549441 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-trusted-ca-bundle\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.549590 master-0 kubenswrapper[33572]: I1204 22:24:13.549570 33572 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.549643 master-0 kubenswrapper[33572]: I1204 22:24:13.549597 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.549643 master-0 kubenswrapper[33572]: I1204 22:24:13.549612 
33572 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.549643 master-0 kubenswrapper[33572]: I1204 22:24:13.549625 33572 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3863c74-8f22-4c67-bef5-2d0d39df4abd-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.549643 master-0 kubenswrapper[33572]: I1204 22:24:13.549637 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.551079 master-0 kubenswrapper[33572]: I1204 22:24:13.551048 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-trusted-ca-bundle\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.551986 master-0 kubenswrapper[33572]: I1204 22:24:13.551884 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-oauth-serving-cert\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.552298 master-0 kubenswrapper[33572]: I1204 22:24:13.552275 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-service-ca\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.552298 master-0 kubenswrapper[33572]: I1204 22:24:13.552288 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-config\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.553196 master-0 kubenswrapper[33572]: I1204 22:24:13.553145 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b86ff0e8-2c72-4dc6-ac55-3c21940d044f" (UID: "b86ff0e8-2c72-4dc6-ac55-3c21940d044f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:13.553995 master-0 kubenswrapper[33572]: I1204 22:24:13.553956 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd" (OuterVolumeSpecName: "kube-api-access-2vpxd") pod "b86ff0e8-2c72-4dc6-ac55-3c21940d044f" (UID: "b86ff0e8-2c72-4dc6-ac55-3c21940d044f"). InnerVolumeSpecName "kube-api-access-2vpxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:13.554371 master-0 kubenswrapper[33572]: I1204 22:24:13.554307 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5" (OuterVolumeSpecName: "kube-api-access-pc4z5") pod "c3863c74-8f22-4c67-bef5-2d0d39df4abd" (UID: "c3863c74-8f22-4c67-bef5-2d0d39df4abd"). InnerVolumeSpecName "kube-api-access-pc4z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:13.554831 master-0 kubenswrapper[33572]: I1204 22:24:13.554807 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-oauth-config\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.555978 master-0 kubenswrapper[33572]: I1204 22:24:13.555886 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3863c74-8f22-4c67-bef5-2d0d39df4abd" (UID: "c3863c74-8f22-4c67-bef5-2d0d39df4abd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:13.556057 master-0 kubenswrapper[33572]: I1204 22:24:13.555984 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-serving-cert\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.572112 master-0 kubenswrapper[33572]: I1204 22:24:13.572008 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk8vm\" (UniqueName: \"kubernetes.io/projected/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-kube-api-access-vk8vm\") pod \"console-7d6857f96b-g7j6m\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:13.651289 master-0 kubenswrapper[33572]: I1204 22:24:13.651205 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc4z5\" (UniqueName: \"kubernetes.io/projected/c3863c74-8f22-4c67-bef5-2d0d39df4abd-kube-api-access-pc4z5\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.651289 master-0 kubenswrapper[33572]: I1204 22:24:13.651264 33572 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.651289 master-0 kubenswrapper[33572]: I1204 22:24:13.651274 33572 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3863c74-8f22-4c67-bef5-2d0d39df4abd-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.651289 master-0 kubenswrapper[33572]: I1204 22:24:13.651288 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vpxd\" (UniqueName: \"kubernetes.io/projected/b86ff0e8-2c72-4dc6-ac55-3c21940d044f-kube-api-access-2vpxd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:13.790336 master-0 kubenswrapper[33572]: I1204 22:24:13.790201 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: I1204 22:24:14.081288 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z"] Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: E1204 22:24:14.081786 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: I1204 22:24:14.081809 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: E1204 22:24:14.081850 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: I1204 22:24:14.081863 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: E1204 22:24:14.081892 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: I1204 22:24:14.081905 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: I1204 22:24:14.082134 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: I1204 22:24:14.082169 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: I1204 22:24:14.082192 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" Dec 04 22:24:14.082552 master-0 kubenswrapper[33572]: I1204 22:24:14.082211 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" containerName="controller-manager" Dec 04 22:24:14.084203 master-0 kubenswrapper[33572]: I1204 22:24:14.082971 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.086837 master-0 kubenswrapper[33572]: I1204 22:24:14.086772 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-xvjs9" Dec 04 22:24:14.093910 master-0 kubenswrapper[33572]: I1204 22:24:14.093844 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k"] Dec 04 22:24:14.094282 master-0 kubenswrapper[33572]: E1204 22:24:14.094244 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" Dec 04 22:24:14.094282 master-0 kubenswrapper[33572]: I1204 22:24:14.094275 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" containerName="route-controller-manager" Dec 04 22:24:14.095430 master-0 kubenswrapper[33572]: I1204 22:24:14.095249 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.096974 master-0 kubenswrapper[33572]: I1204 22:24:14.096940 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-tflzj" Dec 04 22:24:14.101792 master-0 kubenswrapper[33572]: I1204 22:24:14.101735 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z"] Dec 04 22:24:14.108016 master-0 kubenswrapper[33572]: I1204 22:24:14.107284 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k"] Dec 04 22:24:14.163444 master-0 kubenswrapper[33572]: I1204 22:24:14.163335 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv74j\" (UniqueName: \"kubernetes.io/projected/784ceb13-af8d-4311-9b69-89fca18ae083-kube-api-access-gv74j\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.163812 master-0 kubenswrapper[33572]: I1204 22:24:14.163581 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-proxy-ca-bundles\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.163812 master-0 kubenswrapper[33572]: I1204 22:24:14.163679 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-client-ca\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.163812 master-0 kubenswrapper[33572]: I1204 22:24:14.163740 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3d92451-9229-4434-9991-da4baf1761ed-serving-cert\") pod 
\"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.163992 master-0 kubenswrapper[33572]: I1204 22:24:14.163835 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784ceb13-af8d-4311-9b69-89fca18ae083-serving-cert\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.163992 master-0 kubenswrapper[33572]: I1204 22:24:14.163975 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hlg\" (UniqueName: \"kubernetes.io/projected/b3d92451-9229-4434-9991-da4baf1761ed-kube-api-access-l6hlg\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.164114 master-0 kubenswrapper[33572]: I1204 22:24:14.164075 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/784ceb13-af8d-4311-9b69-89fca18ae083-client-ca\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.164486 master-0 kubenswrapper[33572]: I1204 22:24:14.164159 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784ceb13-af8d-4311-9b69-89fca18ae083-config\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.164486 master-0 kubenswrapper[33572]: I1204 22:24:14.164218 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-config\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.242124 master-0 kubenswrapper[33572]: I1204 22:24:14.242034 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" event={"ID":"c3863c74-8f22-4c67-bef5-2d0d39df4abd","Type":"ContainerDied","Data":"56273487d972eb4bce7bb2a2d532d0ecc4790cf320f72d236deb00d6ee12d734"} Dec 04 22:24:14.242903 master-0 kubenswrapper[33572]: I1204 22:24:14.242377 33572 scope.go:117] "RemoveContainer" containerID="4c72e9186536692b281fe4e714a6c8b9d1a2250b3c26e8d8330699b4c2ec401d" Dec 04 22:24:14.242903 master-0 kubenswrapper[33572]: I1204 22:24:14.242521 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86785576d9-t7jrz" Dec 04 22:24:14.246357 master-0 kubenswrapper[33572]: I1204 22:24:14.246313 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" event={"ID":"b86ff0e8-2c72-4dc6-ac55-3c21940d044f","Type":"ContainerDied","Data":"2d1e4c21a00903c83707b4497502b9b73283be3d30ce9a5aaab7e54bf8f72dba"} Dec 04 22:24:14.246445 master-0 kubenswrapper[33572]: I1204 22:24:14.246426 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg" Dec 04 22:24:14.247135 master-0 kubenswrapper[33572]: I1204 22:24:14.247082 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:14.258973 master-0 kubenswrapper[33572]: I1204 22:24:14.258882 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d6857f96b-g7j6m"] Dec 04 22:24:14.263233 master-0 kubenswrapper[33572]: W1204 22:24:14.263172 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a4bc900_27a4_4a4d_9cce_43e2a4708fec.slice/crio-6c84dec3eea5cbc92b04deeb19a860514de8fd5bde70c91b5c413e4e171a5058 WatchSource:0}: Error finding container 6c84dec3eea5cbc92b04deeb19a860514de8fd5bde70c91b5c413e4e171a5058: Status 404 returned error can't find the container with id 6c84dec3eea5cbc92b04deeb19a860514de8fd5bde70c91b5c413e4e171a5058 Dec 04 22:24:14.269061 master-0 kubenswrapper[33572]: I1204 22:24:14.268969 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/784ceb13-af8d-4311-9b69-89fca18ae083-client-ca\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.269180 master-0 kubenswrapper[33572]: I1204 22:24:14.269076 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784ceb13-af8d-4311-9b69-89fca18ae083-config\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.269180 master-0 kubenswrapper[33572]: I1204 22:24:14.269134 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-config\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.269287 master-0 kubenswrapper[33572]: I1204 22:24:14.269247 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv74j\" (UniqueName: \"kubernetes.io/projected/784ceb13-af8d-4311-9b69-89fca18ae083-kube-api-access-gv74j\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.269340 master-0 kubenswrapper[33572]: I1204 22:24:14.269316 33572 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-proxy-ca-bundles\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.269385 master-0 kubenswrapper[33572]: I1204 22:24:14.269361 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-client-ca\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.269433 master-0 kubenswrapper[33572]: I1204 22:24:14.269393 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3d92451-9229-4434-9991-da4baf1761ed-serving-cert\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.270867 master-0 kubenswrapper[33572]: I1204 22:24:14.269897 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784ceb13-af8d-4311-9b69-89fca18ae083-serving-cert\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.271279 master-0 kubenswrapper[33572]: I1204 22:24:14.271233 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-client-ca\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.280320 master-0 kubenswrapper[33572]: I1204 22:24:14.271372 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hlg\" (UniqueName: \"kubernetes.io/projected/b3d92451-9229-4434-9991-da4baf1761ed-kube-api-access-l6hlg\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.280320 master-0 kubenswrapper[33572]: I1204 22:24:14.272788 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-config\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.280320 master-0 kubenswrapper[33572]: I1204 22:24:14.273319 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3d92451-9229-4434-9991-da4baf1761ed-serving-cert\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.280320 master-0 kubenswrapper[33572]: I1204 22:24:14.273851 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b3d92451-9229-4434-9991-da4baf1761ed-proxy-ca-bundles\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.280320 master-0 kubenswrapper[33572]: I1204 22:24:14.276373 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784ceb13-af8d-4311-9b69-89fca18ae083-config\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.280320 master-0 kubenswrapper[33572]: I1204 22:24:14.276855 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784ceb13-af8d-4311-9b69-89fca18ae083-serving-cert\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.280320 master-0 kubenswrapper[33572]: I1204 22:24:14.277416 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/784ceb13-af8d-4311-9b69-89fca18ae083-client-ca\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.280320 master-0 kubenswrapper[33572]: I1204 22:24:14.277611 33572 scope.go:117] "RemoveContainer" containerID="f60afa3e3fc7300e00d2058f8d9e1e5f1ef0102a3ee02b211373a7e4b06ab2d2" Dec 04 22:24:14.292838 master-0 kubenswrapper[33572]: I1204 22:24:14.292778 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv74j\" (UniqueName: \"kubernetes.io/projected/784ceb13-af8d-4311-9b69-89fca18ae083-kube-api-access-gv74j\") pod \"route-controller-manager-5795987f7c-w2z9k\" (UID: \"784ceb13-af8d-4311-9b69-89fca18ae083\") " pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.304166 master-0 kubenswrapper[33572]: I1204 22:24:14.304101 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hlg\" (UniqueName: \"kubernetes.io/projected/b3d92451-9229-4434-9991-da4baf1761ed-kube-api-access-l6hlg\") pod \"controller-manager-6b4d7dfbdb-v9q4z\" (UID: \"b3d92451-9229-4434-9991-da4baf1761ed\") " pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.359752 master-0 kubenswrapper[33572]: I1204 22:24:14.359622 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:14.424111 master-0 kubenswrapper[33572]: I1204 22:24:14.422690 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:14.426663 master-0 kubenswrapper[33572]: I1204 22:24:14.426566 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg"] Dec 04 22:24:14.442003 master-0 kubenswrapper[33572]: I1204 22:24:14.441426 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9db9db957-zdrjg"] Dec 04 22:24:14.446438 master-0 kubenswrapper[33572]: I1204 22:24:14.444786 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:14.457905 master-0 kubenswrapper[33572]: I1204 22:24:14.457836 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86785576d9-t7jrz"] Dec 04 22:24:14.470998 master-0 kubenswrapper[33572]: I1204 22:24:14.470261 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86785576d9-t7jrz"] Dec 04 22:24:14.474544 master-0 kubenswrapper[33572]: I1204 22:24:14.474472 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8d4f\" (UniqueName: \"kubernetes.io/projected/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-kube-api-access-g8d4f\") pod \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " Dec 04 22:24:14.474714 master-0 kubenswrapper[33572]: I1204 22:24:14.474633 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-trusted-ca-bundle\") pod \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " Dec 04 22:24:14.474773 master-0 kubenswrapper[33572]: I1204 22:24:14.474755 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-serving-cert\") pod \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " Dec 04 22:24:14.474809 master-0 kubenswrapper[33572]: I1204 22:24:14.474792 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-oauth-config\") pod \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " Dec 04 22:24:14.474847 master-0 kubenswrapper[33572]: I1204 22:24:14.474815 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-oauth-serving-cert\") pod \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " Dec 04 22:24:14.474894 master-0 kubenswrapper[33572]: I1204 22:24:14.474863 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-service-ca\") pod \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " Dec 04 22:24:14.474997 master-0 kubenswrapper[33572]: I1204 22:24:14.474978 33572 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-config\") pod \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\" (UID: \"2f72f4a3-a913-4cc5-91bf-2296201cbc2d\") " Dec 04 22:24:14.475888 master-0 kubenswrapper[33572]: I1204 22:24:14.475832 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2f72f4a3-a913-4cc5-91bf-2296201cbc2d" (UID: "2f72f4a3-a913-4cc5-91bf-2296201cbc2d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:14.476050 master-0 kubenswrapper[33572]: I1204 22:24:14.476026 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-config" (OuterVolumeSpecName: "console-config") pod "2f72f4a3-a913-4cc5-91bf-2296201cbc2d" (UID: "2f72f4a3-a913-4cc5-91bf-2296201cbc2d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:14.476510 master-0 kubenswrapper[33572]: I1204 22:24:14.476480 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2f72f4a3-a913-4cc5-91bf-2296201cbc2d" (UID: "2f72f4a3-a913-4cc5-91bf-2296201cbc2d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:14.476999 master-0 kubenswrapper[33572]: I1204 22:24:14.476978 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-service-ca" (OuterVolumeSpecName: "service-ca") pod "2f72f4a3-a913-4cc5-91bf-2296201cbc2d" (UID: "2f72f4a3-a913-4cc5-91bf-2296201cbc2d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:14.477478 master-0 kubenswrapper[33572]: I1204 22:24:14.477448 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-kube-api-access-g8d4f" (OuterVolumeSpecName: "kube-api-access-g8d4f") pod "2f72f4a3-a913-4cc5-91bf-2296201cbc2d" (UID: "2f72f4a3-a913-4cc5-91bf-2296201cbc2d"). InnerVolumeSpecName "kube-api-access-g8d4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:14.479208 master-0 kubenswrapper[33572]: I1204 22:24:14.479173 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2f72f4a3-a913-4cc5-91bf-2296201cbc2d" (UID: "2f72f4a3-a913-4cc5-91bf-2296201cbc2d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:14.479626 master-0 kubenswrapper[33572]: I1204 22:24:14.479586 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2f72f4a3-a913-4cc5-91bf-2296201cbc2d" (UID: "2f72f4a3-a913-4cc5-91bf-2296201cbc2d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:14.542344 master-0 kubenswrapper[33572]: I1204 22:24:14.542270 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b86ff0e8-2c72-4dc6-ac55-3c21940d044f" path="/var/lib/kubelet/pods/b86ff0e8-2c72-4dc6-ac55-3c21940d044f/volumes" Dec 04 22:24:14.543487 master-0 kubenswrapper[33572]: I1204 22:24:14.543098 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3863c74-8f22-4c67-bef5-2d0d39df4abd" path="/var/lib/kubelet/pods/c3863c74-8f22-4c67-bef5-2d0d39df4abd/volumes" Dec 04 22:24:14.577076 master-0 kubenswrapper[33572]: I1204 22:24:14.577016 33572 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:14.577076 master-0 kubenswrapper[33572]: I1204 22:24:14.577061 33572 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:14.577076 master-0 kubenswrapper[33572]: I1204 22:24:14.577078 33572 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:14.577387 master-0 kubenswrapper[33572]: I1204 22:24:14.577204 33572 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:14.577387 master-0 kubenswrapper[33572]: I1204 22:24:14.577273 33572 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-console-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:14.577387 master-0 kubenswrapper[33572]: I1204 22:24:14.577308 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8d4f\" (UniqueName: \"kubernetes.io/projected/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-kube-api-access-g8d4f\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:14.577387 master-0 kubenswrapper[33572]: I1204 22:24:14.577322 33572 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f72f4a3-a913-4cc5-91bf-2296201cbc2d-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:14.916860 master-0 kubenswrapper[33572]: W1204 22:24:14.916788 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d92451_9229_4434_9991_da4baf1761ed.slice/crio-e39dcfac0afc60ff7fb6a9c6ba4b31bc2431bea8982b248904a20005dc1a37ef WatchSource:0}: Error finding container e39dcfac0afc60ff7fb6a9c6ba4b31bc2431bea8982b248904a20005dc1a37ef: Status 404 returned error can't find the container with id e39dcfac0afc60ff7fb6a9c6ba4b31bc2431bea8982b248904a20005dc1a37ef Dec 04 22:24:14.919157 master-0 kubenswrapper[33572]: I1204 22:24:14.919087 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z"] Dec 04 22:24:14.987142 master-0 kubenswrapper[33572]: I1204 22:24:14.987073 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k"] Dec 04 22:24:15.000692 master-0 kubenswrapper[33572]: W1204 22:24:15.000635 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784ceb13_af8d_4311_9b69_89fca18ae083.slice/crio-916dae1dd0cf212e75a42676c92319adc6477ca66e39917094d018060f274b22 WatchSource:0}: Error finding container 916dae1dd0cf212e75a42676c92319adc6477ca66e39917094d018060f274b22: Status 404 returned error can't find the container with id 916dae1dd0cf212e75a42676c92319adc6477ca66e39917094d018060f274b22 Dec 04 22:24:15.258326 master-0 kubenswrapper[33572]: I1204 22:24:15.258262 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" event={"ID":"784ceb13-af8d-4311-9b69-89fca18ae083","Type":"ContainerStarted","Data":"9565a817334e5942ce86de37a51ae1fe916bb478e1f47f2c38eb76bad08fdf4c"} Dec 04 22:24:15.258326 master-0 kubenswrapper[33572]: I1204 22:24:15.258318 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" event={"ID":"784ceb13-af8d-4311-9b69-89fca18ae083","Type":"ContainerStarted","Data":"916dae1dd0cf212e75a42676c92319adc6477ca66e39917094d018060f274b22"} Dec 04 22:24:15.258950 master-0 kubenswrapper[33572]: I1204 22:24:15.258673 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:15.260648 master-0 kubenswrapper[33572]: I1204 22:24:15.260597 33572 patch_prober.go:28] interesting pod/route-controller-manager-5795987f7c-w2z9k container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.103:8443/healthz\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Dec 04 22:24:15.260736 master-0 kubenswrapper[33572]: I1204 22:24:15.260660 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" podUID="784ceb13-af8d-4311-9b69-89fca18ae083" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.103:8443/healthz\": dial tcp 10.128.0.103:8443: connect: connection refused" Dec 04 22:24:15.261201 master-0 kubenswrapper[33572]: I1204 22:24:15.261156 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" event={"ID":"b3d92451-9229-4434-9991-da4baf1761ed","Type":"ContainerStarted","Data":"02c4171cfac5b4b5b6bfb61ac4085a067de3ae55febe58323df285a48f25457d"} Dec 04 22:24:15.261201 master-0 kubenswrapper[33572]: I1204 22:24:15.261200 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" event={"ID":"b3d92451-9229-4434-9991-da4baf1761ed","Type":"ContainerStarted","Data":"e39dcfac0afc60ff7fb6a9c6ba4b31bc2431bea8982b248904a20005dc1a37ef"} Dec 04 22:24:15.261448 master-0 kubenswrapper[33572]: I1204 22:24:15.261411 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:15.265382 master-0 kubenswrapper[33572]: I1204 22:24:15.264952 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55894b577f-c58wv" Dec 04 22:24:15.265710 master-0 kubenswrapper[33572]: I1204 22:24:15.265662 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d6857f96b-g7j6m" event={"ID":"8a4bc900-27a4-4a4d-9cce-43e2a4708fec","Type":"ContainerStarted","Data":"4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326"} Dec 04 22:24:15.265779 master-0 kubenswrapper[33572]: I1204 22:24:15.265712 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d6857f96b-g7j6m" event={"ID":"8a4bc900-27a4-4a4d-9cce-43e2a4708fec","Type":"ContainerStarted","Data":"6c84dec3eea5cbc92b04deeb19a860514de8fd5bde70c91b5c413e4e171a5058"} Dec 04 22:24:15.269658 master-0 kubenswrapper[33572]: I1204 22:24:15.267123 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" Dec 04 22:24:15.311790 master-0 kubenswrapper[33572]: I1204 22:24:15.310849 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" podStartSLOduration=2.310813943 podStartE2EDuration="2.310813943s" podCreationTimestamp="2025-12-04 22:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:24:15.304027185 +0000 UTC m=+319.031552844" watchObservedRunningTime="2025-12-04 22:24:15.310813943 +0000 UTC m=+319.038339592" Dec 04 22:24:15.342530 master-0 kubenswrapper[33572]: I1204 22:24:15.342083 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b4d7dfbdb-v9q4z" podStartSLOduration=3.342059099 podStartE2EDuration="3.342059099s" podCreationTimestamp="2025-12-04 22:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:24:15.340072443 +0000 UTC m=+319.067598092" watchObservedRunningTime="2025-12-04 22:24:15.342059099 +0000 UTC m=+319.069584748" Dec 04 22:24:15.396610 master-0 kubenswrapper[33572]: I1204 22:24:15.396571 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55894b577f-c58wv"] Dec 04 22:24:15.402738 master-0 kubenswrapper[33572]: I1204 22:24:15.402679 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55894b577f-c58wv"] Dec 04 22:24:16.278212 master-0 kubenswrapper[33572]: I1204 22:24:16.278164 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5795987f7c-w2z9k" Dec 04 22:24:16.303025 master-0 kubenswrapper[33572]: I1204 22:24:16.302926 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d6857f96b-g7j6m" podStartSLOduration=3.30289657 podStartE2EDuration="3.30289657s" podCreationTimestamp="2025-12-04 22:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:24:15.431214778 +0000 UTC m=+319.158740437" watchObservedRunningTime="2025-12-04 22:24:16.30289657 +0000 UTC m=+320.030422229" Dec 04 22:24:16.537466 master-0 kubenswrapper[33572]: I1204 22:24:16.537291 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2f72f4a3-a913-4cc5-91bf-2296201cbc2d" path="/var/lib/kubelet/pods/2f72f4a3-a913-4cc5-91bf-2296201cbc2d/volumes" Dec 04 22:24:17.174168 master-0 kubenswrapper[33572]: I1204 22:24:17.174101 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:24:17.174494 master-0 kubenswrapper[33572]: I1204 22:24:17.174462 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="alertmanager" containerID="cri-o://56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2" gracePeriod=120 Dec 04 22:24:17.174654 master-0 kubenswrapper[33572]: I1204 22:24:17.174592 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy-metric" containerID="cri-o://d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180" gracePeriod=120 Dec 04 22:24:17.174750 master-0 kubenswrapper[33572]: I1204 22:24:17.174668 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy-web" containerID="cri-o://fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac" gracePeriod=120 Dec 04 22:24:17.174870 master-0 kubenswrapper[33572]: I1204 22:24:17.174626 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="config-reloader" containerID="cri-o://2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd" gracePeriod=120 Dec 04 22:24:17.174943 master-0 kubenswrapper[33572]: I1204 22:24:17.174833 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy" containerID="cri-o://324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce" gracePeriod=120 Dec 04 22:24:17.175529 master-0 kubenswrapper[33572]: I1204 22:24:17.174758 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="prom-label-proxy" containerID="cri-o://969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f" gracePeriod=120 Dec 04 22:24:18.292094 master-0 kubenswrapper[33572]: I1204 22:24:18.292040 33572 generic.go:334] "Generic (PLEG): container finished" podID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerID="969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f" exitCode=0 Dec 04 22:24:18.292094 master-0 kubenswrapper[33572]: I1204 22:24:18.292076 33572 generic.go:334] "Generic (PLEG): container finished" podID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerID="d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180" exitCode=0 Dec 04 22:24:18.292094 master-0 kubenswrapper[33572]: I1204 22:24:18.292085 33572 generic.go:334] "Generic (PLEG): container finished" podID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerID="324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce" exitCode=0 Dec 04 22:24:18.292094 master-0 kubenswrapper[33572]: I1204 22:24:18.292093 33572 generic.go:334] "Generic (PLEG): container finished" 
podID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerID="2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd" exitCode=0 Dec 04 22:24:18.292094 master-0 kubenswrapper[33572]: I1204 22:24:18.292099 33572 generic.go:334] "Generic (PLEG): container finished" podID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerID="56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2" exitCode=0 Dec 04 22:24:18.292709 master-0 kubenswrapper[33572]: I1204 22:24:18.292120 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerDied","Data":"969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f"} Dec 04 22:24:18.292709 master-0 kubenswrapper[33572]: I1204 22:24:18.292145 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerDied","Data":"d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180"} Dec 04 22:24:18.292709 master-0 kubenswrapper[33572]: I1204 22:24:18.292156 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerDied","Data":"324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce"} Dec 04 22:24:18.292709 master-0 kubenswrapper[33572]: I1204 22:24:18.292167 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerDied","Data":"2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd"} Dec 04 22:24:18.292709 master-0 kubenswrapper[33572]: I1204 22:24:18.292177 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerDied","Data":"56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2"} Dec 04 22:24:18.749894 master-0 kubenswrapper[33572]: I1204 22:24:18.749815 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:18.756069 master-0 kubenswrapper[33572]: I1204 22:24:18.756006 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756282 master-0 kubenswrapper[33572]: I1204 22:24:18.756175 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-trusted-ca-bundle\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756407 master-0 kubenswrapper[33572]: I1204 22:24:18.756332 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756488 master-0 kubenswrapper[33572]: I1204 22:24:18.756439 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-main-db\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756646 master-0 kubenswrapper[33572]: I1204 22:24:18.756551 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-tls-assets\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756646 master-0 kubenswrapper[33572]: I1204 22:24:18.756619 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-out\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756794 master-0 kubenswrapper[33572]: I1204 22:24:18.756699 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-volume\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756794 master-0 kubenswrapper[33572]: I1204 22:24:18.756780 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-kube-api-access-qlfqv\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756927 master-0 kubenswrapper[33572]: I1204 22:24:18.756886 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-metrics-client-ca\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.756998 master-0 kubenswrapper[33572]: I1204 
22:24:18.756954 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-web\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.757079 master-0 kubenswrapper[33572]: I1204 22:24:18.757017 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-metric\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.757325 master-0 kubenswrapper[33572]: I1204 22:24:18.757072 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-web-config\") pod \"2221d215-2187-4dd6-a2de-f0fc6ec54027\" (UID: \"2221d215-2187-4dd6-a2de-f0fc6ec54027\") " Dec 04 22:24:18.757440 master-0 kubenswrapper[33572]: I1204 22:24:18.757389 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:24:18.758175 master-0 kubenswrapper[33572]: I1204 22:24:18.758108 33572 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.758983 master-0 kubenswrapper[33572]: I1204 22:24:18.758902 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:18.759177 master-0 kubenswrapper[33572]: I1204 22:24:18.758919 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:18.760794 master-0 kubenswrapper[33572]: I1204 22:24:18.760697 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-volume" (OuterVolumeSpecName: "config-volume") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:18.761429 master-0 kubenswrapper[33572]: I1204 22:24:18.761362 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:18.762079 master-0 kubenswrapper[33572]: I1204 22:24:18.761932 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-kube-api-access-qlfqv" (OuterVolumeSpecName: "kube-api-access-qlfqv") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "kube-api-access-qlfqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:18.762468 master-0 kubenswrapper[33572]: I1204 22:24:18.762333 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:18.763300 master-0 kubenswrapper[33572]: I1204 22:24:18.762960 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:18.763300 master-0 kubenswrapper[33572]: I1204 22:24:18.762989 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-out" (OuterVolumeSpecName: "config-out") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:24:18.764220 master-0 kubenswrapper[33572]: I1204 22:24:18.764150 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:18.764861 master-0 kubenswrapper[33572]: I1204 22:24:18.764676 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:18.824865 master-0 kubenswrapper[33572]: I1204 22:24:18.824770 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-web-config" (OuterVolumeSpecName: "web-config") pod "2221d215-2187-4dd6-a2de-f0fc6ec54027" (UID: "2221d215-2187-4dd6-a2de-f0fc6ec54027"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:18.858979 master-0 kubenswrapper[33572]: I1204 22:24:18.858912 33572 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.858979 master-0 kubenswrapper[33572]: I1204 22:24:18.858953 33572 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-tls-assets\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.858979 master-0 kubenswrapper[33572]: I1204 22:24:18.858967 33572 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-out\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.858979 master-0 kubenswrapper[33572]: I1204 22:24:18.858982 33572 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.858979 master-0 kubenswrapper[33572]: I1204 22:24:18.858994 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlfqv\" (UniqueName: \"kubernetes.io/projected/2221d215-2187-4dd6-a2de-f0fc6ec54027-kube-api-access-qlfqv\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.859528 master-0 kubenswrapper[33572]: I1204 22:24:18.859008 33572 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.859528 master-0 kubenswrapper[33572]: I1204 22:24:18.859021 33572 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.859528 master-0 kubenswrapper[33572]: I1204 22:24:18.859035 33572 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.859528 master-0 kubenswrapper[33572]: I1204 22:24:18.859050 33572 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-web-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.859528 master-0 kubenswrapper[33572]: I1204 22:24:18.859062 33572 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2221d215-2187-4dd6-a2de-f0fc6ec54027-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:18.859528 master-0 
kubenswrapper[33572]: I1204 22:24:18.859074 33572 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2221d215-2187-4dd6-a2de-f0fc6ec54027-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:19.306473 master-0 kubenswrapper[33572]: I1204 22:24:19.306307 33572 generic.go:334] "Generic (PLEG): container finished" podID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerID="fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac" exitCode=0 Dec 04 22:24:19.306473 master-0 kubenswrapper[33572]: I1204 22:24:19.306394 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerDied","Data":"fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac"} Dec 04 22:24:19.306473 master-0 kubenswrapper[33572]: I1204 22:24:19.306456 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:19.308014 master-0 kubenswrapper[33572]: I1204 22:24:19.306493 33572 scope.go:117] "RemoveContainer" containerID="969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f" Dec 04 22:24:19.308014 master-0 kubenswrapper[33572]: I1204 22:24:19.306470 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2221d215-2187-4dd6-a2de-f0fc6ec54027","Type":"ContainerDied","Data":"2fb06c238ba297f72251dba86e44e92933bfc5c6b7c2e7001636729770ba92c5"} Dec 04 22:24:19.375732 master-0 kubenswrapper[33572]: I1204 22:24:19.375575 33572 scope.go:117] "RemoveContainer" containerID="d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180" Dec 04 22:24:19.402343 master-0 kubenswrapper[33572]: I1204 22:24:19.402283 33572 scope.go:117] "RemoveContainer" containerID="324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce" Dec 04 22:24:19.425602 master-0 kubenswrapper[33572]: I1204 22:24:19.425455 33572 scope.go:117] "RemoveContainer" containerID="fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac" Dec 04 22:24:19.446668 master-0 kubenswrapper[33572]: I1204 22:24:19.446453 33572 scope.go:117] "RemoveContainer" containerID="2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd" Dec 04 22:24:19.470814 master-0 kubenswrapper[33572]: I1204 22:24:19.470767 33572 scope.go:117] "RemoveContainer" containerID="56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2" Dec 04 22:24:19.498157 master-0 kubenswrapper[33572]: I1204 22:24:19.497723 33572 scope.go:117] "RemoveContainer" containerID="374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf" Dec 04 22:24:19.520088 master-0 kubenswrapper[33572]: I1204 22:24:19.520035 33572 scope.go:117] "RemoveContainer" containerID="969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f" Dec 04 22:24:19.520867 master-0 kubenswrapper[33572]: E1204 22:24:19.520754 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f\": container with ID starting with 969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f not found: ID does not exist" containerID="969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f" Dec 04 22:24:19.520956 master-0 kubenswrapper[33572]: I1204 22:24:19.520833 33572 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f"} err="failed to get container status \"969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f\": rpc error: code = NotFound desc = could not find container \"969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f\": container with ID starting with 969d9203226bd5984801b92b8a69b52e7bfa98a060c3a32af519510346f59d9f not found: ID does not exist" Dec 04 22:24:19.520956 master-0 kubenswrapper[33572]: I1204 22:24:19.520887 33572 scope.go:117] "RemoveContainer" containerID="d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180" Dec 04 22:24:19.521535 master-0 kubenswrapper[33572]: E1204 22:24:19.521382 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180\": container with ID starting with d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180 not found: ID does not exist" containerID="d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180" Dec 04 22:24:19.521535 master-0 kubenswrapper[33572]: I1204 22:24:19.521461 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180"} err="failed to get container status \"d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180\": rpc error: code = NotFound desc = could not find container \"d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180\": container with ID starting with d243f8075d17a7763ba15eacaf99b7d85a4369307e009439b8117263eb8ff180 not found: ID does not exist" Dec 04 22:24:19.521535 master-0 kubenswrapper[33572]: I1204 22:24:19.521529 33572 scope.go:117] "RemoveContainer" containerID="324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce" Dec 04 22:24:19.522118 master-0 kubenswrapper[33572]: E1204 22:24:19.522056 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce\": container with ID starting with 324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce not found: ID does not exist" containerID="324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce" Dec 04 22:24:19.522184 master-0 kubenswrapper[33572]: I1204 22:24:19.522127 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce"} err="failed to get container status \"324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce\": rpc error: code = NotFound desc = could not find container \"324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce\": container with ID starting with 324aa92ce5c6dab4097c61d25b4a20aa50d9114558ad41db92624b64a8a9c3ce not found: ID does not exist" Dec 04 22:24:19.522233 master-0 kubenswrapper[33572]: I1204 22:24:19.522176 33572 scope.go:117] "RemoveContainer" containerID="fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac" Dec 04 22:24:19.522790 master-0 kubenswrapper[33572]: E1204 22:24:19.522755 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac\": container with ID starting with 
fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac not found: ID does not exist" containerID="fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac" Dec 04 22:24:19.522790 master-0 kubenswrapper[33572]: I1204 22:24:19.522784 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac"} err="failed to get container status \"fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac\": rpc error: code = NotFound desc = could not find container \"fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac\": container with ID starting with fd4b2ab8da3339b63acc157e462eaf3560c5afeb7156ce17bf041b3d282271ac not found: ID does not exist" Dec 04 22:24:19.522950 master-0 kubenswrapper[33572]: I1204 22:24:19.522800 33572 scope.go:117] "RemoveContainer" containerID="2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd" Dec 04 22:24:19.523148 master-0 kubenswrapper[33572]: E1204 22:24:19.523105 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd\": container with ID starting with 2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd not found: ID does not exist" containerID="2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd" Dec 04 22:24:19.523148 master-0 kubenswrapper[33572]: I1204 22:24:19.523140 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd"} err="failed to get container status \"2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd\": rpc error: code = NotFound desc = could not find container \"2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd\": container with ID starting with 2540dc3a983a2ffa531fb829037086bf24662ae62f5410325234e39d7c422edd not found: ID does not exist" Dec 04 22:24:19.523292 master-0 kubenswrapper[33572]: I1204 22:24:19.523160 33572 scope.go:117] "RemoveContainer" containerID="56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2" Dec 04 22:24:19.523656 master-0 kubenswrapper[33572]: E1204 22:24:19.523545 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2\": container with ID starting with 56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2 not found: ID does not exist" containerID="56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2" Dec 04 22:24:19.523656 master-0 kubenswrapper[33572]: I1204 22:24:19.523583 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2"} err="failed to get container status \"56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2\": rpc error: code = NotFound desc = could not find container \"56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2\": container with ID starting with 56e17c4cc394ce0b135cdb8354cd1d78c7bc1ea22b43ef0f36814938a41307b2 not found: ID does not exist" Dec 04 22:24:19.523656 master-0 kubenswrapper[33572]: I1204 22:24:19.523610 33572 scope.go:117] "RemoveContainer" containerID="374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf" Dec 04 22:24:19.523992 
master-0 kubenswrapper[33572]: E1204 22:24:19.523962 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf\": container with ID starting with 374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf not found: ID does not exist" containerID="374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf" Dec 04 22:24:19.524068 master-0 kubenswrapper[33572]: I1204 22:24:19.523997 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf"} err="failed to get container status \"374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf\": rpc error: code = NotFound desc = could not find container \"374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf\": container with ID starting with 374b98f5c581bdbc5a9ef5f86fbad7cf2c3d42b470f8b7de1d75cdfca20afccf not found: ID does not exist" Dec 04 22:24:20.113580 master-0 kubenswrapper[33572]: I1204 22:24:20.110880 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:24:20.130971 master-0 kubenswrapper[33572]: I1204 22:24:20.130883 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:24:20.176712 master-0 kubenswrapper[33572]: I1204 22:24:20.176422 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: E1204 22:24:20.176792 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy-metric" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: I1204 22:24:20.176809 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy-metric" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: E1204 22:24:20.176828 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: I1204 22:24:20.176837 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: E1204 22:24:20.176858 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy-web" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: I1204 22:24:20.176867 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy-web" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: E1204 22:24:20.176880 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="prom-label-proxy" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: I1204 22:24:20.176890 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="prom-label-proxy" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: E1204 22:24:20.176905 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="config-reloader" Dec 04 
22:24:20.176925 master-0 kubenswrapper[33572]: I1204 22:24:20.176914 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="config-reloader" Dec 04 22:24:20.176925 master-0 kubenswrapper[33572]: E1204 22:24:20.176929 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="init-config-reloader" Dec 04 22:24:20.177244 master-0 kubenswrapper[33572]: I1204 22:24:20.176938 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="init-config-reloader" Dec 04 22:24:20.177244 master-0 kubenswrapper[33572]: E1204 22:24:20.176957 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="alertmanager" Dec 04 22:24:20.177244 master-0 kubenswrapper[33572]: I1204 22:24:20.176965 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="alertmanager" Dec 04 22:24:20.177244 master-0 kubenswrapper[33572]: I1204 22:24:20.177139 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy" Dec 04 22:24:20.177244 master-0 kubenswrapper[33572]: I1204 22:24:20.177169 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy-metric" Dec 04 22:24:20.177244 master-0 kubenswrapper[33572]: I1204 22:24:20.177199 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="config-reloader" Dec 04 22:24:20.177244 master-0 kubenswrapper[33572]: I1204 22:24:20.177229 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="prom-label-proxy" Dec 04 22:24:20.177466 master-0 kubenswrapper[33572]: I1204 22:24:20.177251 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="kube-rbac-proxy-web" Dec 04 22:24:20.177466 master-0 kubenswrapper[33572]: I1204 22:24:20.177277 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" containerName="alertmanager" Dec 04 22:24:20.181524 master-0 kubenswrapper[33572]: I1204 22:24:20.179983 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.188301 master-0 kubenswrapper[33572]: I1204 22:24:20.188085 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 04 22:24:20.188301 master-0 kubenswrapper[33572]: I1204 22:24:20.188249 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-vlsdp" Dec 04 22:24:20.189697 master-0 kubenswrapper[33572]: I1204 22:24:20.188644 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 04 22:24:20.192556 master-0 kubenswrapper[33572]: I1204 22:24:20.192425 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 04 22:24:20.192900 master-0 kubenswrapper[33572]: I1204 22:24:20.192451 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 04 22:24:20.192900 master-0 kubenswrapper[33572]: I1204 22:24:20.192517 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 04 22:24:20.192900 master-0 kubenswrapper[33572]: I1204 22:24:20.192558 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 04 22:24:20.192900 master-0 kubenswrapper[33572]: I1204 22:24:20.192607 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 04 22:24:20.203087 master-0 kubenswrapper[33572]: I1204 22:24:20.197485 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 04 22:24:20.207587 master-0 kubenswrapper[33572]: I1204 22:24:20.206591 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285379 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285478 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6d2204da-2102-48fb-9865-ff8f367a02f3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285606 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285678 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d2204da-2102-48fb-9865-ff8f367a02f3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285782 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6d2204da-2102-48fb-9865-ff8f367a02f3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285831 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285872 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5gk7\" (UniqueName: \"kubernetes.io/projected/6d2204da-2102-48fb-9865-ff8f367a02f3-kube-api-access-j5gk7\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285927 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d2204da-2102-48fb-9865-ff8f367a02f3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.285975 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d2204da-2102-48fb-9865-ff8f367a02f3-config-out\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.286025 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.286065 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-web-config\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.285463 master-0 kubenswrapper[33572]: I1204 22:24:20.286096 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-config-volume\") 
pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.388064 master-0 kubenswrapper[33572]: I1204 22:24:20.387914 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6d2204da-2102-48fb-9865-ff8f367a02f3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.388064 master-0 kubenswrapper[33572]: I1204 22:24:20.387986 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.388064 master-0 kubenswrapper[33572]: I1204 22:24:20.388029 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5gk7\" (UniqueName: \"kubernetes.io/projected/6d2204da-2102-48fb-9865-ff8f367a02f3-kube-api-access-j5gk7\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.388794 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d2204da-2102-48fb-9865-ff8f367a02f3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.388895 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d2204da-2102-48fb-9865-ff8f367a02f3-config-out\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.388962 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.388991 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-web-config\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.389018 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-config-volume\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.389113 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.389151 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6d2204da-2102-48fb-9865-ff8f367a02f3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.389182 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.389711 master-0 kubenswrapper[33572]: I1204 22:24:20.389249 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d2204da-2102-48fb-9865-ff8f367a02f3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.390242 master-0 kubenswrapper[33572]: I1204 22:24:20.390032 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6d2204da-2102-48fb-9865-ff8f367a02f3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.391528 master-0 kubenswrapper[33572]: I1204 22:24:20.391477 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d2204da-2102-48fb-9865-ff8f367a02f3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.392738 master-0 kubenswrapper[33572]: I1204 22:24:20.392681 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6d2204da-2102-48fb-9865-ff8f367a02f3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.392809 master-0 kubenswrapper[33572]: I1204 22:24:20.392761 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d2204da-2102-48fb-9865-ff8f367a02f3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.394017 master-0 kubenswrapper[33572]: I1204 22:24:20.393973 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-config-volume\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.394636 master-0 
kubenswrapper[33572]: I1204 22:24:20.394607 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.395394 master-0 kubenswrapper[33572]: I1204 22:24:20.395024 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6d2204da-2102-48fb-9865-ff8f367a02f3-config-out\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.395394 master-0 kubenswrapper[33572]: I1204 22:24:20.395047 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-web-config\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.395394 master-0 kubenswrapper[33572]: I1204 22:24:20.395342 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.395969 master-0 kubenswrapper[33572]: I1204 22:24:20.395903 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.397581 master-0 kubenswrapper[33572]: I1204 22:24:20.397551 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6d2204da-2102-48fb-9865-ff8f367a02f3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.417621 master-0 kubenswrapper[33572]: I1204 22:24:20.417588 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5gk7\" (UniqueName: \"kubernetes.io/projected/6d2204da-2102-48fb-9865-ff8f367a02f3-kube-api-access-j5gk7\") pod \"alertmanager-main-0\" (UID: \"6d2204da-2102-48fb-9865-ff8f367a02f3\") " pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.507915 master-0 kubenswrapper[33572]: I1204 22:24:20.507853 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 04 22:24:20.538486 master-0 kubenswrapper[33572]: I1204 22:24:20.538425 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2221d215-2187-4dd6-a2de-f0fc6ec54027" path="/var/lib/kubelet/pods/2221d215-2187-4dd6-a2de-f0fc6ec54027/volumes" Dec 04 22:24:21.035090 master-0 kubenswrapper[33572]: W1204 22:24:21.035023 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d2204da_2102_48fb_9865_ff8f367a02f3.slice/crio-f256ad4fca3cefc6c2a5e763321e8b56c6ae5f599c679fa83be59b1e7e38a025 WatchSource:0}: Error finding container f256ad4fca3cefc6c2a5e763321e8b56c6ae5f599c679fa83be59b1e7e38a025: Status 404 returned error can't find the container with id f256ad4fca3cefc6c2a5e763321e8b56c6ae5f599c679fa83be59b1e7e38a025 Dec 04 22:24:21.035245 master-0 kubenswrapper[33572]: I1204 22:24:21.035205 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 04 22:24:21.335980 master-0 kubenswrapper[33572]: I1204 22:24:21.335889 33572 generic.go:334] "Generic (PLEG): container finished" podID="6d2204da-2102-48fb-9865-ff8f367a02f3" containerID="b1bbcac05099d1c531b878cb7504ef2743f3125e4b16abd4f7ed77ee50c5e9a1" exitCode=0 Dec 04 22:24:21.336224 master-0 kubenswrapper[33572]: I1204 22:24:21.336002 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d2204da-2102-48fb-9865-ff8f367a02f3","Type":"ContainerDied","Data":"b1bbcac05099d1c531b878cb7504ef2743f3125e4b16abd4f7ed77ee50c5e9a1"} Dec 04 22:24:21.336224 master-0 kubenswrapper[33572]: I1204 22:24:21.336074 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d2204da-2102-48fb-9865-ff8f367a02f3","Type":"ContainerStarted","Data":"f256ad4fca3cefc6c2a5e763321e8b56c6ae5f599c679fa83be59b1e7e38a025"} Dec 04 22:24:22.354848 master-0 kubenswrapper[33572]: I1204 22:24:22.354612 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d2204da-2102-48fb-9865-ff8f367a02f3","Type":"ContainerStarted","Data":"fa3e9a1e2094d8194bfadbe3fff75809ea81db3170216df02b2d2deb49c89a12"} Dec 04 22:24:22.354848 master-0 kubenswrapper[33572]: I1204 22:24:22.354753 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d2204da-2102-48fb-9865-ff8f367a02f3","Type":"ContainerStarted","Data":"d02b5dab31ac16056d5558d34766ceddc76962d277610c264f2bcba8d7035a70"} Dec 04 22:24:22.354848 master-0 kubenswrapper[33572]: I1204 22:24:22.354767 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d2204da-2102-48fb-9865-ff8f367a02f3","Type":"ContainerStarted","Data":"a07e49e52f5ce87af829359bd7b01d1a4239e4b0b6dce2a2aeb43190b56c3406"} Dec 04 22:24:22.354848 master-0 kubenswrapper[33572]: I1204 22:24:22.354779 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d2204da-2102-48fb-9865-ff8f367a02f3","Type":"ContainerStarted","Data":"2e8912724d490b81b35c1535295d7190c29fa6e9ff9e62d9197c129f49b69ba0"} Dec 04 22:24:22.354848 master-0 kubenswrapper[33572]: I1204 22:24:22.354789 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6d2204da-2102-48fb-9865-ff8f367a02f3","Type":"ContainerStarted","Data":"21f3293ee654e2bb781472c9cd61aed5e595436f4345a611989cd72a5ea655a7"} Dec 04 22:24:22.539181 master-0 kubenswrapper[33572]: I1204 22:24:22.539120 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:24:22.539727 master-0 kubenswrapper[33572]: I1204 22:24:22.539665 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="thanos-sidecar" containerID="cri-o://07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" gracePeriod=600 Dec 04 22:24:22.539815 master-0 kubenswrapper[33572]: I1204 22:24:22.539720 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy-thanos" containerID="cri-o://c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" gracePeriod=600 Dec 04 22:24:22.539815 master-0 kubenswrapper[33572]: I1204 22:24:22.539661 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy" containerID="cri-o://ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" gracePeriod=600 Dec 04 22:24:22.539926 master-0 kubenswrapper[33572]: I1204 22:24:22.539729 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy-web" containerID="cri-o://d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" gracePeriod=600 Dec 04 22:24:22.539997 master-0 kubenswrapper[33572]: I1204 22:24:22.539697 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="config-reloader" containerID="cri-o://00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" gracePeriod=600 Dec 04 22:24:22.540147 master-0 kubenswrapper[33572]: I1204 22:24:22.539849 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="prometheus" containerID="cri-o://49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" gracePeriod=600 Dec 04 22:24:23.171592 master-0 kubenswrapper[33572]: I1204 22:24:23.169417 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263386 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-tls-assets\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263477 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263546 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-web-config\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263580 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-config-out\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263627 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-rulefiles-0\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263663 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-serving-certs-ca-bundle\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263711 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-metrics-client-certs\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263742 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263783 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-thanos-prometheus-http-client-file\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 
22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263870 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdz8b\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-kube-api-access-bdz8b\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.264533 master-0 kubenswrapper[33572]: I1204 22:24:23.263901 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-db\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.265211 master-0 kubenswrapper[33572]: I1204 22:24:23.264974 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.265469 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-config\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.265604 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-trusted-ca-bundle\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.265642 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-kube-rbac-proxy\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.265674 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-metrics-client-ca\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.265710 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-kubelet-serving-ca-bundle\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.265751 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-grpc-tls\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: 
\"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.265800 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c1b7c3b1-3850-4a37-9a36-e84537557071\" (UID: \"c1b7c3b1-3850-4a37-9a36-e84537557071\") " Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.266163 33572 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.267531 master-0 kubenswrapper[33572]: I1204 22:24:23.267087 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:23.274197 master-0 kubenswrapper[33572]: I1204 22:24:23.268261 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-config-out" (OuterVolumeSpecName: "config-out") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:24:23.274197 master-0 kubenswrapper[33572]: I1204 22:24:23.270403 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:23.274197 master-0 kubenswrapper[33572]: I1204 22:24:23.271101 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:23.274197 master-0 kubenswrapper[33572]: I1204 22:24:23.273467 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.274197 master-0 kubenswrapper[33572]: I1204 22:24:23.273639 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:23.274612 master-0 kubenswrapper[33572]: I1204 22:24:23.274439 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:23.277376 master-0 kubenswrapper[33572]: I1204 22:24:23.276812 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-config" (OuterVolumeSpecName: "config") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.277376 master-0 kubenswrapper[33572]: I1204 22:24:23.277102 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:24:23.277943 master-0 kubenswrapper[33572]: I1204 22:24:23.277897 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.279447 master-0 kubenswrapper[33572]: I1204 22:24:23.278703 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.279757 master-0 kubenswrapper[33572]: I1204 22:24:23.279714 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.279884 master-0 kubenswrapper[33572]: I1204 22:24:23.279761 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.281219 master-0 kubenswrapper[33572]: I1204 22:24:23.281166 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.283620 master-0 kubenswrapper[33572]: I1204 22:24:23.283281 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.283620 master-0 kubenswrapper[33572]: I1204 22:24:23.283414 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-kube-api-access-bdz8b" (OuterVolumeSpecName: "kube-api-access-bdz8b") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "kube-api-access-bdz8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:23.339900 master-0 kubenswrapper[33572]: I1204 22:24:23.339790 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-web-config" (OuterVolumeSpecName: "web-config") pod "c1b7c3b1-3850-4a37-9a36-e84537557071" (UID: "c1b7c3b1-3850-4a37-9a36-e84537557071"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:23.367573 master-0 kubenswrapper[33572]: I1204 22:24:23.367470 33572 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-tls-assets\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.367573 master-0 kubenswrapper[33572]: I1204 22:24:23.367572 33572 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367608 33572 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-web-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367628 33572 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-config-out\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367654 33572 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367673 33572 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367695 33572 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367721 33572 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367752 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdz8b\" (UniqueName: \"kubernetes.io/projected/c1b7c3b1-3850-4a37-9a36-e84537557071-kube-api-access-bdz8b\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367813 33572 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367857 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367875 33572 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367893 33572 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367915 33572 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367934 33572 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1b7c3b1-3850-4a37-9a36-e84537557071-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367956 33572 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.368205 master-0 kubenswrapper[33572]: I1204 22:24:23.367975 33572 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1b7c3b1-3850-4a37-9a36-e84537557071-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:23.372245 master-0 kubenswrapper[33572]: I1204 22:24:23.372192 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6d2204da-2102-48fb-9865-ff8f367a02f3","Type":"ContainerStarted","Data":"629f77829046e641288d660dc54efe2809aaadfb53721a0a405d65ee2aa63aa2"} Dec 04 22:24:23.378552 master-0 kubenswrapper[33572]: I1204 22:24:23.378480 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.378543 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerDied","Data":"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4"} Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.378659 33572 scope.go:117] "RemoveContainer" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.378321 33572 generic.go:334] "Generic (PLEG): container finished" podID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" exitCode=0 Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.379171 33572 generic.go:334] "Generic (PLEG): container finished" podID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" exitCode=0 Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.379195 33572 generic.go:334] "Generic (PLEG): container finished" podID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" exitCode=0 Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.379210 33572 generic.go:334] "Generic (PLEG): container finished" podID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" exitCode=0 Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.379224 33572 generic.go:334] "Generic (PLEG): container finished" podID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" exitCode=0 Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.379236 33572 generic.go:334] "Generic (PLEG): container finished" podID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" exitCode=0 Dec 04 22:24:23.379309 master-0 kubenswrapper[33572]: I1204 22:24:23.379245 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerDied","Data":"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e"} Dec 04 22:24:23.379689 master-0 kubenswrapper[33572]: I1204 22:24:23.379329 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerDied","Data":"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48"} Dec 04 22:24:23.379689 master-0 kubenswrapper[33572]: I1204 22:24:23.379365 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerDied","Data":"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f"} Dec 04 22:24:23.379689 master-0 kubenswrapper[33572]: I1204 22:24:23.379390 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerDied","Data":"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724"} Dec 04 22:24:23.379689 master-0 
kubenswrapper[33572]: I1204 22:24:23.379419 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerDied","Data":"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426"} Dec 04 22:24:23.379689 master-0 kubenswrapper[33572]: I1204 22:24:23.379457 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c1b7c3b1-3850-4a37-9a36-e84537557071","Type":"ContainerDied","Data":"5a7c26bd0a3ca94fcc0a15c70a4aff69934c912a02e82c8167d934151047373d"} Dec 04 22:24:23.425298 master-0 kubenswrapper[33572]: I1204 22:24:23.425259 33572 scope.go:117] "RemoveContainer" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" Dec 04 22:24:23.433479 master-0 kubenswrapper[33572]: I1204 22:24:23.433358 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.433327274 podStartE2EDuration="3.433327274s" podCreationTimestamp="2025-12-04 22:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:24:23.416916591 +0000 UTC m=+327.144442250" watchObservedRunningTime="2025-12-04 22:24:23.433327274 +0000 UTC m=+327.160852933" Dec 04 22:24:23.458584 master-0 kubenswrapper[33572]: I1204 22:24:23.458167 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:24:23.471642 master-0 kubenswrapper[33572]: I1204 22:24:23.471577 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:24:23.484250 master-0 kubenswrapper[33572]: I1204 22:24:23.484193 33572 scope.go:117] "RemoveContainer" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" Dec 04 22:24:23.487149 master-0 kubenswrapper[33572]: I1204 22:24:23.487100 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:24:23.487431 master-0 kubenswrapper[33572]: E1204 22:24:23.487405 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="init-config-reloader" Dec 04 22:24:23.487473 master-0 kubenswrapper[33572]: I1204 22:24:23.487440 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="init-config-reloader" Dec 04 22:24:23.487473 master-0 kubenswrapper[33572]: E1204 22:24:23.487464 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy" Dec 04 22:24:23.487473 master-0 kubenswrapper[33572]: I1204 22:24:23.487471 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy" Dec 04 22:24:23.487604 master-0 kubenswrapper[33572]: E1204 22:24:23.487494 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy-thanos" Dec 04 22:24:23.487604 master-0 kubenswrapper[33572]: I1204 22:24:23.487529 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy-thanos" Dec 04 22:24:23.487604 master-0 kubenswrapper[33572]: E1204 22:24:23.487543 33572 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy-web" Dec 04 22:24:23.487604 master-0 kubenswrapper[33572]: I1204 22:24:23.487550 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy-web" Dec 04 22:24:23.487604 master-0 kubenswrapper[33572]: E1204 22:24:23.487562 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="prometheus" Dec 04 22:24:23.487604 master-0 kubenswrapper[33572]: I1204 22:24:23.487569 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="prometheus" Dec 04 22:24:23.488754 master-0 kubenswrapper[33572]: E1204 22:24:23.488646 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="thanos-sidecar" Dec 04 22:24:23.488754 master-0 kubenswrapper[33572]: I1204 22:24:23.488746 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="thanos-sidecar" Dec 04 22:24:23.488837 master-0 kubenswrapper[33572]: E1204 22:24:23.488763 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="config-reloader" Dec 04 22:24:23.488837 master-0 kubenswrapper[33572]: I1204 22:24:23.488769 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="config-reloader" Dec 04 22:24:23.488964 master-0 kubenswrapper[33572]: I1204 22:24:23.488939 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy" Dec 04 22:24:23.489013 master-0 kubenswrapper[33572]: I1204 22:24:23.488996 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy-web" Dec 04 22:24:23.489047 master-0 kubenswrapper[33572]: I1204 22:24:23.489018 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="thanos-sidecar" Dec 04 22:24:23.489047 master-0 kubenswrapper[33572]: I1204 22:24:23.489028 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="prometheus" Dec 04 22:24:23.489105 master-0 kubenswrapper[33572]: I1204 22:24:23.489059 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="config-reloader" Dec 04 22:24:23.489105 master-0 kubenswrapper[33572]: I1204 22:24:23.489070 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" containerName="kube-rbac-proxy-thanos" Dec 04 22:24:23.494447 master-0 kubenswrapper[33572]: I1204 22:24:23.494301 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.497965 master-0 kubenswrapper[33572]: I1204 22:24:23.497913 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 04 22:24:23.498025 master-0 kubenswrapper[33572]: I1204 22:24:23.497978 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-98i4jt5uspsnd" Dec 04 22:24:23.498214 master-0 kubenswrapper[33572]: I1204 22:24:23.498188 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 04 22:24:23.498653 master-0 kubenswrapper[33572]: I1204 22:24:23.498592 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-s6gwp" Dec 04 22:24:23.498721 master-0 kubenswrapper[33572]: I1204 22:24:23.498695 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 04 22:24:23.498925 master-0 kubenswrapper[33572]: I1204 22:24:23.498897 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 04 22:24:23.499040 master-0 kubenswrapper[33572]: I1204 22:24:23.499015 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 04 22:24:23.499344 master-0 kubenswrapper[33572]: I1204 22:24:23.499306 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 04 22:24:23.499344 master-0 kubenswrapper[33572]: I1204 22:24:23.499340 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 04 22:24:23.499472 master-0 kubenswrapper[33572]: I1204 22:24:23.499405 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 04 22:24:23.504937 master-0 kubenswrapper[33572]: I1204 22:24:23.504876 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 04 22:24:23.507012 master-0 kubenswrapper[33572]: I1204 22:24:23.506971 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 04 22:24:23.510807 master-0 kubenswrapper[33572]: I1204 22:24:23.510761 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:24:23.510963 master-0 kubenswrapper[33572]: I1204 22:24:23.510925 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 04 22:24:23.511430 master-0 kubenswrapper[33572]: I1204 22:24:23.511367 33572 scope.go:117] "RemoveContainer" containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" Dec 04 22:24:23.539272 master-0 kubenswrapper[33572]: I1204 22:24:23.539128 33572 scope.go:117] "RemoveContainer" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" Dec 04 22:24:23.572102 master-0 kubenswrapper[33572]: I1204 22:24:23.572030 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572102 master-0 kubenswrapper[33572]: I1204 22:24:23.572094 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572346 master-0 kubenswrapper[33572]: I1204 22:24:23.572132 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572346 master-0 kubenswrapper[33572]: I1204 22:24:23.572160 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572346 master-0 kubenswrapper[33572]: I1204 22:24:23.572231 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4l4s\" (UniqueName: \"kubernetes.io/projected/7c874e55-3cdf-44cf-85a1-36bfadc33a31-kube-api-access-c4l4s\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572346 master-0 kubenswrapper[33572]: I1204 22:24:23.572283 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572346 master-0 kubenswrapper[33572]: I1204 22:24:23.572311 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572346 master-0 kubenswrapper[33572]: I1204 22:24:23.572339 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572535 master-0 kubenswrapper[33572]: I1204 22:24:23.572372 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572535 master-0 kubenswrapper[33572]: I1204 22:24:23.572404 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-web-config\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572535 master-0 kubenswrapper[33572]: I1204 22:24:23.572433 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572535 master-0 kubenswrapper[33572]: I1204 22:24:23.572473 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572535 master-0 kubenswrapper[33572]: I1204 22:24:23.572516 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572690 master-0 kubenswrapper[33572]: I1204 22:24:23.572567 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7c874e55-3cdf-44cf-85a1-36bfadc33a31-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572690 master-0 kubenswrapper[33572]: I1204 22:24:23.572614 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572690 master-0 kubenswrapper[33572]: I1204 22:24:23.572650 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572784 master-0 kubenswrapper[33572]: I1204 22:24:23.572704 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7c874e55-3cdf-44cf-85a1-36bfadc33a31-config-out\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.572784 master-0 kubenswrapper[33572]: I1204 22:24:23.572763 33572 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-config\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.574891 master-0 kubenswrapper[33572]: I1204 22:24:23.574861 33572 scope.go:117] "RemoveContainer" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" Dec 04 22:24:23.590959 master-0 kubenswrapper[33572]: I1204 22:24:23.590900 33572 scope.go:117] "RemoveContainer" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" Dec 04 22:24:23.603491 master-0 kubenswrapper[33572]: I1204 22:24:23.603445 33572 scope.go:117] "RemoveContainer" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" Dec 04 22:24:23.603914 master-0 kubenswrapper[33572]: E1204 22:24:23.603877 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": container with ID starting with c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4 not found: ID does not exist" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" Dec 04 22:24:23.603962 master-0 kubenswrapper[33572]: I1204 22:24:23.603917 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4"} err="failed to get container status \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": rpc error: code = NotFound desc = could not find container \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": container with ID starting with c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4 not found: ID does not exist" Dec 04 22:24:23.603962 master-0 kubenswrapper[33572]: I1204 22:24:23.603943 33572 scope.go:117] "RemoveContainer" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" Dec 04 22:24:23.604265 master-0 kubenswrapper[33572]: E1204 22:24:23.604231 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": container with ID starting with ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e not found: ID does not exist" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" Dec 04 22:24:23.604309 master-0 kubenswrapper[33572]: I1204 22:24:23.604281 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e"} err="failed to get container status \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": rpc error: code = NotFound desc = could not find container \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": container with ID starting with ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e not found: ID does not exist" Dec 04 22:24:23.604309 master-0 kubenswrapper[33572]: I1204 22:24:23.604300 33572 scope.go:117] "RemoveContainer" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" Dec 04 22:24:23.604658 master-0 kubenswrapper[33572]: E1204 22:24:23.604604 33572 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": container with ID starting with d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48 not found: ID does not exist" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" Dec 04 22:24:23.604707 master-0 kubenswrapper[33572]: I1204 22:24:23.604654 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48"} err="failed to get container status \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": rpc error: code = NotFound desc = could not find container \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": container with ID starting with d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48 not found: ID does not exist" Dec 04 22:24:23.604707 master-0 kubenswrapper[33572]: I1204 22:24:23.604673 33572 scope.go:117] "RemoveContainer" containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" Dec 04 22:24:23.604977 master-0 kubenswrapper[33572]: E1204 22:24:23.604944 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": container with ID starting with 07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f not found: ID does not exist" containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" Dec 04 22:24:23.605012 master-0 kubenswrapper[33572]: I1204 22:24:23.604975 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f"} err="failed to get container status \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": rpc error: code = NotFound desc = could not find container \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": container with ID starting with 07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f not found: ID does not exist" Dec 04 22:24:23.605047 master-0 kubenswrapper[33572]: I1204 22:24:23.605011 33572 scope.go:117] "RemoveContainer" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" Dec 04 22:24:23.605275 master-0 kubenswrapper[33572]: E1204 22:24:23.605239 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": container with ID starting with 00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724 not found: ID does not exist" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" Dec 04 22:24:23.605308 master-0 kubenswrapper[33572]: I1204 22:24:23.605272 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724"} err="failed to get container status \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": rpc error: code = NotFound desc = could not find container \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": container with ID starting with 00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724 not found: ID does not exist" Dec 04 22:24:23.605308 master-0 
kubenswrapper[33572]: I1204 22:24:23.605294 33572 scope.go:117] "RemoveContainer" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" Dec 04 22:24:23.605570 master-0 kubenswrapper[33572]: E1204 22:24:23.605545 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": container with ID starting with 49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426 not found: ID does not exist" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" Dec 04 22:24:23.605627 master-0 kubenswrapper[33572]: I1204 22:24:23.605572 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426"} err="failed to get container status \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": rpc error: code = NotFound desc = could not find container \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": container with ID starting with 49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426 not found: ID does not exist" Dec 04 22:24:23.605627 master-0 kubenswrapper[33572]: I1204 22:24:23.605593 33572 scope.go:117] "RemoveContainer" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" Dec 04 22:24:23.605897 master-0 kubenswrapper[33572]: E1204 22:24:23.605854 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": container with ID starting with a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8 not found: ID does not exist" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" Dec 04 22:24:23.605950 master-0 kubenswrapper[33572]: I1204 22:24:23.605909 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8"} err="failed to get container status \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": rpc error: code = NotFound desc = could not find container \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": container with ID starting with a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8 not found: ID does not exist" Dec 04 22:24:23.605950 master-0 kubenswrapper[33572]: I1204 22:24:23.605943 33572 scope.go:117] "RemoveContainer" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" Dec 04 22:24:23.606273 master-0 kubenswrapper[33572]: I1204 22:24:23.606235 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4"} err="failed to get container status \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": rpc error: code = NotFound desc = could not find container \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": container with ID starting with c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4 not found: ID does not exist" Dec 04 22:24:23.606273 master-0 kubenswrapper[33572]: I1204 22:24:23.606265 33572 scope.go:117] "RemoveContainer" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" Dec 04 22:24:23.606582 master-0 
kubenswrapper[33572]: I1204 22:24:23.606545 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e"} err="failed to get container status \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": rpc error: code = NotFound desc = could not find container \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": container with ID starting with ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e not found: ID does not exist" Dec 04 22:24:23.606582 master-0 kubenswrapper[33572]: I1204 22:24:23.606577 33572 scope.go:117] "RemoveContainer" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" Dec 04 22:24:23.606825 master-0 kubenswrapper[33572]: I1204 22:24:23.606803 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48"} err="failed to get container status \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": rpc error: code = NotFound desc = could not find container \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": container with ID starting with d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48 not found: ID does not exist" Dec 04 22:24:23.606825 master-0 kubenswrapper[33572]: I1204 22:24:23.606824 33572 scope.go:117] "RemoveContainer" containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" Dec 04 22:24:23.607066 master-0 kubenswrapper[33572]: I1204 22:24:23.607041 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f"} err="failed to get container status \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": rpc error: code = NotFound desc = could not find container \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": container with ID starting with 07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f not found: ID does not exist" Dec 04 22:24:23.607134 master-0 kubenswrapper[33572]: I1204 22:24:23.607066 33572 scope.go:117] "RemoveContainer" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" Dec 04 22:24:23.607347 master-0 kubenswrapper[33572]: I1204 22:24:23.607305 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724"} err="failed to get container status \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": rpc error: code = NotFound desc = could not find container \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": container with ID starting with 00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724 not found: ID does not exist" Dec 04 22:24:23.607347 master-0 kubenswrapper[33572]: I1204 22:24:23.607331 33572 scope.go:117] "RemoveContainer" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" Dec 04 22:24:23.607578 master-0 kubenswrapper[33572]: I1204 22:24:23.607551 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426"} err="failed to get container status \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": rpc error: code = NotFound desc = 
could not find container \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": container with ID starting with 49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426 not found: ID does not exist" Dec 04 22:24:23.607620 master-0 kubenswrapper[33572]: I1204 22:24:23.607579 33572 scope.go:117] "RemoveContainer" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" Dec 04 22:24:23.608066 master-0 kubenswrapper[33572]: I1204 22:24:23.608032 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8"} err="failed to get container status \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": rpc error: code = NotFound desc = could not find container \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": container with ID starting with a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8 not found: ID does not exist" Dec 04 22:24:23.608066 master-0 kubenswrapper[33572]: I1204 22:24:23.608057 33572 scope.go:117] "RemoveContainer" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" Dec 04 22:24:23.608333 master-0 kubenswrapper[33572]: I1204 22:24:23.608303 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4"} err="failed to get container status \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": rpc error: code = NotFound desc = could not find container \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": container with ID starting with c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4 not found: ID does not exist" Dec 04 22:24:23.608333 master-0 kubenswrapper[33572]: I1204 22:24:23.608325 33572 scope.go:117] "RemoveContainer" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" Dec 04 22:24:23.608604 master-0 kubenswrapper[33572]: I1204 22:24:23.608579 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e"} err="failed to get container status \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": rpc error: code = NotFound desc = could not find container \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": container with ID starting with ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e not found: ID does not exist" Dec 04 22:24:23.608654 master-0 kubenswrapper[33572]: I1204 22:24:23.608602 33572 scope.go:117] "RemoveContainer" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" Dec 04 22:24:23.608921 master-0 kubenswrapper[33572]: I1204 22:24:23.608893 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48"} err="failed to get container status \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": rpc error: code = NotFound desc = could not find container \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": container with ID starting with d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48 not found: ID does not exist" Dec 04 22:24:23.608921 master-0 kubenswrapper[33572]: I1204 22:24:23.608913 33572 scope.go:117] "RemoveContainer" 
containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" Dec 04 22:24:23.609137 master-0 kubenswrapper[33572]: I1204 22:24:23.609099 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f"} err="failed to get container status \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": rpc error: code = NotFound desc = could not find container \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": container with ID starting with 07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f not found: ID does not exist" Dec 04 22:24:23.609137 master-0 kubenswrapper[33572]: I1204 22:24:23.609129 33572 scope.go:117] "RemoveContainer" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" Dec 04 22:24:23.609370 master-0 kubenswrapper[33572]: I1204 22:24:23.609333 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724"} err="failed to get container status \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": rpc error: code = NotFound desc = could not find container \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": container with ID starting with 00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724 not found: ID does not exist" Dec 04 22:24:23.609370 master-0 kubenswrapper[33572]: I1204 22:24:23.609358 33572 scope.go:117] "RemoveContainer" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" Dec 04 22:24:23.609720 master-0 kubenswrapper[33572]: I1204 22:24:23.609682 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426"} err="failed to get container status \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": rpc error: code = NotFound desc = could not find container \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": container with ID starting with 49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426 not found: ID does not exist" Dec 04 22:24:23.609720 master-0 kubenswrapper[33572]: I1204 22:24:23.609711 33572 scope.go:117] "RemoveContainer" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" Dec 04 22:24:23.610011 master-0 kubenswrapper[33572]: I1204 22:24:23.609977 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8"} err="failed to get container status \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": rpc error: code = NotFound desc = could not find container \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": container with ID starting with a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8 not found: ID does not exist" Dec 04 22:24:23.610011 master-0 kubenswrapper[33572]: I1204 22:24:23.610006 33572 scope.go:117] "RemoveContainer" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" Dec 04 22:24:23.610285 master-0 kubenswrapper[33572]: I1204 22:24:23.610248 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4"} err="failed to get container 
status \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": rpc error: code = NotFound desc = could not find container \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": container with ID starting with c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4 not found: ID does not exist" Dec 04 22:24:23.610285 master-0 kubenswrapper[33572]: I1204 22:24:23.610278 33572 scope.go:117] "RemoveContainer" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" Dec 04 22:24:23.610788 master-0 kubenswrapper[33572]: I1204 22:24:23.610751 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e"} err="failed to get container status \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": rpc error: code = NotFound desc = could not find container \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": container with ID starting with ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e not found: ID does not exist" Dec 04 22:24:23.610788 master-0 kubenswrapper[33572]: I1204 22:24:23.610781 33572 scope.go:117] "RemoveContainer" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" Dec 04 22:24:23.611052 master-0 kubenswrapper[33572]: I1204 22:24:23.611019 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48"} err="failed to get container status \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": rpc error: code = NotFound desc = could not find container \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": container with ID starting with d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48 not found: ID does not exist" Dec 04 22:24:23.611052 master-0 kubenswrapper[33572]: I1204 22:24:23.611046 33572 scope.go:117] "RemoveContainer" containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" Dec 04 22:24:23.611318 master-0 kubenswrapper[33572]: I1204 22:24:23.611284 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f"} err="failed to get container status \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": rpc error: code = NotFound desc = could not find container \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": container with ID starting with 07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f not found: ID does not exist" Dec 04 22:24:23.611318 master-0 kubenswrapper[33572]: I1204 22:24:23.611309 33572 scope.go:117] "RemoveContainer" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" Dec 04 22:24:23.611591 master-0 kubenswrapper[33572]: I1204 22:24:23.611557 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724"} err="failed to get container status \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": rpc error: code = NotFound desc = could not find container \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": container with ID starting with 00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724 not found: ID does not exist" Dec 04 
22:24:23.611591 master-0 kubenswrapper[33572]: I1204 22:24:23.611583 33572 scope.go:117] "RemoveContainer" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" Dec 04 22:24:23.611820 master-0 kubenswrapper[33572]: I1204 22:24:23.611788 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426"} err="failed to get container status \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": rpc error: code = NotFound desc = could not find container \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": container with ID starting with 49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426 not found: ID does not exist" Dec 04 22:24:23.611820 master-0 kubenswrapper[33572]: I1204 22:24:23.611811 33572 scope.go:117] "RemoveContainer" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" Dec 04 22:24:23.612000 master-0 kubenswrapper[33572]: I1204 22:24:23.611974 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8"} err="failed to get container status \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": rpc error: code = NotFound desc = could not find container \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": container with ID starting with a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8 not found: ID does not exist" Dec 04 22:24:23.612035 master-0 kubenswrapper[33572]: I1204 22:24:23.611999 33572 scope.go:117] "RemoveContainer" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" Dec 04 22:24:23.612173 master-0 kubenswrapper[33572]: I1204 22:24:23.612150 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4"} err="failed to get container status \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": rpc error: code = NotFound desc = could not find container \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": container with ID starting with c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4 not found: ID does not exist" Dec 04 22:24:23.612173 master-0 kubenswrapper[33572]: I1204 22:24:23.612170 33572 scope.go:117] "RemoveContainer" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" Dec 04 22:24:23.612343 master-0 kubenswrapper[33572]: I1204 22:24:23.612320 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e"} err="failed to get container status \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": rpc error: code = NotFound desc = could not find container \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": container with ID starting with ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e not found: ID does not exist" Dec 04 22:24:23.612381 master-0 kubenswrapper[33572]: I1204 22:24:23.612341 33572 scope.go:117] "RemoveContainer" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" Dec 04 22:24:23.612546 master-0 kubenswrapper[33572]: I1204 22:24:23.612525 33572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48"} err="failed to get container status \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": rpc error: code = NotFound desc = could not find container \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": container with ID starting with d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48 not found: ID does not exist" Dec 04 22:24:23.612546 master-0 kubenswrapper[33572]: I1204 22:24:23.612543 33572 scope.go:117] "RemoveContainer" containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" Dec 04 22:24:23.612746 master-0 kubenswrapper[33572]: I1204 22:24:23.612721 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f"} err="failed to get container status \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": rpc error: code = NotFound desc = could not find container \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": container with ID starting with 07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f not found: ID does not exist" Dec 04 22:24:23.612784 master-0 kubenswrapper[33572]: I1204 22:24:23.612744 33572 scope.go:117] "RemoveContainer" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" Dec 04 22:24:23.612950 master-0 kubenswrapper[33572]: I1204 22:24:23.612922 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724"} err="failed to get container status \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": rpc error: code = NotFound desc = could not find container \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": container with ID starting with 00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724 not found: ID does not exist" Dec 04 22:24:23.612985 master-0 kubenswrapper[33572]: I1204 22:24:23.612949 33572 scope.go:117] "RemoveContainer" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" Dec 04 22:24:23.613145 master-0 kubenswrapper[33572]: I1204 22:24:23.613123 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426"} err="failed to get container status \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": rpc error: code = NotFound desc = could not find container \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": container with ID starting with 49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426 not found: ID does not exist" Dec 04 22:24:23.613145 master-0 kubenswrapper[33572]: I1204 22:24:23.613142 33572 scope.go:117] "RemoveContainer" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" Dec 04 22:24:23.613338 master-0 kubenswrapper[33572]: I1204 22:24:23.613313 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8"} err="failed to get container status \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": rpc error: code = NotFound desc = could not find container \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": container with ID 
starting with a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8 not found: ID does not exist" Dec 04 22:24:23.613338 master-0 kubenswrapper[33572]: I1204 22:24:23.613336 33572 scope.go:117] "RemoveContainer" containerID="c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4" Dec 04 22:24:23.613650 master-0 kubenswrapper[33572]: I1204 22:24:23.613571 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4"} err="failed to get container status \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": rpc error: code = NotFound desc = could not find container \"c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4\": container with ID starting with c905bcb39fb43d82649405c460735d040cfd8f98b92922b9bdde6c34b49617a4 not found: ID does not exist" Dec 04 22:24:23.613684 master-0 kubenswrapper[33572]: I1204 22:24:23.613651 33572 scope.go:117] "RemoveContainer" containerID="ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e" Dec 04 22:24:23.613867 master-0 kubenswrapper[33572]: I1204 22:24:23.613846 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e"} err="failed to get container status \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": rpc error: code = NotFound desc = could not find container \"ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e\": container with ID starting with ba15d55d72a8c9eb9e5ddaefbf7e51d2ac20c8d0d9bb5842137a1830cb0c781e not found: ID does not exist" Dec 04 22:24:23.613907 master-0 kubenswrapper[33572]: I1204 22:24:23.613868 33572 scope.go:117] "RemoveContainer" containerID="d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48" Dec 04 22:24:23.614066 master-0 kubenswrapper[33572]: I1204 22:24:23.614042 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48"} err="failed to get container status \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": rpc error: code = NotFound desc = could not find container \"d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48\": container with ID starting with d63d55de206903c98d0e7bc7531c9bc70a72b892ef1307a6adcbb583b3ddaa48 not found: ID does not exist" Dec 04 22:24:23.614107 master-0 kubenswrapper[33572]: I1204 22:24:23.614064 33572 scope.go:117] "RemoveContainer" containerID="07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f" Dec 04 22:24:23.614266 master-0 kubenswrapper[33572]: I1204 22:24:23.614238 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f"} err="failed to get container status \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": rpc error: code = NotFound desc = could not find container \"07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f\": container with ID starting with 07972316c6687db6220a8a1f7e5dcfe61f43310f787cd444e2ab783522b0ea9f not found: ID does not exist" Dec 04 22:24:23.614304 master-0 kubenswrapper[33572]: I1204 22:24:23.614265 33572 scope.go:117] "RemoveContainer" containerID="00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724" Dec 04 22:24:23.614523 master-0 
kubenswrapper[33572]: I1204 22:24:23.614485 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724"} err="failed to get container status \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": rpc error: code = NotFound desc = could not find container \"00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724\": container with ID starting with 00a6653fc9eb307af8bead59770e92974a7d28da54b724a34cde7d1c72acb724 not found: ID does not exist" Dec 04 22:24:23.614562 master-0 kubenswrapper[33572]: I1204 22:24:23.614523 33572 scope.go:117] "RemoveContainer" containerID="49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426" Dec 04 22:24:23.614752 master-0 kubenswrapper[33572]: I1204 22:24:23.614724 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426"} err="failed to get container status \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": rpc error: code = NotFound desc = could not find container \"49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426\": container with ID starting with 49181c383f32ce02423c29509c5b3d1ec978b1a85d513502f992f2ba9485f426 not found: ID does not exist" Dec 04 22:24:23.614789 master-0 kubenswrapper[33572]: I1204 22:24:23.614752 33572 scope.go:117] "RemoveContainer" containerID="a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8" Dec 04 22:24:23.614941 master-0 kubenswrapper[33572]: I1204 22:24:23.614919 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8"} err="failed to get container status \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": rpc error: code = NotFound desc = could not find container \"a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8\": container with ID starting with a77d43b618b372dda8199b9337a17b551291a253337b2047f62b9c57068b83b8 not found: ID does not exist" Dec 04 22:24:23.674321 master-0 kubenswrapper[33572]: I1204 22:24:23.674181 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.674321 master-0 kubenswrapper[33572]: I1204 22:24:23.674265 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.674638 master-0 kubenswrapper[33572]: I1204 22:24:23.674450 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.674638 master-0 kubenswrapper[33572]: I1204 22:24:23.674570 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c4l4s\" (UniqueName: \"kubernetes.io/projected/7c874e55-3cdf-44cf-85a1-36bfadc33a31-kube-api-access-c4l4s\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.674839 master-0 kubenswrapper[33572]: I1204 22:24:23.674799 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675160 master-0 kubenswrapper[33572]: I1204 22:24:23.674666 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675199 master-0 kubenswrapper[33572]: I1204 22:24:23.675179 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675244 master-0 kubenswrapper[33572]: I1204 22:24:23.675224 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675319 master-0 kubenswrapper[33572]: I1204 22:24:23.675293 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675356 master-0 kubenswrapper[33572]: I1204 22:24:23.675342 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-web-config\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675434 master-0 kubenswrapper[33572]: I1204 22:24:23.675409 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675482 master-0 kubenswrapper[33572]: I1204 22:24:23.675467 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 
22:24:23.675541 master-0 kubenswrapper[33572]: I1204 22:24:23.675491 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675573 master-0 kubenswrapper[33572]: I1204 22:24:23.675539 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7c874e55-3cdf-44cf-85a1-36bfadc33a31-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675573 master-0 kubenswrapper[33572]: I1204 22:24:23.675565 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675651 master-0 kubenswrapper[33572]: I1204 22:24:23.675624 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675701 master-0 kubenswrapper[33572]: I1204 22:24:23.675673 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675775 master-0 kubenswrapper[33572]: I1204 22:24:23.675745 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7c874e55-3cdf-44cf-85a1-36bfadc33a31-config-out\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675869 master-0 kubenswrapper[33572]: I1204 22:24:23.675840 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-config\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.675909 master-0 kubenswrapper[33572]: I1204 22:24:23.675879 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.676005 master-0 kubenswrapper[33572]: I1204 22:24:23.675979 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.678283 master-0 kubenswrapper[33572]: I1204 22:24:23.678251 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.681151 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-config\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.681205 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-web-config\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.681404 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.681516 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.681545 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.681833 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.685100 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7c874e55-3cdf-44cf-85a1-36bfadc33a31-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.685715 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/7c874e55-3cdf-44cf-85a1-36bfadc33a31-config-out\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.686858 master-0 kubenswrapper[33572]: I1204 22:24:23.686166 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.692561 master-0 kubenswrapper[33572]: I1204 22:24:23.692497 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7c874e55-3cdf-44cf-85a1-36bfadc33a31-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.694246 master-0 kubenswrapper[33572]: I1204 22:24:23.694205 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.694471 master-0 kubenswrapper[33572]: I1204 22:24:23.694421 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.697160 master-0 kubenswrapper[33572]: I1204 22:24:23.697092 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7c874e55-3cdf-44cf-85a1-36bfadc33a31-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.704550 master-0 kubenswrapper[33572]: I1204 22:24:23.703828 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4l4s\" (UniqueName: \"kubernetes.io/projected/7c874e55-3cdf-44cf-85a1-36bfadc33a31-kube-api-access-c4l4s\") pod \"prometheus-k8s-0\" (UID: \"7c874e55-3cdf-44cf-85a1-36bfadc33a31\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:23.791397 master-0 kubenswrapper[33572]: I1204 22:24:23.791297 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:23.791919 master-0 kubenswrapper[33572]: I1204 22:24:23.791856 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:23.800300 master-0 kubenswrapper[33572]: I1204 22:24:23.800211 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:23.821618 master-0 kubenswrapper[33572]: I1204 22:24:23.821537 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:24.362277 master-0 kubenswrapper[33572]: I1204 22:24:24.362186 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 04 22:24:24.366466 master-0 kubenswrapper[33572]: W1204 22:24:24.366383 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c874e55_3cdf_44cf_85a1_36bfadc33a31.slice/crio-371d3d241d82b31c04e10b4d9658bf383f6311bd0c1ac14f9eca775eec771a1b WatchSource:0}: Error finding container 371d3d241d82b31c04e10b4d9658bf383f6311bd0c1ac14f9eca775eec771a1b: Status 404 returned error can't find the container with id 371d3d241d82b31c04e10b4d9658bf383f6311bd0c1ac14f9eca775eec771a1b Dec 04 22:24:24.396876 master-0 kubenswrapper[33572]: I1204 22:24:24.396779 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7c874e55-3cdf-44cf-85a1-36bfadc33a31","Type":"ContainerStarted","Data":"371d3d241d82b31c04e10b4d9658bf383f6311bd0c1ac14f9eca775eec771a1b"} Dec 04 22:24:24.403358 master-0 kubenswrapper[33572]: I1204 22:24:24.403299 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:24:24.510741 master-0 kubenswrapper[33572]: I1204 22:24:24.504618 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-744594955b-qspk5"] Dec 04 22:24:24.546741 master-0 kubenswrapper[33572]: I1204 22:24:24.546632 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b7c3b1-3850-4a37-9a36-e84537557071" path="/var/lib/kubelet/pods/c1b7c3b1-3850-4a37-9a36-e84537557071/volumes" Dec 04 22:24:25.409543 master-0 kubenswrapper[33572]: I1204 22:24:25.409431 33572 generic.go:334] "Generic (PLEG): container finished" podID="7c874e55-3cdf-44cf-85a1-36bfadc33a31" containerID="8c134559c5523aa99201d81b316d0d195133c666f1a9f96492068c758681887c" exitCode=0 Dec 04 22:24:25.409543 master-0 kubenswrapper[33572]: I1204 22:24:25.409495 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7c874e55-3cdf-44cf-85a1-36bfadc33a31","Type":"ContainerDied","Data":"8c134559c5523aa99201d81b316d0d195133c666f1a9f96492068c758681887c"} Dec 04 22:24:26.340691 master-0 kubenswrapper[33572]: I1204 22:24:26.340451 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:24:26.423555 master-0 kubenswrapper[33572]: I1204 22:24:26.423472 33572 generic.go:334] "Generic (PLEG): container finished" podID="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" containerID="8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c" exitCode=0 Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.423579 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" event={"ID":"72679051-6a4b-4991-85c4-e5d2cbbc6ed7","Type":"ContainerDied","Data":"8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c"} Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.423611 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" event={"ID":"72679051-6a4b-4991-85c4-e5d2cbbc6ed7","Type":"ContainerDied","Data":"8688dbdb4594010fcb7f8a4ca909c4672a49bcb4f5b75dd3a827d0833c1ed0fe"} Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.423614 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-55c77559c8-g74sm" Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.423630 33572 scope.go:117] "RemoveContainer" containerID="8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c" Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.424380 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") pod \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.424572 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") pod \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.424643 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log\") pod \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.424689 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") pod \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.424914 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") pod \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.424972 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w82st\" (UniqueName: \"kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st\") pod \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " Dec 04 22:24:26.426896 master-0 kubenswrapper[33572]: I1204 22:24:26.425054 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") pod \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\" (UID: \"72679051-6a4b-4991-85c4-e5d2cbbc6ed7\") " Dec 04 22:24:26.428134 master-0 kubenswrapper[33572]: I1204 22:24:26.427580 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "72679051-6a4b-4991-85c4-e5d2cbbc6ed7" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:26.428670 master-0 kubenswrapper[33572]: I1204 22:24:26.428523 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "72679051-6a4b-4991-85c4-e5d2cbbc6ed7" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:26.429377 master-0 kubenswrapper[33572]: I1204 22:24:26.429324 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log" (OuterVolumeSpecName: "audit-log") pod "72679051-6a4b-4991-85c4-e5d2cbbc6ed7" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:24:26.429469 master-0 kubenswrapper[33572]: I1204 22:24:26.429369 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "72679051-6a4b-4991-85c4-e5d2cbbc6ed7" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:26.429816 master-0 kubenswrapper[33572]: I1204 22:24:26.429776 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "72679051-6a4b-4991-85c4-e5d2cbbc6ed7" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:26.432626 master-0 kubenswrapper[33572]: I1204 22:24:26.432578 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st" (OuterVolumeSpecName: "kube-api-access-w82st") pod "72679051-6a4b-4991-85c4-e5d2cbbc6ed7" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7"). InnerVolumeSpecName "kube-api-access-w82st". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:26.436370 master-0 kubenswrapper[33572]: I1204 22:24:26.435217 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "72679051-6a4b-4991-85c4-e5d2cbbc6ed7" (UID: "72679051-6a4b-4991-85c4-e5d2cbbc6ed7"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:26.437647 master-0 kubenswrapper[33572]: I1204 22:24:26.437545 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7c874e55-3cdf-44cf-85a1-36bfadc33a31","Type":"ContainerStarted","Data":"fdc1fc03d2c9f2b02476609c61829b515db2f62b08380cefc79f5ae078f6b10a"} Dec 04 22:24:26.437647 master-0 kubenswrapper[33572]: I1204 22:24:26.437579 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7c874e55-3cdf-44cf-85a1-36bfadc33a31","Type":"ContainerStarted","Data":"907b145264a5cdfdc3f7331b1e93749e53e57bf4261f47f6d0799c9182ae53e2"} Dec 04 22:24:26.437647 master-0 kubenswrapper[33572]: I1204 22:24:26.437589 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7c874e55-3cdf-44cf-85a1-36bfadc33a31","Type":"ContainerStarted","Data":"4136517ce274e82373329f671258fe6d5f891d272813cd8bd76214407099e430"} Dec 04 22:24:26.437647 master-0 kubenswrapper[33572]: I1204 22:24:26.437598 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7c874e55-3cdf-44cf-85a1-36bfadc33a31","Type":"ContainerStarted","Data":"f729704421a4f2505275baad9089bee817300af9da6dd2b3d82b7e0bbf3c9869"} Dec 04 22:24:26.437647 master-0 kubenswrapper[33572]: I1204 22:24:26.437608 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7c874e55-3cdf-44cf-85a1-36bfadc33a31","Type":"ContainerStarted","Data":"886fad862c47886831fede32a354d4e747d283c29d6d97cc0ba341b2f120a8f8"} Dec 04 22:24:26.502950 master-0 kubenswrapper[33572]: I1204 22:24:26.502163 33572 scope.go:117] "RemoveContainer" containerID="8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c" Dec 04 22:24:26.503736 master-0 kubenswrapper[33572]: E1204 22:24:26.503674 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c\": container with ID starting with 8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c not found: ID does not exist" containerID="8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c" Dec 04 22:24:26.503843 master-0 kubenswrapper[33572]: I1204 22:24:26.503750 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c"} err="failed to get container status \"8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c\": rpc error: code = NotFound desc = could not find container \"8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c\": container with ID starting with 8b77d62e5f0868c5194ca5becac825b8747fa3012bd686856746667bdd18a36c not found: ID does not exist" Dec 04 22:24:26.529705 master-0 kubenswrapper[33572]: I1204 22:24:26.527433 33572 
reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:26.529705 master-0 kubenswrapper[33572]: I1204 22:24:26.527692 33572 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:26.529705 master-0 kubenswrapper[33572]: I1204 22:24:26.527720 33572 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-audit-log\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:26.529705 master-0 kubenswrapper[33572]: I1204 22:24:26.527744 33572 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:26.529705 master-0 kubenswrapper[33572]: I1204 22:24:26.527763 33572 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:26.529705 master-0 kubenswrapper[33572]: I1204 22:24:26.527786 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w82st\" (UniqueName: \"kubernetes.io/projected/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-kube-api-access-w82st\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:26.529705 master-0 kubenswrapper[33572]: I1204 22:24:26.527804 33572 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72679051-6a4b-4991-85c4-e5d2cbbc6ed7-client-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:26.758614 master-0 kubenswrapper[33572]: I1204 22:24:26.758357 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-55c77559c8-g74sm"] Dec 04 22:24:26.765275 master-0 kubenswrapper[33572]: I1204 22:24:26.765197 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-55c77559c8-g74sm"] Dec 04 22:24:27.456433 master-0 kubenswrapper[33572]: I1204 22:24:27.456332 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7c874e55-3cdf-44cf-85a1-36bfadc33a31","Type":"ContainerStarted","Data":"d4a15be1115c356986585ade7a2e3d7307a8e722bef457fff550c8100a634368"} Dec 04 22:24:27.502358 master-0 kubenswrapper[33572]: I1204 22:24:27.502255 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.502232838 podStartE2EDuration="4.502232838s" podCreationTimestamp="2025-12-04 22:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:24:27.496810878 +0000 UTC m=+331.224336537" watchObservedRunningTime="2025-12-04 22:24:27.502232838 +0000 UTC m=+331.229758487" Dec 04 22:24:28.541829 master-0 kubenswrapper[33572]: I1204 22:24:28.541732 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" 
path="/var/lib/kubelet/pods/72679051-6a4b-4991-85c4-e5d2cbbc6ed7/volumes" Dec 04 22:24:28.822867 master-0 kubenswrapper[33572]: I1204 22:24:28.822681 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:24:44.968408 master-0 kubenswrapper[33572]: I1204 22:24:44.968281 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f9495c789-qq8pz"] Dec 04 22:24:44.970047 master-0 kubenswrapper[33572]: E1204 22:24:44.968600 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" containerName="metrics-server" Dec 04 22:24:44.970047 master-0 kubenswrapper[33572]: I1204 22:24:44.968613 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" containerName="metrics-server" Dec 04 22:24:44.970047 master-0 kubenswrapper[33572]: I1204 22:24:44.968766 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="72679051-6a4b-4991-85c4-e5d2cbbc6ed7" containerName="metrics-server" Dec 04 22:24:44.970047 master-0 kubenswrapper[33572]: I1204 22:24:44.969214 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:44.986309 master-0 kubenswrapper[33572]: I1204 22:24:44.986253 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f9495c789-qq8pz"] Dec 04 22:24:45.075082 master-0 kubenswrapper[33572]: I1204 22:24:45.075001 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-serving-cert\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.075298 master-0 kubenswrapper[33572]: I1204 22:24:45.075112 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-trusted-ca-bundle\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.075298 master-0 kubenswrapper[33572]: I1204 22:24:45.075231 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-config\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.075298 master-0 kubenswrapper[33572]: I1204 22:24:45.075261 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-oauth-config\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.075440 master-0 kubenswrapper[33572]: I1204 22:24:45.075396 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-service-ca\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " 
pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.075520 master-0 kubenswrapper[33572]: I1204 22:24:45.075483 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lnss\" (UniqueName: \"kubernetes.io/projected/c0072c22-7f9a-4a92-9b46-ca145709c1bb-kube-api-access-8lnss\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.075564 master-0 kubenswrapper[33572]: I1204 22:24:45.075537 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-oauth-serving-cert\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.178018 master-0 kubenswrapper[33572]: I1204 22:24:45.177942 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-service-ca\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.178306 master-0 kubenswrapper[33572]: I1204 22:24:45.178199 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lnss\" (UniqueName: \"kubernetes.io/projected/c0072c22-7f9a-4a92-9b46-ca145709c1bb-kube-api-access-8lnss\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.178306 master-0 kubenswrapper[33572]: I1204 22:24:45.178256 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-oauth-serving-cert\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.178456 master-0 kubenswrapper[33572]: I1204 22:24:45.178360 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-serving-cert\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.178456 master-0 kubenswrapper[33572]: I1204 22:24:45.178447 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-trusted-ca-bundle\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.178794 master-0 kubenswrapper[33572]: I1204 22:24:45.178720 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-config\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.179040 master-0 kubenswrapper[33572]: I1204 22:24:45.178987 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-oauth-config\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.180672 master-0 kubenswrapper[33572]: I1204 22:24:45.179656 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-service-ca\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.180672 master-0 kubenswrapper[33572]: I1204 22:24:45.179956 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-oauth-serving-cert\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.180672 master-0 kubenswrapper[33572]: I1204 22:24:45.180610 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-config\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.182309 master-0 kubenswrapper[33572]: I1204 22:24:45.182244 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-trusted-ca-bundle\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.183363 master-0 kubenswrapper[33572]: I1204 22:24:45.183303 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-serving-cert\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.183739 master-0 kubenswrapper[33572]: I1204 22:24:45.183693 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-oauth-config\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.215786 master-0 kubenswrapper[33572]: I1204 22:24:45.215716 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lnss\" (UniqueName: \"kubernetes.io/projected/c0072c22-7f9a-4a92-9b46-ca145709c1bb-kube-api-access-8lnss\") pod \"console-7f9495c789-qq8pz\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.299020 master-0 kubenswrapper[33572]: I1204 22:24:45.298946 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:45.782964 master-0 kubenswrapper[33572]: I1204 22:24:45.782911 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f9495c789-qq8pz"] Dec 04 22:24:45.791258 master-0 kubenswrapper[33572]: W1204 22:24:45.791161 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0072c22_7f9a_4a92_9b46_ca145709c1bb.slice/crio-1505b514fb1a65c520eafd05e8b5a96e39ef0a58c6b6cd6d08df0fc2c3f47624 WatchSource:0}: Error finding container 1505b514fb1a65c520eafd05e8b5a96e39ef0a58c6b6cd6d08df0fc2c3f47624: Status 404 returned error can't find the container with id 1505b514fb1a65c520eafd05e8b5a96e39ef0a58c6b6cd6d08df0fc2c3f47624 Dec 04 22:24:46.654660 master-0 kubenswrapper[33572]: I1204 22:24:46.654579 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f9495c789-qq8pz" event={"ID":"c0072c22-7f9a-4a92-9b46-ca145709c1bb","Type":"ContainerStarted","Data":"6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8"} Dec 04 22:24:46.654660 master-0 kubenswrapper[33572]: I1204 22:24:46.654660 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f9495c789-qq8pz" event={"ID":"c0072c22-7f9a-4a92-9b46-ca145709c1bb","Type":"ContainerStarted","Data":"1505b514fb1a65c520eafd05e8b5a96e39ef0a58c6b6cd6d08df0fc2c3f47624"} Dec 04 22:24:46.690790 master-0 kubenswrapper[33572]: I1204 22:24:46.687857 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f9495c789-qq8pz" podStartSLOduration=2.687773713 podStartE2EDuration="2.687773713s" podCreationTimestamp="2025-12-04 22:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:24:46.685056858 +0000 UTC m=+350.412582547" watchObservedRunningTime="2025-12-04 22:24:46.687773713 +0000 UTC m=+350.415299412" Dec 04 22:24:49.564231 master-0 kubenswrapper[33572]: I1204 22:24:49.564074 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-744594955b-qspk5" podUID="502c032f-d279-41ec-ac38-16bf4b9b1950" containerName="console" containerID="cri-o://960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1" gracePeriod=15 Dec 04 22:24:50.201785 master-0 kubenswrapper[33572]: I1204 22:24:50.200673 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-744594955b-qspk5_502c032f-d279-41ec-ac38-16bf4b9b1950/console/0.log" Dec 04 22:24:50.201785 master-0 kubenswrapper[33572]: I1204 22:24:50.200743 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-744594955b-qspk5" Dec 04 22:24:50.277272 master-0 kubenswrapper[33572]: I1204 22:24:50.277184 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-oauth-config\") pod \"502c032f-d279-41ec-ac38-16bf4b9b1950\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " Dec 04 22:24:50.277272 master-0 kubenswrapper[33572]: I1204 22:24:50.277254 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-console-config\") pod \"502c032f-d279-41ec-ac38-16bf4b9b1950\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " Dec 04 22:24:50.277272 master-0 kubenswrapper[33572]: I1204 22:24:50.277284 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-oauth-serving-cert\") pod \"502c032f-d279-41ec-ac38-16bf4b9b1950\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " Dec 04 22:24:50.277931 master-0 kubenswrapper[33572]: I1204 22:24:50.277306 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-serving-cert\") pod \"502c032f-d279-41ec-ac38-16bf4b9b1950\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " Dec 04 22:24:50.277931 master-0 kubenswrapper[33572]: I1204 22:24:50.277350 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-trusted-ca-bundle\") pod \"502c032f-d279-41ec-ac38-16bf4b9b1950\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " Dec 04 22:24:50.277931 master-0 kubenswrapper[33572]: I1204 22:24:50.277412 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-service-ca\") pod \"502c032f-d279-41ec-ac38-16bf4b9b1950\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " Dec 04 22:24:50.277931 master-0 kubenswrapper[33572]: I1204 22:24:50.277776 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7njp\" (UniqueName: \"kubernetes.io/projected/502c032f-d279-41ec-ac38-16bf4b9b1950-kube-api-access-l7njp\") pod \"502c032f-d279-41ec-ac38-16bf4b9b1950\" (UID: \"502c032f-d279-41ec-ac38-16bf4b9b1950\") " Dec 04 22:24:50.278643 master-0 kubenswrapper[33572]: I1204 22:24:50.278585 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "502c032f-d279-41ec-ac38-16bf4b9b1950" (UID: "502c032f-d279-41ec-ac38-16bf4b9b1950"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:50.278643 master-0 kubenswrapper[33572]: I1204 22:24:50.278610 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-service-ca" (OuterVolumeSpecName: "service-ca") pod "502c032f-d279-41ec-ac38-16bf4b9b1950" (UID: "502c032f-d279-41ec-ac38-16bf4b9b1950"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:50.279099 master-0 kubenswrapper[33572]: I1204 22:24:50.279052 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-console-config" (OuterVolumeSpecName: "console-config") pod "502c032f-d279-41ec-ac38-16bf4b9b1950" (UID: "502c032f-d279-41ec-ac38-16bf4b9b1950"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:50.279533 master-0 kubenswrapper[33572]: I1204 22:24:50.279446 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "502c032f-d279-41ec-ac38-16bf4b9b1950" (UID: "502c032f-d279-41ec-ac38-16bf4b9b1950"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:24:50.281871 master-0 kubenswrapper[33572]: I1204 22:24:50.281818 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "502c032f-d279-41ec-ac38-16bf4b9b1950" (UID: "502c032f-d279-41ec-ac38-16bf4b9b1950"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:50.282016 master-0 kubenswrapper[33572]: I1204 22:24:50.281884 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502c032f-d279-41ec-ac38-16bf4b9b1950-kube-api-access-l7njp" (OuterVolumeSpecName: "kube-api-access-l7njp") pod "502c032f-d279-41ec-ac38-16bf4b9b1950" (UID: "502c032f-d279-41ec-ac38-16bf4b9b1950"). InnerVolumeSpecName "kube-api-access-l7njp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:24:50.290965 master-0 kubenswrapper[33572]: I1204 22:24:50.290867 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "502c032f-d279-41ec-ac38-16bf4b9b1950" (UID: "502c032f-d279-41ec-ac38-16bf4b9b1950"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:24:50.379872 master-0 kubenswrapper[33572]: I1204 22:24:50.379702 33572 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:50.379872 master-0 kubenswrapper[33572]: I1204 22:24:50.379767 33572 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-console-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:50.379872 master-0 kubenswrapper[33572]: I1204 22:24:50.379788 33572 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:50.379872 master-0 kubenswrapper[33572]: I1204 22:24:50.379806 33572 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/502c032f-d279-41ec-ac38-16bf4b9b1950-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:50.379872 master-0 kubenswrapper[33572]: I1204 22:24:50.379826 33572 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:50.379872 master-0 kubenswrapper[33572]: I1204 22:24:50.379846 33572 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/502c032f-d279-41ec-ac38-16bf4b9b1950-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:50.379872 master-0 kubenswrapper[33572]: I1204 22:24:50.379865 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7njp\" (UniqueName: \"kubernetes.io/projected/502c032f-d279-41ec-ac38-16bf4b9b1950-kube-api-access-l7njp\") on node \"master-0\" DevicePath \"\"" Dec 04 22:24:50.693965 master-0 kubenswrapper[33572]: I1204 22:24:50.693654 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-744594955b-qspk5_502c032f-d279-41ec-ac38-16bf4b9b1950/console/0.log" Dec 04 22:24:50.693965 master-0 kubenswrapper[33572]: I1204 22:24:50.693763 33572 generic.go:334] "Generic (PLEG): container finished" podID="502c032f-d279-41ec-ac38-16bf4b9b1950" containerID="960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1" exitCode=2 Dec 04 22:24:50.693965 master-0 kubenswrapper[33572]: I1204 22:24:50.693823 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744594955b-qspk5" event={"ID":"502c032f-d279-41ec-ac38-16bf4b9b1950","Type":"ContainerDied","Data":"960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1"} Dec 04 22:24:50.693965 master-0 kubenswrapper[33572]: I1204 22:24:50.693938 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-744594955b-qspk5" event={"ID":"502c032f-d279-41ec-ac38-16bf4b9b1950","Type":"ContainerDied","Data":"32914af44db102fd411cf24611da49ac9c5321501bc0160bc563aa53daf0132f"} Dec 04 22:24:50.693965 master-0 kubenswrapper[33572]: I1204 22:24:50.693972 33572 scope.go:117] "RemoveContainer" containerID="960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1" Dec 04 22:24:50.695823 master-0 kubenswrapper[33572]: I1204 22:24:50.694173 33572 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-744594955b-qspk5" Dec 04 22:24:50.722213 master-0 kubenswrapper[33572]: I1204 22:24:50.722161 33572 scope.go:117] "RemoveContainer" containerID="960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1" Dec 04 22:24:50.722901 master-0 kubenswrapper[33572]: E1204 22:24:50.722800 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1\": container with ID starting with 960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1 not found: ID does not exist" containerID="960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1" Dec 04 22:24:50.722901 master-0 kubenswrapper[33572]: I1204 22:24:50.722858 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1"} err="failed to get container status \"960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1\": rpc error: code = NotFound desc = could not find container \"960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1\": container with ID starting with 960621942f2be5b0bf42a1b14f287cbf59ec604162f9b2697454dc19b2a9f0d1 not found: ID does not exist" Dec 04 22:24:50.748238 master-0 kubenswrapper[33572]: I1204 22:24:50.748171 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-744594955b-qspk5"] Dec 04 22:24:50.758686 master-0 kubenswrapper[33572]: I1204 22:24:50.758633 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-744594955b-qspk5"] Dec 04 22:24:50.799068 master-0 kubenswrapper[33572]: I1204 22:24:50.798992 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6cfff4b945-wlg4k_1f9552e4-fda9-4207-a4ed-a0486fd1018e/oauth-openshift/0.log" Dec 04 22:24:51.003743 master-0 kubenswrapper[33572]: I1204 22:24:51.003634 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-bm2pk_f893663c-7c1e-4eda-9839-99c1c0440304/authentication-operator/2.log" Dec 04 22:24:51.202051 master-0 kubenswrapper[33572]: I1204 22:24:51.201958 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-bm2pk_f893663c-7c1e-4eda-9839-99c1c0440304/authentication-operator/3.log" Dec 04 22:24:51.400154 master-0 kubenswrapper[33572]: I1204 22:24:51.400086 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5465c8b4db-8vm66_c178afcf-b713-4c74-b22b-6169ba3123f5/router/4.log" Dec 04 22:24:51.592295 master-0 kubenswrapper[33572]: I1204 22:24:51.592226 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5465c8b4db-8vm66_c178afcf-b713-4c74-b22b-6169ba3123f5/router/5.log" Dec 04 22:24:51.788652 master-0 kubenswrapper[33572]: I1204 22:24:51.788540 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-58574fc8d8-gg42x_989a73ce-3898-4f65-a437-2c7061f9375f/fix-audit-permissions/0.log" Dec 04 22:24:51.998791 master-0 kubenswrapper[33572]: I1204 22:24:51.998713 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-oauth-apiserver_apiserver-58574fc8d8-gg42x_989a73ce-3898-4f65-a437-2c7061f9375f/oauth-apiserver/0.log" Dec 04 22:24:52.541938 master-0 kubenswrapper[33572]: I1204 22:24:52.541818 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502c032f-d279-41ec-ac38-16bf4b9b1950" path="/var/lib/kubelet/pods/502c032f-d279-41ec-ac38-16bf4b9b1950/volumes" Dec 04 22:24:55.299876 master-0 kubenswrapper[33572]: I1204 22:24:55.299807 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:55.300913 master-0 kubenswrapper[33572]: I1204 22:24:55.300727 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:55.307327 master-0 kubenswrapper[33572]: I1204 22:24:55.307252 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:55.750819 master-0 kubenswrapper[33572]: I1204 22:24:55.750673 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:24:55.855881 master-0 kubenswrapper[33572]: I1204 22:24:55.855809 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d6857f96b-g7j6m"] Dec 04 22:25:08.556936 master-0 kubenswrapper[33572]: I1204 22:25:08.556861 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s"] Dec 04 22:25:08.558043 master-0 kubenswrapper[33572]: E1204 22:25:08.557195 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502c032f-d279-41ec-ac38-16bf4b9b1950" containerName="console" Dec 04 22:25:08.558043 master-0 kubenswrapper[33572]: I1204 22:25:08.557212 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="502c032f-d279-41ec-ac38-16bf4b9b1950" containerName="console" Dec 04 22:25:08.558043 master-0 kubenswrapper[33572]: I1204 22:25:08.557376 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="502c032f-d279-41ec-ac38-16bf4b9b1950" containerName="console" Dec 04 22:25:08.558043 master-0 kubenswrapper[33572]: I1204 22:25:08.558018 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s"] Dec 04 22:25:08.558237 master-0 kubenswrapper[33572]: I1204 22:25:08.558126 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" Dec 04 22:25:08.562707 master-0 kubenswrapper[33572]: I1204 22:25:08.562427 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 22:25:08.563880 master-0 kubenswrapper[33572]: I1204 22:25:08.563839 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 22:25:08.619267 master-0 kubenswrapper[33572]: I1204 22:25:08.607695 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8260d9f8-c5d5-4ed0-9896-486e34d29486-networking-console-plugin-cert\") pod \"networking-console-plugin-7d45bf9455-kqq2s\" (UID: \"8260d9f8-c5d5-4ed0-9896-486e34d29486\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" Dec 04 22:25:08.619267 master-0 kubenswrapper[33572]: I1204 22:25:08.613494 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8260d9f8-c5d5-4ed0-9896-486e34d29486-nginx-conf\") pod \"networking-console-plugin-7d45bf9455-kqq2s\" (UID: \"8260d9f8-c5d5-4ed0-9896-486e34d29486\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" Dec 04 22:25:08.680599 master-0 kubenswrapper[33572]: I1204 22:25:08.680521 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64b5bcd658-ztwxm"] Dec 04 22:25:08.682130 master-0 kubenswrapper[33572]: I1204 22:25:08.682091 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.700274 master-0 kubenswrapper[33572]: I1204 22:25:08.698652 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b5bcd658-ztwxm"] Dec 04 22:25:08.715775 master-0 kubenswrapper[33572]: I1204 22:25:08.714647 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8260d9f8-c5d5-4ed0-9896-486e34d29486-networking-console-plugin-cert\") pod \"networking-console-plugin-7d45bf9455-kqq2s\" (UID: \"8260d9f8-c5d5-4ed0-9896-486e34d29486\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" Dec 04 22:25:08.715775 master-0 kubenswrapper[33572]: I1204 22:25:08.714709 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8260d9f8-c5d5-4ed0-9896-486e34d29486-nginx-conf\") pod \"networking-console-plugin-7d45bf9455-kqq2s\" (UID: \"8260d9f8-c5d5-4ed0-9896-486e34d29486\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" Dec 04 22:25:08.716183 master-0 kubenswrapper[33572]: I1204 22:25:08.716151 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8260d9f8-c5d5-4ed0-9896-486e34d29486-nginx-conf\") pod \"networking-console-plugin-7d45bf9455-kqq2s\" (UID: \"8260d9f8-c5d5-4ed0-9896-486e34d29486\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" Dec 04 22:25:08.735425 master-0 kubenswrapper[33572]: I1204 22:25:08.735290 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/8260d9f8-c5d5-4ed0-9896-486e34d29486-networking-console-plugin-cert\") pod \"networking-console-plugin-7d45bf9455-kqq2s\" (UID: \"8260d9f8-c5d5-4ed0-9896-486e34d29486\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" Dec 04 22:25:08.816974 master-0 kubenswrapper[33572]: I1204 22:25:08.816737 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-serving-cert\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.816974 master-0 kubenswrapper[33572]: I1204 22:25:08.816823 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-oauth-serving-cert\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.816974 master-0 kubenswrapper[33572]: I1204 22:25:08.816854 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-service-ca\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.816974 master-0 kubenswrapper[33572]: I1204 22:25:08.816876 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-trusted-ca-bundle\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.816974 master-0 kubenswrapper[33572]: I1204 22:25:08.816926 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-config\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.816974 master-0 kubenswrapper[33572]: I1204 22:25:08.816941 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-oauth-config\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.816974 master-0 kubenswrapper[33572]: I1204 22:25:08.816974 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7cg2\" (UniqueName: \"kubernetes.io/projected/da9b6cb4-13db-408a-8a53-c03f64ccea5a-kube-api-access-t7cg2\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.906338 master-0 kubenswrapper[33572]: I1204 22:25:08.906232 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" Dec 04 22:25:08.919250 master-0 kubenswrapper[33572]: I1204 22:25:08.919172 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-serving-cert\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.919250 master-0 kubenswrapper[33572]: I1204 22:25:08.919246 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-oauth-serving-cert\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.919754 master-0 kubenswrapper[33572]: I1204 22:25:08.919573 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-service-ca\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.919754 master-0 kubenswrapper[33572]: I1204 22:25:08.919711 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-trusted-ca-bundle\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.920062 master-0 kubenswrapper[33572]: I1204 22:25:08.919994 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-config\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.920137 master-0 kubenswrapper[33572]: I1204 22:25:08.920095 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-oauth-config\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.920312 master-0 kubenswrapper[33572]: I1204 22:25:08.920271 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7cg2\" (UniqueName: \"kubernetes.io/projected/da9b6cb4-13db-408a-8a53-c03f64ccea5a-kube-api-access-t7cg2\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.920969 master-0 kubenswrapper[33572]: I1204 22:25:08.920600 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-oauth-serving-cert\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.921280 master-0 kubenswrapper[33572]: I1204 22:25:08.921222 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-service-ca\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.923117 master-0 kubenswrapper[33572]: I1204 22:25:08.922899 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-config\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.923371 master-0 kubenswrapper[33572]: I1204 22:25:08.923324 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-trusted-ca-bundle\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.925247 master-0 kubenswrapper[33572]: I1204 22:25:08.925198 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-oauth-config\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.927021 master-0 kubenswrapper[33572]: I1204 22:25:08.926947 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-serving-cert\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:08.945832 master-0 kubenswrapper[33572]: I1204 22:25:08.945767 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7cg2\" (UniqueName: \"kubernetes.io/projected/da9b6cb4-13db-408a-8a53-c03f64ccea5a-kube-api-access-t7cg2\") pod \"console-64b5bcd658-ztwxm\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:09.020257 master-0 kubenswrapper[33572]: I1204 22:25:09.020193 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:09.368869 master-0 kubenswrapper[33572]: I1204 22:25:09.368828 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s"] Dec 04 22:25:09.371014 master-0 kubenswrapper[33572]: W1204 22:25:09.370950 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8260d9f8_c5d5_4ed0_9896_486e34d29486.slice/crio-4a8a480839924b6df7c0657567d6874e5f9336d85ae9516fa61ede764b443987 WatchSource:0}: Error finding container 4a8a480839924b6df7c0657567d6874e5f9336d85ae9516fa61ede764b443987: Status 404 returned error can't find the container with id 4a8a480839924b6df7c0657567d6874e5f9336d85ae9516fa61ede764b443987 Dec 04 22:25:09.372868 master-0 kubenswrapper[33572]: I1204 22:25:09.372849 33572 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 22:25:09.513857 master-0 kubenswrapper[33572]: I1204 22:25:09.513770 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b5bcd658-ztwxm"] Dec 04 22:25:09.896655 master-0 kubenswrapper[33572]: I1204 22:25:09.896476 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" event={"ID":"8260d9f8-c5d5-4ed0-9896-486e34d29486","Type":"ContainerStarted","Data":"4a8a480839924b6df7c0657567d6874e5f9336d85ae9516fa61ede764b443987"} Dec 04 22:25:09.904298 master-0 kubenswrapper[33572]: I1204 22:25:09.904230 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b5bcd658-ztwxm" event={"ID":"da9b6cb4-13db-408a-8a53-c03f64ccea5a","Type":"ContainerStarted","Data":"4c5f8fe348cfd361d4cff444c1a27d696284b954e7c06243153cba9f36ad5161"} Dec 04 22:25:09.904402 master-0 kubenswrapper[33572]: I1204 22:25:09.904316 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b5bcd658-ztwxm" event={"ID":"da9b6cb4-13db-408a-8a53-c03f64ccea5a","Type":"ContainerStarted","Data":"6084e4ae3f1babbf58d4e534213427d4af231f66bbf8a2790288527947b2adda"} Dec 04 22:25:09.931313 master-0 kubenswrapper[33572]: I1204 22:25:09.931233 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64b5bcd658-ztwxm" podStartSLOduration=1.9312101990000001 podStartE2EDuration="1.931210199s" podCreationTimestamp="2025-12-04 22:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:25:09.926977351 +0000 UTC m=+373.654503000" watchObservedRunningTime="2025-12-04 22:25:09.931210199 +0000 UTC m=+373.658735858" Dec 04 22:25:10.918422 master-0 kubenswrapper[33572]: I1204 22:25:10.918195 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" event={"ID":"8260d9f8-c5d5-4ed0-9896-486e34d29486","Type":"ContainerStarted","Data":"d198fbd215df2c72f2438a70a8c0d025559a83e5b05f6931908c32ebc33f2548"} Dec 04 22:25:10.947933 master-0 kubenswrapper[33572]: I1204 22:25:10.947798 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7d45bf9455-kqq2s" podStartSLOduration=1.671475144 podStartE2EDuration="2.947764863s" podCreationTimestamp="2025-12-04 22:25:08 +0000 UTC" firstStartedPulling="2025-12-04 
22:25:09.372814443 +0000 UTC m=+373.100340092" lastFinishedPulling="2025-12-04 22:25:10.649104152 +0000 UTC m=+374.376629811" observedRunningTime="2025-12-04 22:25:10.939909916 +0000 UTC m=+374.667435635" watchObservedRunningTime="2025-12-04 22:25:10.947764863 +0000 UTC m=+374.675290552" Dec 04 22:25:12.880807 master-0 kubenswrapper[33572]: I1204 22:25:12.880726 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 04 22:25:12.883741 master-0 kubenswrapper[33572]: I1204 22:25:12.883684 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:12.886671 master-0 kubenswrapper[33572]: I1204 22:25:12.886622 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wkcjf" Dec 04 22:25:12.887287 master-0 kubenswrapper[33572]: I1204 22:25:12.887200 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 22:25:12.894013 master-0 kubenswrapper[33572]: I1204 22:25:12.893954 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 04 22:25:13.007131 master-0 kubenswrapper[33572]: I1204 22:25:13.007032 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-var-lock\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.007481 master-0 kubenswrapper[33572]: I1204 22:25:13.007404 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41c30024-8bc2-42a7-87ec-544c44a34473-kube-api-access\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.007653 master-0 kubenswrapper[33572]: I1204 22:25:13.007564 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.109543 master-0 kubenswrapper[33572]: I1204 22:25:13.108830 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41c30024-8bc2-42a7-87ec-544c44a34473-kube-api-access\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.109543 master-0 kubenswrapper[33572]: I1204 22:25:13.109455 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.109912 master-0 kubenswrapper[33572]: I1204 22:25:13.109574 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.109912 master-0 kubenswrapper[33572]: I1204 22:25:13.109725 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-var-lock\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.109912 master-0 kubenswrapper[33572]: I1204 22:25:13.109817 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-var-lock\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.137090 master-0 kubenswrapper[33572]: I1204 22:25:13.136897 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41c30024-8bc2-42a7-87ec-544c44a34473-kube-api-access\") pod \"installer-4-master-0\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.226608 master-0 kubenswrapper[33572]: I1204 22:25:13.226543 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:13.759947 master-0 kubenswrapper[33572]: I1204 22:25:13.759875 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 04 22:25:13.948374 master-0 kubenswrapper[33572]: I1204 22:25:13.948302 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"41c30024-8bc2-42a7-87ec-544c44a34473","Type":"ContainerStarted","Data":"eb834a848019b608d5675eb50ff5c4c3823b0d9dbbc3f36c234cc8d99132d790"} Dec 04 22:25:14.963051 master-0 kubenswrapper[33572]: I1204 22:25:14.962936 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"41c30024-8bc2-42a7-87ec-544c44a34473","Type":"ContainerStarted","Data":"66c5f73c0677f3b432b1b092b80c6fb920b36c7fe1269fed502d1a3e7d6963f0"} Dec 04 22:25:14.993128 master-0 kubenswrapper[33572]: I1204 22:25:14.992944 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.992918258 podStartE2EDuration="2.992918258s" podCreationTimestamp="2025-12-04 22:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:25:14.988017713 +0000 UTC m=+378.715543372" watchObservedRunningTime="2025-12-04 22:25:14.992918258 +0000 UTC m=+378.720443947" Dec 04 22:25:19.021541 master-0 kubenswrapper[33572]: I1204 22:25:19.021425 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:19.021541 master-0 kubenswrapper[33572]: I1204 22:25:19.021532 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 
22:25:19.027777 master-0 kubenswrapper[33572]: I1204 22:25:19.027723 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:20.016911 master-0 kubenswrapper[33572]: I1204 22:25:20.016829 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:25:20.600888 master-0 kubenswrapper[33572]: I1204 22:25:20.600798 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f9495c789-qq8pz"] Dec 04 22:25:20.905187 master-0 kubenswrapper[33572]: I1204 22:25:20.904982 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7d6857f96b-g7j6m" podUID="8a4bc900-27a4-4a4d-9cce-43e2a4708fec" containerName="console" containerID="cri-o://4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326" gracePeriod=15 Dec 04 22:25:21.404325 master-0 kubenswrapper[33572]: I1204 22:25:21.404254 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d6857f96b-g7j6m_8a4bc900-27a4-4a4d-9cce-43e2a4708fec/console/0.log" Dec 04 22:25:21.404739 master-0 kubenswrapper[33572]: I1204 22:25:21.404389 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:25:21.465313 master-0 kubenswrapper[33572]: I1204 22:25:21.465229 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-oauth-serving-cert\") pod \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " Dec 04 22:25:21.465624 master-0 kubenswrapper[33572]: I1204 22:25:21.465372 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-trusted-ca-bundle\") pod \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " Dec 04 22:25:21.465624 master-0 kubenswrapper[33572]: I1204 22:25:21.465461 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-serving-cert\") pod \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " Dec 04 22:25:21.465834 master-0 kubenswrapper[33572]: I1204 22:25:21.465770 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-oauth-config\") pod \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " Dec 04 22:25:21.465950 master-0 kubenswrapper[33572]: I1204 22:25:21.465874 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-service-ca\") pod \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " Dec 04 22:25:21.466118 master-0 kubenswrapper[33572]: I1204 22:25:21.465976 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk8vm\" (UniqueName: \"kubernetes.io/projected/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-kube-api-access-vk8vm\") pod 
\"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " Dec 04 22:25:21.466118 master-0 kubenswrapper[33572]: I1204 22:25:21.466052 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8a4bc900-27a4-4a4d-9cce-43e2a4708fec" (UID: "8a4bc900-27a4-4a4d-9cce-43e2a4708fec"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:25:21.466244 master-0 kubenswrapper[33572]: I1204 22:25:21.466183 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-config\") pod \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\" (UID: \"8a4bc900-27a4-4a4d-9cce-43e2a4708fec\") " Dec 04 22:25:21.466731 master-0 kubenswrapper[33572]: I1204 22:25:21.466669 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-service-ca" (OuterVolumeSpecName: "service-ca") pod "8a4bc900-27a4-4a4d-9cce-43e2a4708fec" (UID: "8a4bc900-27a4-4a4d-9cce-43e2a4708fec"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:25:21.466811 master-0 kubenswrapper[33572]: I1204 22:25:21.466679 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-config" (OuterVolumeSpecName: "console-config") pod "8a4bc900-27a4-4a4d-9cce-43e2a4708fec" (UID: "8a4bc900-27a4-4a4d-9cce-43e2a4708fec"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:25:21.466865 master-0 kubenswrapper[33572]: I1204 22:25:21.466827 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8a4bc900-27a4-4a4d-9cce-43e2a4708fec" (UID: "8a4bc900-27a4-4a4d-9cce-43e2a4708fec"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:25:21.467333 master-0 kubenswrapper[33572]: I1204 22:25:21.467290 33572 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:21.467333 master-0 kubenswrapper[33572]: I1204 22:25:21.467320 33572 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:21.467333 master-0 kubenswrapper[33572]: I1204 22:25:21.467329 33572 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:21.467477 master-0 kubenswrapper[33572]: I1204 22:25:21.467340 33572 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:21.470668 master-0 kubenswrapper[33572]: I1204 22:25:21.470603 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8a4bc900-27a4-4a4d-9cce-43e2a4708fec" (UID: "8a4bc900-27a4-4a4d-9cce-43e2a4708fec"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:25:21.472235 master-0 kubenswrapper[33572]: I1204 22:25:21.472122 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-kube-api-access-vk8vm" (OuterVolumeSpecName: "kube-api-access-vk8vm") pod "8a4bc900-27a4-4a4d-9cce-43e2a4708fec" (UID: "8a4bc900-27a4-4a4d-9cce-43e2a4708fec"). InnerVolumeSpecName "kube-api-access-vk8vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:25:21.473171 master-0 kubenswrapper[33572]: I1204 22:25:21.473087 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8a4bc900-27a4-4a4d-9cce-43e2a4708fec" (UID: "8a4bc900-27a4-4a4d-9cce-43e2a4708fec"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:25:21.570458 master-0 kubenswrapper[33572]: I1204 22:25:21.570376 33572 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:21.570458 master-0 kubenswrapper[33572]: I1204 22:25:21.570444 33572 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:21.570458 master-0 kubenswrapper[33572]: I1204 22:25:21.570467 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk8vm\" (UniqueName: \"kubernetes.io/projected/8a4bc900-27a4-4a4d-9cce-43e2a4708fec-kube-api-access-vk8vm\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:22.033436 master-0 kubenswrapper[33572]: I1204 22:25:22.033358 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d6857f96b-g7j6m_8a4bc900-27a4-4a4d-9cce-43e2a4708fec/console/0.log" Dec 04 22:25:22.033436 master-0 kubenswrapper[33572]: I1204 22:25:22.033438 33572 generic.go:334] "Generic (PLEG): container finished" podID="8a4bc900-27a4-4a4d-9cce-43e2a4708fec" containerID="4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326" exitCode=2 Dec 04 22:25:22.034466 master-0 kubenswrapper[33572]: I1204 22:25:22.033907 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d6857f96b-g7j6m" event={"ID":"8a4bc900-27a4-4a4d-9cce-43e2a4708fec","Type":"ContainerDied","Data":"4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326"} Dec 04 22:25:22.034466 master-0 kubenswrapper[33572]: I1204 22:25:22.033922 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d6857f96b-g7j6m" Dec 04 22:25:22.034466 master-0 kubenswrapper[33572]: I1204 22:25:22.033980 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d6857f96b-g7j6m" event={"ID":"8a4bc900-27a4-4a4d-9cce-43e2a4708fec","Type":"ContainerDied","Data":"6c84dec3eea5cbc92b04deeb19a860514de8fd5bde70c91b5c413e4e171a5058"} Dec 04 22:25:22.034466 master-0 kubenswrapper[33572]: I1204 22:25:22.034003 33572 scope.go:117] "RemoveContainer" containerID="4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326" Dec 04 22:25:22.068389 master-0 kubenswrapper[33572]: I1204 22:25:22.068249 33572 scope.go:117] "RemoveContainer" containerID="4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326" Dec 04 22:25:22.069484 master-0 kubenswrapper[33572]: E1204 22:25:22.069396 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326\": container with ID starting with 4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326 not found: ID does not exist" containerID="4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326" Dec 04 22:25:22.069703 master-0 kubenswrapper[33572]: I1204 22:25:22.069479 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326"} err="failed to get container status \"4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326\": rpc error: code = NotFound desc = could not find container \"4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326\": container with ID starting with 4f13c3d03928d7640f0f28adb99b94f18080c11c1cd1707f8016cf89e2c56326 not found: ID does not exist" Dec 04 22:25:22.096454 master-0 kubenswrapper[33572]: I1204 22:25:22.095881 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d6857f96b-g7j6m"] Dec 04 22:25:22.105068 master-0 kubenswrapper[33572]: I1204 22:25:22.104780 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d6857f96b-g7j6m"] Dec 04 22:25:22.540825 master-0 kubenswrapper[33572]: I1204 22:25:22.540707 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a4bc900-27a4-4a4d-9cce-43e2a4708fec" path="/var/lib/kubelet/pods/8a4bc900-27a4-4a4d-9cce-43e2a4708fec/volumes" Dec 04 22:25:23.822849 master-0 kubenswrapper[33572]: I1204 22:25:23.822769 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:25:23.856790 master-0 kubenswrapper[33572]: I1204 22:25:23.856689 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:25:24.127076 master-0 kubenswrapper[33572]: I1204 22:25:24.126907 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 04 22:25:45.679576 master-0 kubenswrapper[33572]: I1204 22:25:45.679454 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7f9495c789-qq8pz" podUID="c0072c22-7f9a-4a92-9b46-ca145709c1bb" containerName="console" containerID="cri-o://6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8" gracePeriod=15 Dec 04 22:25:46.256102 master-0 kubenswrapper[33572]: I1204 22:25:46.256026 33572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f9495c789-qq8pz_c0072c22-7f9a-4a92-9b46-ca145709c1bb/console/0.log" Dec 04 22:25:46.256320 master-0 kubenswrapper[33572]: I1204 22:25:46.256127 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:25:46.294444 master-0 kubenswrapper[33572]: I1204 22:25:46.294356 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f9495c789-qq8pz_c0072c22-7f9a-4a92-9b46-ca145709c1bb/console/0.log" Dec 04 22:25:46.294444 master-0 kubenswrapper[33572]: I1204 22:25:46.294415 33572 generic.go:334] "Generic (PLEG): container finished" podID="c0072c22-7f9a-4a92-9b46-ca145709c1bb" containerID="6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8" exitCode=2 Dec 04 22:25:46.294444 master-0 kubenswrapper[33572]: I1204 22:25:46.294447 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f9495c789-qq8pz" event={"ID":"c0072c22-7f9a-4a92-9b46-ca145709c1bb","Type":"ContainerDied","Data":"6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8"} Dec 04 22:25:46.294977 master-0 kubenswrapper[33572]: I1204 22:25:46.294475 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f9495c789-qq8pz" event={"ID":"c0072c22-7f9a-4a92-9b46-ca145709c1bb","Type":"ContainerDied","Data":"1505b514fb1a65c520eafd05e8b5a96e39ef0a58c6b6cd6d08df0fc2c3f47624"} Dec 04 22:25:46.294977 master-0 kubenswrapper[33572]: I1204 22:25:46.294493 33572 scope.go:117] "RemoveContainer" containerID="6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8" Dec 04 22:25:46.294977 master-0 kubenswrapper[33572]: I1204 22:25:46.294601 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f9495c789-qq8pz" Dec 04 22:25:46.316077 master-0 kubenswrapper[33572]: I1204 22:25:46.315991 33572 scope.go:117] "RemoveContainer" containerID="6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8" Dec 04 22:25:46.316671 master-0 kubenswrapper[33572]: E1204 22:25:46.316611 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8\": container with ID starting with 6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8 not found: ID does not exist" containerID="6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8" Dec 04 22:25:46.316840 master-0 kubenswrapper[33572]: I1204 22:25:46.316663 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8"} err="failed to get container status \"6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8\": rpc error: code = NotFound desc = could not find container \"6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8\": container with ID starting with 6fe49f3da6c36acfbb4e9692ed11ebc0363d7bd79b7b87be9369862cf3f87df8 not found: ID does not exist" Dec 04 22:25:46.335352 master-0 kubenswrapper[33572]: I1204 22:25:46.335282 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-trusted-ca-bundle\") pod \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " Dec 04 22:25:46.335477 master-0 kubenswrapper[33572]: I1204 22:25:46.335394 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-service-ca\") pod \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " Dec 04 22:25:46.335477 master-0 kubenswrapper[33572]: I1204 22:25:46.335446 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-config\") pod \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " Dec 04 22:25:46.335696 master-0 kubenswrapper[33572]: I1204 22:25:46.335593 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-serving-cert\") pod \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " Dec 04 22:25:46.335696 master-0 kubenswrapper[33572]: I1204 22:25:46.335637 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-oauth-config\") pod \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " Dec 04 22:25:46.335832 master-0 kubenswrapper[33572]: I1204 22:25:46.335767 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lnss\" (UniqueName: \"kubernetes.io/projected/c0072c22-7f9a-4a92-9b46-ca145709c1bb-kube-api-access-8lnss\") pod \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\" (UID: 
\"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " Dec 04 22:25:46.335832 master-0 kubenswrapper[33572]: I1204 22:25:46.335827 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-oauth-serving-cert\") pod \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\" (UID: \"c0072c22-7f9a-4a92-9b46-ca145709c1bb\") " Dec 04 22:25:46.336639 master-0 kubenswrapper[33572]: I1204 22:25:46.335970 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-service-ca" (OuterVolumeSpecName: "service-ca") pod "c0072c22-7f9a-4a92-9b46-ca145709c1bb" (UID: "c0072c22-7f9a-4a92-9b46-ca145709c1bb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:25:46.336639 master-0 kubenswrapper[33572]: I1204 22:25:46.336164 33572 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:46.336843 master-0 kubenswrapper[33572]: I1204 22:25:46.336737 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c0072c22-7f9a-4a92-9b46-ca145709c1bb" (UID: "c0072c22-7f9a-4a92-9b46-ca145709c1bb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:25:46.336843 master-0 kubenswrapper[33572]: I1204 22:25:46.336781 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c0072c22-7f9a-4a92-9b46-ca145709c1bb" (UID: "c0072c22-7f9a-4a92-9b46-ca145709c1bb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:25:46.336843 master-0 kubenswrapper[33572]: I1204 22:25:46.336756 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-config" (OuterVolumeSpecName: "console-config") pod "c0072c22-7f9a-4a92-9b46-ca145709c1bb" (UID: "c0072c22-7f9a-4a92-9b46-ca145709c1bb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:25:46.340624 master-0 kubenswrapper[33572]: I1204 22:25:46.340564 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c0072c22-7f9a-4a92-9b46-ca145709c1bb" (UID: "c0072c22-7f9a-4a92-9b46-ca145709c1bb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:25:46.340769 master-0 kubenswrapper[33572]: I1204 22:25:46.340727 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c0072c22-7f9a-4a92-9b46-ca145709c1bb" (UID: "c0072c22-7f9a-4a92-9b46-ca145709c1bb"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:25:46.341711 master-0 kubenswrapper[33572]: I1204 22:25:46.341647 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0072c22-7f9a-4a92-9b46-ca145709c1bb-kube-api-access-8lnss" (OuterVolumeSpecName: "kube-api-access-8lnss") pod "c0072c22-7f9a-4a92-9b46-ca145709c1bb" (UID: "c0072c22-7f9a-4a92-9b46-ca145709c1bb"). InnerVolumeSpecName "kube-api-access-8lnss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:25:46.438313 master-0 kubenswrapper[33572]: I1204 22:25:46.438136 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lnss\" (UniqueName: \"kubernetes.io/projected/c0072c22-7f9a-4a92-9b46-ca145709c1bb-kube-api-access-8lnss\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:46.438313 master-0 kubenswrapper[33572]: I1204 22:25:46.438203 33572 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:46.438313 master-0 kubenswrapper[33572]: I1204 22:25:46.438221 33572 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:46.438313 master-0 kubenswrapper[33572]: I1204 22:25:46.438233 33572 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:46.438313 master-0 kubenswrapper[33572]: I1204 22:25:46.438250 33572 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:46.438313 master-0 kubenswrapper[33572]: I1204 22:25:46.438263 33572 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0072c22-7f9a-4a92-9b46-ca145709c1bb-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:46.624348 master-0 kubenswrapper[33572]: I1204 22:25:46.624211 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f9495c789-qq8pz"] Dec 04 22:25:46.630120 master-0 kubenswrapper[33572]: I1204 22:25:46.630055 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f9495c789-qq8pz"] Dec 04 22:25:47.689751 master-0 kubenswrapper[33572]: I1204 22:25:47.689650 33572 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:25:47.690679 master-0 kubenswrapper[33572]: I1204 22:25:47.690095 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="cluster-policy-controller" containerID="cri-o://51dfa6423a699c653fb4188616f00305edb215a14ee4fd1dcde5706013f4ee8d" gracePeriod=30 Dec 04 22:25:47.690679 master-0 kubenswrapper[33572]: I1204 22:25:47.690253 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="5859424d8ea4459c5b854f1ae5fd942c" 
containerName="kube-controller-manager-recovery-controller" containerID="cri-o://f07a7e680c95366ba4f1d333748a37d99d6fcd0c588c7658749adf9e44cb7229" gracePeriod=30 Dec 04 22:25:47.690679 master-0 kubenswrapper[33572]: I1204 22:25:47.690372 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager" containerID="cri-o://39aacc773fddb0383604f8a27ba1b199b302e4f4ede41fa8f08e464ed1607b81" gracePeriod=30 Dec 04 22:25:47.690679 master-0 kubenswrapper[33572]: I1204 22:25:47.690414 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://aab39ce7c056462df6f1a5933a3a5e925b99a0bd484dd0b16b296ab5327006ba" gracePeriod=30 Dec 04 22:25:47.692073 master-0 kubenswrapper[33572]: I1204 22:25:47.692030 33572 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:25:47.692609 master-0 kubenswrapper[33572]: E1204 22:25:47.692579 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0072c22-7f9a-4a92-9b46-ca145709c1bb" containerName="console" Dec 04 22:25:47.692609 master-0 kubenswrapper[33572]: I1204 22:25:47.692608 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0072c22-7f9a-4a92-9b46-ca145709c1bb" containerName="console" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: E1204 22:25:47.692642 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager-recovery-controller" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: I1204 22:25:47.692663 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager-recovery-controller" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: E1204 22:25:47.692691 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager-cert-syncer" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: I1204 22:25:47.692711 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager-cert-syncer" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: E1204 22:25:47.692792 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: I1204 22:25:47.692813 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: E1204 22:25:47.692848 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="cluster-policy-controller" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: I1204 22:25:47.692865 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="cluster-policy-controller" Dec 04 22:25:47.693022 master-0 kubenswrapper[33572]: E1204 22:25:47.692894 33572 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager" Dec 04 22:25:47.700139 master-0 kubenswrapper[33572]: I1204 22:25:47.697365 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager" Dec 04 22:25:47.700139 master-0 kubenswrapper[33572]: E1204 22:25:47.697700 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a4bc900-27a4-4a4d-9cce-43e2a4708fec" containerName="console" Dec 04 22:25:47.700139 master-0 kubenswrapper[33572]: I1204 22:25:47.697786 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a4bc900-27a4-4a4d-9cce-43e2a4708fec" containerName="console" Dec 04 22:25:47.700139 master-0 kubenswrapper[33572]: I1204 22:25:47.699983 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager" Dec 04 22:25:47.700139 master-0 kubenswrapper[33572]: I1204 22:25:47.700035 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a4bc900-27a4-4a4d-9cce-43e2a4708fec" containerName="console" Dec 04 22:25:47.700139 master-0 kubenswrapper[33572]: I1204 22:25:47.700071 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="cluster-policy-controller" Dec 04 22:25:47.700139 master-0 kubenswrapper[33572]: I1204 22:25:47.700121 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager" Dec 04 22:25:47.700139 master-0 kubenswrapper[33572]: I1204 22:25:47.700154 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0072c22-7f9a-4a92-9b46-ca145709c1bb" containerName="console" Dec 04 22:25:47.711081 master-0 kubenswrapper[33572]: I1204 22:25:47.700197 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager-recovery-controller" Dec 04 22:25:47.711081 master-0 kubenswrapper[33572]: I1204 22:25:47.700223 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859424d8ea4459c5b854f1ae5fd942c" containerName="kube-controller-manager-cert-syncer" Dec 04 22:25:47.863540 master-0 kubenswrapper[33572]: I1204 22:25:47.863418 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b92b0b9a9905732cdf5189a9f937700b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b92b0b9a9905732cdf5189a9f937700b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:25:47.864043 master-0 kubenswrapper[33572]: I1204 22:25:47.863945 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b92b0b9a9905732cdf5189a9f937700b-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b92b0b9a9905732cdf5189a9f937700b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:25:47.961341 master-0 kubenswrapper[33572]: I1204 22:25:47.961164 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_5859424d8ea4459c5b854f1ae5fd942c/kube-controller-manager-cert-syncer/0.log" Dec 04 22:25:47.963664 master-0 kubenswrapper[33572]: I1204 22:25:47.963587 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_5859424d8ea4459c5b854f1ae5fd942c/kube-controller-manager/0.log" Dec 04 22:25:47.963824 master-0 kubenswrapper[33572]: I1204 22:25:47.963772 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:25:47.965556 master-0 kubenswrapper[33572]: I1204 22:25:47.965471 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b92b0b9a9905732cdf5189a9f937700b-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b92b0b9a9905732cdf5189a9f937700b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:25:47.965693 master-0 kubenswrapper[33572]: I1204 22:25:47.965608 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b92b0b9a9905732cdf5189a9f937700b-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b92b0b9a9905732cdf5189a9f937700b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:25:47.965693 master-0 kubenswrapper[33572]: I1204 22:25:47.965657 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b92b0b9a9905732cdf5189a9f937700b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b92b0b9a9905732cdf5189a9f937700b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:25:47.965841 master-0 kubenswrapper[33572]: I1204 22:25:47.965765 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b92b0b9a9905732cdf5189a9f937700b-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b92b0b9a9905732cdf5189a9f937700b\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:25:47.968668 master-0 kubenswrapper[33572]: I1204 22:25:47.968586 33572 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="5859424d8ea4459c5b854f1ae5fd942c" podUID="b92b0b9a9905732cdf5189a9f937700b" Dec 04 22:25:48.066616 master-0 kubenswrapper[33572]: I1204 22:25:48.066475 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir\") pod \"5859424d8ea4459c5b854f1ae5fd942c\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " Dec 04 22:25:48.067194 master-0 kubenswrapper[33572]: I1204 22:25:48.066716 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir\") pod \"5859424d8ea4459c5b854f1ae5fd942c\" (UID: \"5859424d8ea4459c5b854f1ae5fd942c\") " Dec 04 22:25:48.067194 master-0 kubenswrapper[33572]: I1204 22:25:48.066711 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "5859424d8ea4459c5b854f1ae5fd942c" (UID: "5859424d8ea4459c5b854f1ae5fd942c"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:25:48.067194 master-0 kubenswrapper[33572]: I1204 22:25:48.066830 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "5859424d8ea4459c5b854f1ae5fd942c" (UID: "5859424d8ea4459c5b854f1ae5fd942c"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:25:48.067780 master-0 kubenswrapper[33572]: I1204 22:25:48.067540 33572 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:48.067780 master-0 kubenswrapper[33572]: I1204 22:25:48.067582 33572 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5859424d8ea4459c5b854f1ae5fd942c-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:48.320451 master-0 kubenswrapper[33572]: I1204 22:25:48.320363 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_5859424d8ea4459c5b854f1ae5fd942c/kube-controller-manager-cert-syncer/0.log" Dec 04 22:25:48.322196 master-0 kubenswrapper[33572]: I1204 22:25:48.322147 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_5859424d8ea4459c5b854f1ae5fd942c/kube-controller-manager/0.log" Dec 04 22:25:48.322341 master-0 kubenswrapper[33572]: I1204 22:25:48.322200 33572 generic.go:334] "Generic (PLEG): container finished" podID="5859424d8ea4459c5b854f1ae5fd942c" containerID="39aacc773fddb0383604f8a27ba1b199b302e4f4ede41fa8f08e464ed1607b81" exitCode=0 Dec 04 22:25:48.322341 master-0 kubenswrapper[33572]: I1204 22:25:48.322221 33572 generic.go:334] "Generic (PLEG): container finished" podID="5859424d8ea4459c5b854f1ae5fd942c" containerID="f07a7e680c95366ba4f1d333748a37d99d6fcd0c588c7658749adf9e44cb7229" exitCode=0 Dec 04 22:25:48.322341 master-0 kubenswrapper[33572]: I1204 22:25:48.322230 33572 generic.go:334] "Generic (PLEG): container finished" podID="5859424d8ea4459c5b854f1ae5fd942c" containerID="aab39ce7c056462df6f1a5933a3a5e925b99a0bd484dd0b16b296ab5327006ba" exitCode=2 Dec 04 22:25:48.322341 master-0 kubenswrapper[33572]: I1204 22:25:48.322239 33572 generic.go:334] "Generic (PLEG): container finished" podID="5859424d8ea4459c5b854f1ae5fd942c" containerID="51dfa6423a699c653fb4188616f00305edb215a14ee4fd1dcde5706013f4ee8d" exitCode=0 Dec 04 22:25:48.322716 master-0 kubenswrapper[33572]: I1204 22:25:48.322351 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:25:48.323423 master-0 kubenswrapper[33572]: I1204 22:25:48.323357 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f2b4d655e0bf0ff49528fd7206e3f4bbadf9c2ae53c24cb531027c7c4811ac5" Dec 04 22:25:48.323573 master-0 kubenswrapper[33572]: I1204 22:25:48.323412 33572 scope.go:117] "RemoveContainer" containerID="669f49b80171e40aea73e838597bed75920e67751d5f839f6934dbce1fedc710" Dec 04 22:25:48.329352 master-0 kubenswrapper[33572]: I1204 22:25:48.329273 33572 generic.go:334] "Generic (PLEG): container finished" podID="41c30024-8bc2-42a7-87ec-544c44a34473" containerID="66c5f73c0677f3b432b1b092b80c6fb920b36c7fe1269fed502d1a3e7d6963f0" exitCode=0 Dec 04 22:25:48.329591 master-0 kubenswrapper[33572]: I1204 22:25:48.329345 33572 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="5859424d8ea4459c5b854f1ae5fd942c" podUID="b92b0b9a9905732cdf5189a9f937700b" Dec 04 22:25:48.329591 master-0 kubenswrapper[33572]: I1204 22:25:48.329357 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"41c30024-8bc2-42a7-87ec-544c44a34473","Type":"ContainerDied","Data":"66c5f73c0677f3b432b1b092b80c6fb920b36c7fe1269fed502d1a3e7d6963f0"} Dec 04 22:25:48.418485 master-0 kubenswrapper[33572]: I1204 22:25:48.417780 33572 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="5859424d8ea4459c5b854f1ae5fd942c" podUID="b92b0b9a9905732cdf5189a9f937700b" Dec 04 22:25:48.544527 master-0 kubenswrapper[33572]: I1204 22:25:48.544389 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5859424d8ea4459c5b854f1ae5fd942c" path="/var/lib/kubelet/pods/5859424d8ea4459c5b854f1ae5fd942c/volumes" Dec 04 22:25:48.546377 master-0 kubenswrapper[33572]: I1204 22:25:48.546317 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0072c22-7f9a-4a92-9b46-ca145709c1bb" path="/var/lib/kubelet/pods/c0072c22-7f9a-4a92-9b46-ca145709c1bb/volumes" Dec 04 22:25:49.346624 master-0 kubenswrapper[33572]: I1204 22:25:49.346243 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_5859424d8ea4459c5b854f1ae5fd942c/kube-controller-manager-cert-syncer/0.log" Dec 04 22:25:49.805563 master-0 kubenswrapper[33572]: I1204 22:25:49.805475 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:49.904145 master-0 kubenswrapper[33572]: I1204 22:25:49.904059 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-var-lock\") pod \"41c30024-8bc2-42a7-87ec-544c44a34473\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " Dec 04 22:25:49.904397 master-0 kubenswrapper[33572]: I1204 22:25:49.904198 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-kubelet-dir\") pod \"41c30024-8bc2-42a7-87ec-544c44a34473\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " Dec 04 22:25:49.904397 master-0 kubenswrapper[33572]: I1204 22:25:49.904216 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-var-lock" (OuterVolumeSpecName: "var-lock") pod "41c30024-8bc2-42a7-87ec-544c44a34473" (UID: "41c30024-8bc2-42a7-87ec-544c44a34473"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:25:49.904397 master-0 kubenswrapper[33572]: I1204 22:25:49.904285 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "41c30024-8bc2-42a7-87ec-544c44a34473" (UID: "41c30024-8bc2-42a7-87ec-544c44a34473"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:25:49.904397 master-0 kubenswrapper[33572]: I1204 22:25:49.904351 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41c30024-8bc2-42a7-87ec-544c44a34473-kube-api-access\") pod \"41c30024-8bc2-42a7-87ec-544c44a34473\" (UID: \"41c30024-8bc2-42a7-87ec-544c44a34473\") " Dec 04 22:25:49.904882 master-0 kubenswrapper[33572]: I1204 22:25:49.904834 33572 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:49.904882 master-0 kubenswrapper[33572]: I1204 22:25:49.904867 33572 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41c30024-8bc2-42a7-87ec-544c44a34473-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:49.910739 master-0 kubenswrapper[33572]: I1204 22:25:49.910652 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c30024-8bc2-42a7-87ec-544c44a34473-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "41c30024-8bc2-42a7-87ec-544c44a34473" (UID: "41c30024-8bc2-42a7-87ec-544c44a34473"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:25:50.007415 master-0 kubenswrapper[33572]: I1204 22:25:50.007264 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41c30024-8bc2-42a7-87ec-544c44a34473-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 04 22:25:50.362086 master-0 kubenswrapper[33572]: I1204 22:25:50.362007 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"41c30024-8bc2-42a7-87ec-544c44a34473","Type":"ContainerDied","Data":"eb834a848019b608d5675eb50ff5c4c3823b0d9dbbc3f36c234cc8d99132d790"} Dec 04 22:25:50.362086 master-0 kubenswrapper[33572]: I1204 22:25:50.362087 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb834a848019b608d5675eb50ff5c4c3823b0d9dbbc3f36c234cc8d99132d790" Dec 04 22:25:50.363306 master-0 kubenswrapper[33572]: I1204 22:25:50.362124 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 04 22:25:56.848333 master-0 kubenswrapper[33572]: I1204 22:25:56.848251 33572 scope.go:117] "RemoveContainer" containerID="51dfa6423a699c653fb4188616f00305edb215a14ee4fd1dcde5706013f4ee8d" Dec 04 22:25:57.121038 master-0 kubenswrapper[33572]: I1204 22:25:57.120840 33572 scope.go:117] "RemoveContainer" containerID="aab39ce7c056462df6f1a5933a3a5e925b99a0bd484dd0b16b296ab5327006ba" Dec 04 22:25:57.168755 master-0 kubenswrapper[33572]: I1204 22:25:57.168694 33572 scope.go:117] "RemoveContainer" containerID="f07a7e680c95366ba4f1d333748a37d99d6fcd0c588c7658749adf9e44cb7229" Dec 04 22:26:02.524355 master-0 kubenswrapper[33572]: I1204 22:26:02.524276 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:02.546212 master-0 kubenswrapper[33572]: I1204 22:26:02.546122 33572 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2b8e54e3-34e7-42cf-9a30-a7539664c894" Dec 04 22:26:02.546212 master-0 kubenswrapper[33572]: I1204 22:26:02.546188 33572 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2b8e54e3-34e7-42cf-9a30-a7539664c894" Dec 04 22:26:02.572647 master-0 kubenswrapper[33572]: I1204 22:26:02.568570 33572 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:02.577877 master-0 kubenswrapper[33572]: I1204 22:26:02.577802 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:26:02.585609 master-0 kubenswrapper[33572]: I1204 22:26:02.582583 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:02.591801 master-0 kubenswrapper[33572]: I1204 22:26:02.591724 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:26:02.599615 master-0 kubenswrapper[33572]: I1204 22:26:02.599422 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 04 22:26:03.534715 master-0 kubenswrapper[33572]: I1204 22:26:03.534645 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b92b0b9a9905732cdf5189a9f937700b","Type":"ContainerStarted","Data":"de0912ff89f77d67feab935ec0c27c84e7757a323d84e546025355f6bd5e8bda"} Dec 04 22:26:03.535709 master-0 kubenswrapper[33572]: I1204 22:26:03.534742 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b92b0b9a9905732cdf5189a9f937700b","Type":"ContainerStarted","Data":"55ab0aa10aeae2f939924d7c4de7873e9d5d72552e264205aa33e27b943f4b02"} Dec 04 22:26:03.535709 master-0 kubenswrapper[33572]: I1204 22:26:03.534762 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b92b0b9a9905732cdf5189a9f937700b","Type":"ContainerStarted","Data":"94cd259cc656544edac7ed34bd4931145a5a2bad65f2697075fd2f99c13df7ef"} Dec 04 22:26:03.535709 master-0 kubenswrapper[33572]: I1204 22:26:03.534775 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b92b0b9a9905732cdf5189a9f937700b","Type":"ContainerStarted","Data":"84f7dd1bf69caf286b179945f55e165051b083eda241d10568b56bed0c0b804c"} Dec 04 22:26:04.555379 master-0 kubenswrapper[33572]: I1204 22:26:04.555266 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b92b0b9a9905732cdf5189a9f937700b","Type":"ContainerStarted","Data":"7572e8cdd7b3d6909b982c3059be035f4002a29b98021fadc1e17f86fcb1bbe8"} Dec 04 22:26:12.585000 master-0 kubenswrapper[33572]: I1204 22:26:12.584826 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:12.585000 master-0 kubenswrapper[33572]: I1204 22:26:12.584957 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:12.585000 master-0 kubenswrapper[33572]: I1204 22:26:12.584998 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:12.585000 master-0 kubenswrapper[33572]: I1204 22:26:12.585026 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:12.591984 master-0 kubenswrapper[33572]: I1204 22:26:12.591914 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:12.592096 master-0 kubenswrapper[33572]: I1204 22:26:12.592045 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:12.624295 master-0 kubenswrapper[33572]: I1204 22:26:12.624227 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=10.624210975 podStartE2EDuration="10.624210975s" podCreationTimestamp="2025-12-04 22:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:26:04.582779697 +0000 UTC m=+428.310305426" watchObservedRunningTime="2025-12-04 22:26:12.624210975 +0000 UTC m=+436.351736624" Dec 04 22:26:12.642053 master-0 kubenswrapper[33572]: I1204 22:26:12.641953 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:13.649846 master-0 kubenswrapper[33572]: I1204 22:26:13.649740 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 04 22:26:28.076231 master-0 kubenswrapper[33572]: I1204 22:26:28.076122 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-mwqvk"] Dec 04 22:26:28.077694 master-0 kubenswrapper[33572]: E1204 22:26:28.076515 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c30024-8bc2-42a7-87ec-544c44a34473" containerName="installer" Dec 04 22:26:28.077694 master-0 kubenswrapper[33572]: I1204 22:26:28.076533 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c30024-8bc2-42a7-87ec-544c44a34473" containerName="installer" Dec 04 22:26:28.077694 master-0 kubenswrapper[33572]: I1204 22:26:28.076744 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c30024-8bc2-42a7-87ec-544c44a34473" containerName="installer" Dec 04 22:26:28.077694 master-0 kubenswrapper[33572]: I1204 22:26:28.077277 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.079031 master-0 kubenswrapper[33572]: I1204 22:26:28.078941 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Dec 04 22:26:28.080414 master-0 kubenswrapper[33572]: I1204 22:26:28.079327 33572 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Dec 04 22:26:28.080414 master-0 kubenswrapper[33572]: I1204 22:26:28.080330 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Dec 04 22:26:28.080712 master-0 kubenswrapper[33572]: I1204 22:26:28.080689 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Dec 04 22:26:28.092060 master-0 kubenswrapper[33572]: I1204 22:26:28.091996 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-mwqvk"] Dec 04 22:26:28.137267 master-0 kubenswrapper[33572]: I1204 22:26:28.137201 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/41a4a33e-5739-4a56-8a8a-98bfd642d564-os-client-config\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.137409 master-0 kubenswrapper[33572]: I1204 22:26:28.137317 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxzdk\" (UniqueName: \"kubernetes.io/projected/41a4a33e-5739-4a56-8a8a-98bfd642d564-kube-api-access-bxzdk\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.137539 master-0 kubenswrapper[33572]: I1204 22:26:28.137458 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/41a4a33e-5739-4a56-8a8a-98bfd642d564-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.239159 master-0 kubenswrapper[33572]: I1204 22:26:28.239068 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/41a4a33e-5739-4a56-8a8a-98bfd642d564-os-client-config\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.239446 master-0 kubenswrapper[33572]: I1204 22:26:28.239267 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxzdk\" (UniqueName: \"kubernetes.io/projected/41a4a33e-5739-4a56-8a8a-98bfd642d564-kube-api-access-bxzdk\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.239446 master-0 kubenswrapper[33572]: I1204 22:26:28.239376 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/41a4a33e-5739-4a56-8a8a-98bfd642d564-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " 
pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.241134 master-0 kubenswrapper[33572]: I1204 22:26:28.241085 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/41a4a33e-5739-4a56-8a8a-98bfd642d564-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.243949 master-0 kubenswrapper[33572]: I1204 22:26:28.243884 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/41a4a33e-5739-4a56-8a8a-98bfd642d564-os-client-config\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.270573 master-0 kubenswrapper[33572]: I1204 22:26:28.270432 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxzdk\" (UniqueName: \"kubernetes.io/projected/41a4a33e-5739-4a56-8a8a-98bfd642d564-kube-api-access-bxzdk\") pod \"sushy-emulator-58f4c9b998-mwqvk\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.414185 master-0 kubenswrapper[33572]: I1204 22:26:28.413987 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:28.954106 master-0 kubenswrapper[33572]: I1204 22:26:28.954018 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-mwqvk"] Dec 04 22:26:28.962007 master-0 kubenswrapper[33572]: W1204 22:26:28.961891 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41a4a33e_5739_4a56_8a8a_98bfd642d564.slice/crio-1cd6c3429ca666ee1557a97d5b1f23bbfb9fdbc0b1596b23d564faba211c3ffa WatchSource:0}: Error finding container 1cd6c3429ca666ee1557a97d5b1f23bbfb9fdbc0b1596b23d564faba211c3ffa: Status 404 returned error can't find the container with id 1cd6c3429ca666ee1557a97d5b1f23bbfb9fdbc0b1596b23d564faba211c3ffa Dec 04 22:26:29.815725 master-0 kubenswrapper[33572]: I1204 22:26:29.815631 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" event={"ID":"41a4a33e-5739-4a56-8a8a-98bfd642d564","Type":"ContainerStarted","Data":"1cd6c3429ca666ee1557a97d5b1f23bbfb9fdbc0b1596b23d564faba211c3ffa"} Dec 04 22:26:35.866897 master-0 kubenswrapper[33572]: I1204 22:26:35.866782 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" event={"ID":"41a4a33e-5739-4a56-8a8a-98bfd642d564","Type":"ContainerStarted","Data":"3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40"} Dec 04 22:26:35.899309 master-0 kubenswrapper[33572]: I1204 22:26:35.899109 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" podStartSLOduration=1.643574208 podStartE2EDuration="7.899082191s" podCreationTimestamp="2025-12-04 22:26:28 +0000 UTC" firstStartedPulling="2025-12-04 22:26:28.964559552 +0000 UTC m=+452.692085201" lastFinishedPulling="2025-12-04 22:26:35.220067525 +0000 UTC m=+458.947593184" observedRunningTime="2025-12-04 22:26:35.889032503 +0000 UTC m=+459.616558172" watchObservedRunningTime="2025-12-04 22:26:35.899082191 +0000 UTC 
m=+459.626607870" Dec 04 22:26:38.419269 master-0 kubenswrapper[33572]: I1204 22:26:38.417326 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:38.419269 master-0 kubenswrapper[33572]: I1204 22:26:38.419079 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:38.435182 master-0 kubenswrapper[33572]: I1204 22:26:38.435072 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:26:38.897127 master-0 kubenswrapper[33572]: I1204 22:26:38.896827 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:27:10.891165 master-0 kubenswrapper[33572]: I1204 22:27:10.891050 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv"] Dec 04 22:27:10.900274 master-0 kubenswrapper[33572]: I1204 22:27:10.900188 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:10.902446 master-0 kubenswrapper[33572]: I1204 22:27:10.902385 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-fmctc" Dec 04 22:27:10.913849 master-0 kubenswrapper[33572]: I1204 22:27:10.913746 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv"] Dec 04 22:27:11.065731 master-0 kubenswrapper[33572]: I1204 22:27:11.065685 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.066575 master-0 kubenswrapper[33572]: I1204 22:27:11.066548 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.066770 master-0 kubenswrapper[33572]: I1204 22:27:11.066747 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfzn4\" (UniqueName: \"kubernetes.io/projected/9da7104c-9748-407f-91db-3923683e8c34-kube-api-access-lfzn4\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.168555 master-0 kubenswrapper[33572]: I1204 22:27:11.168328 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: 
\"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.168864 master-0 kubenswrapper[33572]: I1204 22:27:11.168560 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfzn4\" (UniqueName: \"kubernetes.io/projected/9da7104c-9748-407f-91db-3923683e8c34-kube-api-access-lfzn4\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.168864 master-0 kubenswrapper[33572]: I1204 22:27:11.168762 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.169904 master-0 kubenswrapper[33572]: I1204 22:27:11.169543 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.169904 master-0 kubenswrapper[33572]: I1204 22:27:11.169714 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.199211 master-0 kubenswrapper[33572]: I1204 22:27:11.199169 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfzn4\" (UniqueName: \"kubernetes.io/projected/9da7104c-9748-407f-91db-3923683e8c34-kube-api-access-lfzn4\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.234474 master-0 kubenswrapper[33572]: I1204 22:27:11.234366 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:11.686560 master-0 kubenswrapper[33572]: I1204 22:27:11.686467 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv"] Dec 04 22:27:11.692047 master-0 kubenswrapper[33572]: W1204 22:27:11.692001 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9da7104c_9748_407f_91db_3923683e8c34.slice/crio-1aff58778e9eca441a1ec452f229678fce6eea7466183ec54f0f0c6a7678d2c0 WatchSource:0}: Error finding container 1aff58778e9eca441a1ec452f229678fce6eea7466183ec54f0f0c6a7678d2c0: Status 404 returned error can't find the container with id 1aff58778e9eca441a1ec452f229678fce6eea7466183ec54f0f0c6a7678d2c0 Dec 04 22:27:12.215050 master-0 kubenswrapper[33572]: I1204 22:27:12.214985 33572 generic.go:334] "Generic (PLEG): container finished" podID="9da7104c-9748-407f-91db-3923683e8c34" containerID="297ce1cb22425475ec24e00a9f5d420eafd0bca74781276518ee918ca888bd08" exitCode=0 Dec 04 22:27:12.216002 master-0 kubenswrapper[33572]: I1204 22:27:12.215086 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" event={"ID":"9da7104c-9748-407f-91db-3923683e8c34","Type":"ContainerDied","Data":"297ce1cb22425475ec24e00a9f5d420eafd0bca74781276518ee918ca888bd08"} Dec 04 22:27:12.216259 master-0 kubenswrapper[33572]: I1204 22:27:12.216212 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" event={"ID":"9da7104c-9748-407f-91db-3923683e8c34","Type":"ContainerStarted","Data":"1aff58778e9eca441a1ec452f229678fce6eea7466183ec54f0f0c6a7678d2c0"} Dec 04 22:27:13.217563 master-0 kubenswrapper[33572]: I1204 22:27:13.217425 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tssm5"] Dec 04 22:27:13.220126 master-0 kubenswrapper[33572]: I1204 22:27:13.220066 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.229679 master-0 kubenswrapper[33572]: I1204 22:27:13.229619 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tssm5"] Dec 04 22:27:13.407565 master-0 kubenswrapper[33572]: I1204 22:27:13.407514 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-utilities\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.407931 master-0 kubenswrapper[33572]: I1204 22:27:13.407904 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvn8p\" (UniqueName: \"kubernetes.io/projected/3d960c97-d33f-4537-865b-72cc6457310b-kube-api-access-rvn8p\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.408064 master-0 kubenswrapper[33572]: I1204 22:27:13.408046 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-catalog-content\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.509840 master-0 kubenswrapper[33572]: I1204 22:27:13.509636 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvn8p\" (UniqueName: \"kubernetes.io/projected/3d960c97-d33f-4537-865b-72cc6457310b-kube-api-access-rvn8p\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.510315 master-0 kubenswrapper[33572]: I1204 22:27:13.510259 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-catalog-content\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.510854 master-0 kubenswrapper[33572]: I1204 22:27:13.510833 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-utilities\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.511124 master-0 kubenswrapper[33572]: I1204 22:27:13.510851 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-catalog-content\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.511124 master-0 kubenswrapper[33572]: I1204 22:27:13.511118 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-utilities\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.530693 
master-0 kubenswrapper[33572]: I1204 22:27:13.530594 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvn8p\" (UniqueName: \"kubernetes.io/projected/3d960c97-d33f-4537-865b-72cc6457310b-kube-api-access-rvn8p\") pod \"redhat-operators-tssm5\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:13.555000 master-0 kubenswrapper[33572]: I1204 22:27:13.554918 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:14.015094 master-0 kubenswrapper[33572]: I1204 22:27:14.015007 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tssm5"] Dec 04 22:27:14.235207 master-0 kubenswrapper[33572]: I1204 22:27:14.235149 33572 generic.go:334] "Generic (PLEG): container finished" podID="9da7104c-9748-407f-91db-3923683e8c34" containerID="485d7cb347e947bd939bb9a33a80489cd288982ef6ba3faf0c8f65a2d7e44f56" exitCode=0 Dec 04 22:27:14.235767 master-0 kubenswrapper[33572]: I1204 22:27:14.235243 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" event={"ID":"9da7104c-9748-407f-91db-3923683e8c34","Type":"ContainerDied","Data":"485d7cb347e947bd939bb9a33a80489cd288982ef6ba3faf0c8f65a2d7e44f56"} Dec 04 22:27:14.236488 master-0 kubenswrapper[33572]: I1204 22:27:14.236398 33572 generic.go:334] "Generic (PLEG): container finished" podID="3d960c97-d33f-4537-865b-72cc6457310b" containerID="086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e" exitCode=0 Dec 04 22:27:14.236488 master-0 kubenswrapper[33572]: I1204 22:27:14.236423 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tssm5" event={"ID":"3d960c97-d33f-4537-865b-72cc6457310b","Type":"ContainerDied","Data":"086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e"} Dec 04 22:27:14.236488 master-0 kubenswrapper[33572]: I1204 22:27:14.236440 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tssm5" event={"ID":"3d960c97-d33f-4537-865b-72cc6457310b","Type":"ContainerStarted","Data":"5f3902730ccf9a944f3d7871e871352d293fa657349c01fef6a9c11d9e688115"} Dec 04 22:27:15.253000 master-0 kubenswrapper[33572]: I1204 22:27:15.252819 33572 generic.go:334] "Generic (PLEG): container finished" podID="9da7104c-9748-407f-91db-3923683e8c34" containerID="a613084fef60a73410bd5efb176f73a9b23872d6fb684e7c1cd9961a54826c08" exitCode=0 Dec 04 22:27:15.253000 master-0 kubenswrapper[33572]: I1204 22:27:15.252876 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" event={"ID":"9da7104c-9748-407f-91db-3923683e8c34","Type":"ContainerDied","Data":"a613084fef60a73410bd5efb176f73a9b23872d6fb684e7c1cd9961a54826c08"} Dec 04 22:27:15.257346 master-0 kubenswrapper[33572]: I1204 22:27:15.255673 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tssm5" event={"ID":"3d960c97-d33f-4537-865b-72cc6457310b","Type":"ContainerStarted","Data":"bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c"} Dec 04 22:27:16.268370 master-0 kubenswrapper[33572]: I1204 22:27:16.268287 33572 generic.go:334] "Generic (PLEG): container finished" podID="3d960c97-d33f-4537-865b-72cc6457310b" 
containerID="bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c" exitCode=0 Dec 04 22:27:16.269198 master-0 kubenswrapper[33572]: I1204 22:27:16.268346 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tssm5" event={"ID":"3d960c97-d33f-4537-865b-72cc6457310b","Type":"ContainerDied","Data":"bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c"} Dec 04 22:27:16.629861 master-0 kubenswrapper[33572]: I1204 22:27:16.629760 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:16.773104 master-0 kubenswrapper[33572]: I1204 22:27:16.773023 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-util\") pod \"9da7104c-9748-407f-91db-3923683e8c34\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " Dec 04 22:27:16.773104 master-0 kubenswrapper[33572]: I1204 22:27:16.773107 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfzn4\" (UniqueName: \"kubernetes.io/projected/9da7104c-9748-407f-91db-3923683e8c34-kube-api-access-lfzn4\") pod \"9da7104c-9748-407f-91db-3923683e8c34\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " Dec 04 22:27:16.773602 master-0 kubenswrapper[33572]: I1204 22:27:16.773184 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-bundle\") pod \"9da7104c-9748-407f-91db-3923683e8c34\" (UID: \"9da7104c-9748-407f-91db-3923683e8c34\") " Dec 04 22:27:16.774976 master-0 kubenswrapper[33572]: I1204 22:27:16.774926 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-bundle" (OuterVolumeSpecName: "bundle") pod "9da7104c-9748-407f-91db-3923683e8c34" (UID: "9da7104c-9748-407f-91db-3923683e8c34"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:16.779943 master-0 kubenswrapper[33572]: I1204 22:27:16.778588 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da7104c-9748-407f-91db-3923683e8c34-kube-api-access-lfzn4" (OuterVolumeSpecName: "kube-api-access-lfzn4") pod "9da7104c-9748-407f-91db-3923683e8c34" (UID: "9da7104c-9748-407f-91db-3923683e8c34"). InnerVolumeSpecName "kube-api-access-lfzn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:27:16.808986 master-0 kubenswrapper[33572]: I1204 22:27:16.808904 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-util" (OuterVolumeSpecName: "util") pod "9da7104c-9748-407f-91db-3923683e8c34" (UID: "9da7104c-9748-407f-91db-3923683e8c34"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:16.874921 master-0 kubenswrapper[33572]: I1204 22:27:16.874865 33572 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:16.874921 master-0 kubenswrapper[33572]: I1204 22:27:16.874898 33572 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9da7104c-9748-407f-91db-3923683e8c34-util\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:16.874921 master-0 kubenswrapper[33572]: I1204 22:27:16.874908 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfzn4\" (UniqueName: \"kubernetes.io/projected/9da7104c-9748-407f-91db-3923683e8c34-kube-api-access-lfzn4\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:17.290458 master-0 kubenswrapper[33572]: I1204 22:27:17.290357 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" event={"ID":"9da7104c-9748-407f-91db-3923683e8c34","Type":"ContainerDied","Data":"1aff58778e9eca441a1ec452f229678fce6eea7466183ec54f0f0c6a7678d2c0"} Dec 04 22:27:17.290458 master-0 kubenswrapper[33572]: I1204 22:27:17.290445 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aff58778e9eca441a1ec452f229678fce6eea7466183ec54f0f0c6a7678d2c0" Dec 04 22:27:17.291203 master-0 kubenswrapper[33572]: I1204 22:27:17.290373 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4khrlv" Dec 04 22:27:17.299111 master-0 kubenswrapper[33572]: I1204 22:27:17.299010 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tssm5" event={"ID":"3d960c97-d33f-4537-865b-72cc6457310b","Type":"ContainerStarted","Data":"dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82"} Dec 04 22:27:17.545718 master-0 kubenswrapper[33572]: I1204 22:27:17.545573 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tssm5" podStartSLOduration=2.002224341 podStartE2EDuration="4.545546544s" podCreationTimestamp="2025-12-04 22:27:13 +0000 UTC" firstStartedPulling="2025-12-04 22:27:14.237526728 +0000 UTC m=+497.965052377" lastFinishedPulling="2025-12-04 22:27:16.780848921 +0000 UTC m=+500.508374580" observedRunningTime="2025-12-04 22:27:17.537726857 +0000 UTC m=+501.265252616" watchObservedRunningTime="2025-12-04 22:27:17.545546544 +0000 UTC m=+501.273072233" Dec 04 22:27:23.557016 master-0 kubenswrapper[33572]: I1204 22:27:23.556854 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:23.557817 master-0 kubenswrapper[33572]: I1204 22:27:23.557784 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:24.267483 master-0 kubenswrapper[33572]: I1204 22:27:24.267401 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-77667f8d6-nvjzt"] Dec 04 22:27:24.267820 master-0 kubenswrapper[33572]: E1204 22:27:24.267786 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da7104c-9748-407f-91db-3923683e8c34" containerName="extract" Dec 04 22:27:24.267820 
master-0 kubenswrapper[33572]: I1204 22:27:24.267807 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da7104c-9748-407f-91db-3923683e8c34" containerName="extract" Dec 04 22:27:24.267978 master-0 kubenswrapper[33572]: E1204 22:27:24.267829 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da7104c-9748-407f-91db-3923683e8c34" containerName="util" Dec 04 22:27:24.267978 master-0 kubenswrapper[33572]: I1204 22:27:24.267838 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da7104c-9748-407f-91db-3923683e8c34" containerName="util" Dec 04 22:27:24.267978 master-0 kubenswrapper[33572]: E1204 22:27:24.267847 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da7104c-9748-407f-91db-3923683e8c34" containerName="pull" Dec 04 22:27:24.267978 master-0 kubenswrapper[33572]: I1204 22:27:24.267853 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da7104c-9748-407f-91db-3923683e8c34" containerName="pull" Dec 04 22:27:24.268207 master-0 kubenswrapper[33572]: I1204 22:27:24.268014 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da7104c-9748-407f-91db-3923683e8c34" containerName="extract" Dec 04 22:27:24.268493 master-0 kubenswrapper[33572]: I1204 22:27:24.268454 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.270351 master-0 kubenswrapper[33572]: I1204 22:27:24.270304 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Dec 04 22:27:24.270631 master-0 kubenswrapper[33572]: I1204 22:27:24.270595 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Dec 04 22:27:24.270751 master-0 kubenswrapper[33572]: I1204 22:27:24.270731 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Dec 04 22:27:24.271695 master-0 kubenswrapper[33572]: I1204 22:27:24.271656 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Dec 04 22:27:24.279872 master-0 kubenswrapper[33572]: I1204 22:27:24.279813 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-77667f8d6-nvjzt"] Dec 04 22:27:24.304648 master-0 kubenswrapper[33572]: I1204 22:27:24.304586 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Dec 04 22:27:24.437828 master-0 kubenswrapper[33572]: I1204 22:27:24.437737 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a80dbd-b502-4b66-9afb-1746e5644c41-socket-dir\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.438086 master-0 kubenswrapper[33572]: I1204 22:27:24.437855 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-metrics-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.438086 master-0 kubenswrapper[33572]: I1204 22:27:24.437901 33572 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-apiservice-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.438086 master-0 kubenswrapper[33572]: I1204 22:27:24.437951 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtql\" (UniqueName: \"kubernetes.io/projected/e0a80dbd-b502-4b66-9afb-1746e5644c41-kube-api-access-jgtql\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.438345 master-0 kubenswrapper[33572]: I1204 22:27:24.438293 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-webhook-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.539480 master-0 kubenswrapper[33572]: I1204 22:27:24.539309 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-webhook-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.539480 master-0 kubenswrapper[33572]: I1204 22:27:24.539396 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a80dbd-b502-4b66-9afb-1746e5644c41-socket-dir\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.539480 master-0 kubenswrapper[33572]: I1204 22:27:24.539416 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-metrics-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.539480 master-0 kubenswrapper[33572]: I1204 22:27:24.539431 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-apiservice-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.539480 master-0 kubenswrapper[33572]: I1204 22:27:24.539462 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtql\" (UniqueName: \"kubernetes.io/projected/e0a80dbd-b502-4b66-9afb-1746e5644c41-kube-api-access-jgtql\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.540267 master-0 kubenswrapper[33572]: I1204 22:27:24.540201 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a80dbd-b502-4b66-9afb-1746e5644c41-socket-dir\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: 
\"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.544340 master-0 kubenswrapper[33572]: I1204 22:27:24.544276 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-webhook-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.545147 master-0 kubenswrapper[33572]: I1204 22:27:24.544423 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-metrics-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.545147 master-0 kubenswrapper[33572]: I1204 22:27:24.545043 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0a80dbd-b502-4b66-9afb-1746e5644c41-apiservice-cert\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.561452 master-0 kubenswrapper[33572]: I1204 22:27:24.561390 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgtql\" (UniqueName: \"kubernetes.io/projected/e0a80dbd-b502-4b66-9afb-1746e5644c41-kube-api-access-jgtql\") pod \"lvms-operator-77667f8d6-nvjzt\" (UID: \"e0a80dbd-b502-4b66-9afb-1746e5644c41\") " pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:24.603009 master-0 kubenswrapper[33572]: I1204 22:27:24.602907 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tssm5" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="registry-server" probeResult="failure" output=< Dec 04 22:27:24.603009 master-0 kubenswrapper[33572]: timeout: failed to connect service ":50051" within 1s Dec 04 22:27:24.603009 master-0 kubenswrapper[33572]: > Dec 04 22:27:24.626779 master-0 kubenswrapper[33572]: I1204 22:27:24.626693 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:25.156949 master-0 kubenswrapper[33572]: I1204 22:27:25.156868 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-77667f8d6-nvjzt"] Dec 04 22:27:25.162742 master-0 kubenswrapper[33572]: W1204 22:27:25.162671 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0a80dbd_b502_4b66_9afb_1746e5644c41.slice/crio-13b2fdef6af5194a4a2d268d57eb88efc63905dce698036dd29d18efdc2719ae WatchSource:0}: Error finding container 13b2fdef6af5194a4a2d268d57eb88efc63905dce698036dd29d18efdc2719ae: Status 404 returned error can't find the container with id 13b2fdef6af5194a4a2d268d57eb88efc63905dce698036dd29d18efdc2719ae Dec 04 22:27:25.359214 master-0 kubenswrapper[33572]: I1204 22:27:25.359155 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" event={"ID":"e0a80dbd-b502-4b66-9afb-1746e5644c41","Type":"ContainerStarted","Data":"13b2fdef6af5194a4a2d268d57eb88efc63905dce698036dd29d18efdc2719ae"} Dec 04 22:27:32.446787 master-0 kubenswrapper[33572]: I1204 22:27:32.446601 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" event={"ID":"e0a80dbd-b502-4b66-9afb-1746e5644c41","Type":"ContainerStarted","Data":"6153a36ee582e09743e18ec203c49bf01d6061881034631970ff30305130ce99"} Dec 04 22:27:32.447650 master-0 kubenswrapper[33572]: I1204 22:27:32.447003 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:32.454443 master-0 kubenswrapper[33572]: I1204 22:27:32.452537 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" Dec 04 22:27:32.481021 master-0 kubenswrapper[33572]: I1204 22:27:32.480228 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-77667f8d6-nvjzt" podStartSLOduration=1.9055869429999999 podStartE2EDuration="8.480196094s" podCreationTimestamp="2025-12-04 22:27:24 +0000 UTC" firstStartedPulling="2025-12-04 22:27:25.166515841 +0000 UTC m=+508.894041490" lastFinishedPulling="2025-12-04 22:27:31.741124952 +0000 UTC m=+515.468650641" observedRunningTime="2025-12-04 22:27:32.47969144 +0000 UTC m=+516.207217099" watchObservedRunningTime="2025-12-04 22:27:32.480196094 +0000 UTC m=+516.207721783" Dec 04 22:27:33.620065 master-0 kubenswrapper[33572]: I1204 22:27:33.620008 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:33.683100 master-0 kubenswrapper[33572]: I1204 22:27:33.683039 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:36.353266 master-0 kubenswrapper[33572]: I1204 22:27:36.353191 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tssm5"] Dec 04 22:27:36.354647 master-0 kubenswrapper[33572]: I1204 22:27:36.354597 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tssm5" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="registry-server" containerID="cri-o://dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82" gracePeriod=2 Dec 04 22:27:36.840187 master-0 
kubenswrapper[33572]: I1204 22:27:36.840095 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494"] Dec 04 22:27:36.852758 master-0 kubenswrapper[33572]: I1204 22:27:36.852654 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494"] Dec 04 22:27:36.852913 master-0 kubenswrapper[33572]: I1204 22:27:36.852855 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:36.864341 master-0 kubenswrapper[33572]: I1204 22:27:36.864127 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-fmctc" Dec 04 22:27:36.870128 master-0 kubenswrapper[33572]: I1204 22:27:36.869767 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:36.870128 master-0 kubenswrapper[33572]: I1204 22:27:36.869892 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:36.870128 master-0 kubenswrapper[33572]: I1204 22:27:36.869938 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdsb7\" (UniqueName: \"kubernetes.io/projected/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-kube-api-access-cdsb7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:36.900944 master-0 kubenswrapper[33572]: I1204 22:27:36.900533 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:36.970368 master-0 kubenswrapper[33572]: I1204 22:27:36.970300 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-utilities\") pod \"3d960c97-d33f-4537-865b-72cc6457310b\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " Dec 04 22:27:36.971285 master-0 kubenswrapper[33572]: I1204 22:27:36.970407 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvn8p\" (UniqueName: \"kubernetes.io/projected/3d960c97-d33f-4537-865b-72cc6457310b-kube-api-access-rvn8p\") pod \"3d960c97-d33f-4537-865b-72cc6457310b\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " Dec 04 22:27:36.971285 master-0 kubenswrapper[33572]: I1204 22:27:36.970434 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-catalog-content\") pod \"3d960c97-d33f-4537-865b-72cc6457310b\" (UID: \"3d960c97-d33f-4537-865b-72cc6457310b\") " Dec 04 22:27:36.971285 master-0 kubenswrapper[33572]: I1204 22:27:36.970616 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:36.971285 master-0 kubenswrapper[33572]: I1204 22:27:36.970669 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:36.971819 master-0 kubenswrapper[33572]: I1204 22:27:36.971622 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdsb7\" (UniqueName: \"kubernetes.io/projected/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-kube-api-access-cdsb7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:36.972482 master-0 kubenswrapper[33572]: I1204 22:27:36.972153 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:36.972482 master-0 kubenswrapper[33572]: I1204 22:27:36.972190 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 
22:27:36.972745 master-0 kubenswrapper[33572]: I1204 22:27:36.972489 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-utilities" (OuterVolumeSpecName: "utilities") pod "3d960c97-d33f-4537-865b-72cc6457310b" (UID: "3d960c97-d33f-4537-865b-72cc6457310b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:36.976704 master-0 kubenswrapper[33572]: I1204 22:27:36.974206 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d960c97-d33f-4537-865b-72cc6457310b-kube-api-access-rvn8p" (OuterVolumeSpecName: "kube-api-access-rvn8p") pod "3d960c97-d33f-4537-865b-72cc6457310b" (UID: "3d960c97-d33f-4537-865b-72cc6457310b"). InnerVolumeSpecName "kube-api-access-rvn8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:27:36.989154 master-0 kubenswrapper[33572]: I1204 22:27:36.989072 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdsb7\" (UniqueName: \"kubernetes.io/projected/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-kube-api-access-cdsb7\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:37.072974 master-0 kubenswrapper[33572]: I1204 22:27:37.072853 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d960c97-d33f-4537-865b-72cc6457310b" (UID: "3d960c97-d33f-4537-865b-72cc6457310b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:37.075866 master-0 kubenswrapper[33572]: I1204 22:27:37.075348 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:37.075866 master-0 kubenswrapper[33572]: I1204 22:27:37.075402 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvn8p\" (UniqueName: \"kubernetes.io/projected/3d960c97-d33f-4537-865b-72cc6457310b-kube-api-access-rvn8p\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:37.075866 master-0 kubenswrapper[33572]: I1204 22:27:37.075451 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d960c97-d33f-4537-865b-72cc6457310b-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:37.210053 master-0 kubenswrapper[33572]: I1204 22:27:37.208067 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:37.252350 master-0 kubenswrapper[33572]: I1204 22:27:37.250445 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v"] Dec 04 22:27:37.252350 master-0 kubenswrapper[33572]: E1204 22:27:37.251073 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="extract-utilities" Dec 04 22:27:37.252350 master-0 kubenswrapper[33572]: I1204 22:27:37.251107 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="extract-utilities" Dec 04 22:27:37.252350 master-0 kubenswrapper[33572]: E1204 22:27:37.251161 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="extract-content" Dec 04 22:27:37.252350 master-0 kubenswrapper[33572]: I1204 22:27:37.251175 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="extract-content" Dec 04 22:27:37.252350 master-0 kubenswrapper[33572]: E1204 22:27:37.251242 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="registry-server" Dec 04 22:27:37.252350 master-0 kubenswrapper[33572]: I1204 22:27:37.251256 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="registry-server" Dec 04 22:27:37.252350 master-0 kubenswrapper[33572]: I1204 22:27:37.251734 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d960c97-d33f-4537-865b-72cc6457310b" containerName="registry-server" Dec 04 22:27:37.256242 master-0 kubenswrapper[33572]: I1204 22:27:37.255950 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.261251 master-0 kubenswrapper[33572]: I1204 22:27:37.261161 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v"] Dec 04 22:27:37.280704 master-0 kubenswrapper[33572]: I1204 22:27:37.280648 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfhrw\" (UniqueName: \"kubernetes.io/projected/47455326-9604-4582-bdec-78d65bf273ba-kube-api-access-rfhrw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.280918 master-0 kubenswrapper[33572]: I1204 22:27:37.280714 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.280918 master-0 kubenswrapper[33572]: I1204 22:27:37.280830 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.382582 master-0 kubenswrapper[33572]: I1204 22:27:37.382494 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.382582 master-0 kubenswrapper[33572]: I1204 22:27:37.382572 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfhrw\" (UniqueName: \"kubernetes.io/projected/47455326-9604-4582-bdec-78d65bf273ba-kube-api-access-rfhrw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.383236 master-0 kubenswrapper[33572]: I1204 22:27:37.382630 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.383236 master-0 kubenswrapper[33572]: I1204 22:27:37.383011 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-util\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.383236 master-0 kubenswrapper[33572]: I1204 22:27:37.383050 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.494615 master-0 kubenswrapper[33572]: I1204 22:27:37.494403 33572 generic.go:334] "Generic (PLEG): container finished" podID="3d960c97-d33f-4537-865b-72cc6457310b" containerID="dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82" exitCode=0 Dec 04 22:27:37.494615 master-0 kubenswrapper[33572]: I1204 22:27:37.494468 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tssm5" event={"ID":"3d960c97-d33f-4537-865b-72cc6457310b","Type":"ContainerDied","Data":"dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82"} Dec 04 22:27:37.494615 master-0 kubenswrapper[33572]: I1204 22:27:37.494484 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tssm5" Dec 04 22:27:37.494615 master-0 kubenswrapper[33572]: I1204 22:27:37.494564 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tssm5" event={"ID":"3d960c97-d33f-4537-865b-72cc6457310b","Type":"ContainerDied","Data":"5f3902730ccf9a944f3d7871e871352d293fa657349c01fef6a9c11d9e688115"} Dec 04 22:27:37.494615 master-0 kubenswrapper[33572]: I1204 22:27:37.494597 33572 scope.go:117] "RemoveContainer" containerID="dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82" Dec 04 22:27:37.528931 master-0 kubenswrapper[33572]: I1204 22:27:37.528864 33572 scope.go:117] "RemoveContainer" containerID="bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c" Dec 04 22:27:37.564320 master-0 kubenswrapper[33572]: I1204 22:27:37.564242 33572 scope.go:117] "RemoveContainer" containerID="086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e" Dec 04 22:27:37.594745 master-0 kubenswrapper[33572]: I1204 22:27:37.594568 33572 scope.go:117] "RemoveContainer" containerID="dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82" Dec 04 22:27:37.595197 master-0 kubenswrapper[33572]: E1204 22:27:37.595157 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82\": container with ID starting with dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82 not found: ID does not exist" containerID="dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82" Dec 04 22:27:37.595339 master-0 kubenswrapper[33572]: I1204 22:27:37.595206 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82"} err="failed to get container status \"dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82\": rpc error: code = NotFound desc = could not find container \"dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82\": 
container with ID starting with dae85f982f282de2db1481f3217e4306ff6455b91baca2ea290164da66456e82 not found: ID does not exist" Dec 04 22:27:37.595339 master-0 kubenswrapper[33572]: I1204 22:27:37.595235 33572 scope.go:117] "RemoveContainer" containerID="bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c" Dec 04 22:27:37.595798 master-0 kubenswrapper[33572]: E1204 22:27:37.595742 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c\": container with ID starting with bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c not found: ID does not exist" containerID="bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c" Dec 04 22:27:37.595798 master-0 kubenswrapper[33572]: I1204 22:27:37.595770 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c"} err="failed to get container status \"bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c\": rpc error: code = NotFound desc = could not find container \"bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c\": container with ID starting with bd4bdf8ecc697df5bc002936a387570c3964d7e6d5d0d7fe15589cacbf80e13c not found: ID does not exist" Dec 04 22:27:37.595798 master-0 kubenswrapper[33572]: I1204 22:27:37.595789 33572 scope.go:117] "RemoveContainer" containerID="086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e" Dec 04 22:27:37.596493 master-0 kubenswrapper[33572]: E1204 22:27:37.596381 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e\": container with ID starting with 086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e not found: ID does not exist" containerID="086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e" Dec 04 22:27:37.596493 master-0 kubenswrapper[33572]: I1204 22:27:37.596443 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e"} err="failed to get container status \"086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e\": rpc error: code = NotFound desc = could not find container \"086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e\": container with ID starting with 086bc395f3eb3719f9ccf68a985ae5f8f46a7e76cd7d1a3e0d16f9e2e24d199e not found: ID does not exist" Dec 04 22:27:37.866806 master-0 kubenswrapper[33572]: I1204 22:27:37.866725 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfhrw\" (UniqueName: \"kubernetes.io/projected/47455326-9604-4582-bdec-78d65bf273ba-kube-api-access-rfhrw\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.882977 master-0 kubenswrapper[33572]: I1204 22:27:37.882880 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tssm5"] Dec 04 22:27:37.886633 master-0 kubenswrapper[33572]: I1204 22:27:37.886566 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:37.897330 master-0 kubenswrapper[33572]: I1204 22:27:37.897238 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494"] Dec 04 22:27:37.913962 master-0 kubenswrapper[33572]: I1204 22:27:37.913885 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tssm5"] Dec 04 22:27:38.034105 master-0 kubenswrapper[33572]: I1204 22:27:38.033614 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp"] Dec 04 22:27:38.036554 master-0 kubenswrapper[33572]: I1204 22:27:38.036136 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.049620 master-0 kubenswrapper[33572]: I1204 22:27:38.049543 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp"] Dec 04 22:27:38.094329 master-0 kubenswrapper[33572]: I1204 22:27:38.094265 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.094983 master-0 kubenswrapper[33572]: I1204 22:27:38.094422 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh98j\" (UniqueName: \"kubernetes.io/projected/792a5b83-3fa6-4a57-a79e-d8a949d58903-kube-api-access-fh98j\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.094983 master-0 kubenswrapper[33572]: I1204 22:27:38.094462 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.196861 master-0 kubenswrapper[33572]: I1204 22:27:38.196792 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.196985 master-0 kubenswrapper[33572]: I1204 22:27:38.196929 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.197200 master-0 kubenswrapper[33572]: I1204 22:27:38.197156 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh98j\" (UniqueName: \"kubernetes.io/projected/792a5b83-3fa6-4a57-a79e-d8a949d58903-kube-api-access-fh98j\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.197391 master-0 kubenswrapper[33572]: I1204 22:27:38.197338 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.197467 master-0 kubenswrapper[33572]: I1204 22:27:38.197424 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.224857 master-0 kubenswrapper[33572]: I1204 22:27:38.224794 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh98j\" (UniqueName: \"kubernetes.io/projected/792a5b83-3fa6-4a57-a79e-d8a949d58903-kube-api-access-fh98j\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.377421 master-0 kubenswrapper[33572]: I1204 22:27:38.377342 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v"] Dec 04 22:27:38.378123 master-0 kubenswrapper[33572]: W1204 22:27:38.378040 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47455326_9604_4582_bdec_78d65bf273ba.slice/crio-1bbdbe4cf23eec10997fd4e9e6abcd963380d3e31fa28bc0eed63d46994fc037 WatchSource:0}: Error finding container 1bbdbe4cf23eec10997fd4e9e6abcd963380d3e31fa28bc0eed63d46994fc037: Status 404 returned error can't find the container with id 1bbdbe4cf23eec10997fd4e9e6abcd963380d3e31fa28bc0eed63d46994fc037 Dec 04 22:27:38.382270 master-0 kubenswrapper[33572]: I1204 22:27:38.382219 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:38.510290 master-0 kubenswrapper[33572]: I1204 22:27:38.509879 33572 generic.go:334] "Generic (PLEG): container finished" podID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerID="591f6d581d11a514bd328886dcaaaf76af8ee477195aafe2a1411e9758ef2e33" exitCode=0 Dec 04 22:27:38.510290 master-0 kubenswrapper[33572]: I1204 22:27:38.510006 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" event={"ID":"028dd693-a4d1-4b97-a882-9c4ab50a9ba0","Type":"ContainerDied","Data":"591f6d581d11a514bd328886dcaaaf76af8ee477195aafe2a1411e9758ef2e33"} Dec 04 22:27:38.523700 master-0 kubenswrapper[33572]: I1204 22:27:38.523603 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" event={"ID":"028dd693-a4d1-4b97-a882-9c4ab50a9ba0","Type":"ContainerStarted","Data":"0f794bc1446358c832c728a409b59885cfb06b9a7fbac8a09a2dd28051865397"} Dec 04 22:27:38.523882 master-0 kubenswrapper[33572]: I1204 22:27:38.523723 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" event={"ID":"47455326-9604-4582-bdec-78d65bf273ba","Type":"ContainerStarted","Data":"1bbdbe4cf23eec10997fd4e9e6abcd963380d3e31fa28bc0eed63d46994fc037"} Dec 04 22:27:38.541340 master-0 kubenswrapper[33572]: I1204 22:27:38.540613 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d960c97-d33f-4537-865b-72cc6457310b" path="/var/lib/kubelet/pods/3d960c97-d33f-4537-865b-72cc6457310b/volumes" Dec 04 22:27:38.909258 master-0 kubenswrapper[33572]: I1204 22:27:38.909142 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp"] Dec 04 22:27:38.916405 master-0 kubenswrapper[33572]: W1204 22:27:38.916249 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792a5b83_3fa6_4a57_a79e_d8a949d58903.slice/crio-e8375fa006751dbc4e9a9d53c0ec32f2469445da18af7e46b5318ba6dd0fa9f7 WatchSource:0}: Error finding container e8375fa006751dbc4e9a9d53c0ec32f2469445da18af7e46b5318ba6dd0fa9f7: Status 404 returned error can't find the container with id e8375fa006751dbc4e9a9d53c0ec32f2469445da18af7e46b5318ba6dd0fa9f7 Dec 04 22:27:39.527010 master-0 kubenswrapper[33572]: I1204 22:27:39.526937 33572 generic.go:334] "Generic (PLEG): container finished" podID="47455326-9604-4582-bdec-78d65bf273ba" containerID="bf143c9dbd5ca04305687bc5e86d115893674916deb0ad6dd820f3748eb81293" exitCode=0 Dec 04 22:27:39.527761 master-0 kubenswrapper[33572]: I1204 22:27:39.527065 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" event={"ID":"47455326-9604-4582-bdec-78d65bf273ba","Type":"ContainerDied","Data":"bf143c9dbd5ca04305687bc5e86d115893674916deb0ad6dd820f3748eb81293"} Dec 04 22:27:39.531091 master-0 kubenswrapper[33572]: I1204 22:27:39.531052 33572 generic.go:334] "Generic (PLEG): container finished" podID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerID="adcd2bcc6f481c423c01d75f777931337a363e94b9d2be8ba687cb18c6991201" exitCode=0 Dec 04 22:27:39.531186 master-0 kubenswrapper[33572]: I1204 22:27:39.531105 33572 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" event={"ID":"792a5b83-3fa6-4a57-a79e-d8a949d58903","Type":"ContainerDied","Data":"adcd2bcc6f481c423c01d75f777931337a363e94b9d2be8ba687cb18c6991201"} Dec 04 22:27:39.531186 master-0 kubenswrapper[33572]: I1204 22:27:39.531153 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" event={"ID":"792a5b83-3fa6-4a57-a79e-d8a949d58903","Type":"ContainerStarted","Data":"e8375fa006751dbc4e9a9d53c0ec32f2469445da18af7e46b5318ba6dd0fa9f7"} Dec 04 22:27:41.553401 master-0 kubenswrapper[33572]: I1204 22:27:41.553323 33572 generic.go:334] "Generic (PLEG): container finished" podID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerID="84c496c98b7da860590191ae0ad30f388efc0e2768fcd638a50711c532be7480" exitCode=0 Dec 04 22:27:41.553401 master-0 kubenswrapper[33572]: I1204 22:27:41.553374 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" event={"ID":"028dd693-a4d1-4b97-a882-9c4ab50a9ba0","Type":"ContainerDied","Data":"84c496c98b7da860590191ae0ad30f388efc0e2768fcd638a50711c532be7480"} Dec 04 22:27:42.564032 master-0 kubenswrapper[33572]: I1204 22:27:42.563959 33572 generic.go:334] "Generic (PLEG): container finished" podID="47455326-9604-4582-bdec-78d65bf273ba" containerID="f539497f43ef74ab6670bf1cc1132a42fa6de7d1e697c6a264a83bd623f1f777" exitCode=0 Dec 04 22:27:42.565074 master-0 kubenswrapper[33572]: I1204 22:27:42.564028 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" event={"ID":"47455326-9604-4582-bdec-78d65bf273ba","Type":"ContainerDied","Data":"f539497f43ef74ab6670bf1cc1132a42fa6de7d1e697c6a264a83bd623f1f777"} Dec 04 22:27:42.567085 master-0 kubenswrapper[33572]: I1204 22:27:42.566496 33572 generic.go:334] "Generic (PLEG): container finished" podID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerID="89d75f266ba2bfe3ba9e9e93d3604bfe0d50e57ac45321c3f4dd241d5ad6ff50" exitCode=0 Dec 04 22:27:42.567085 master-0 kubenswrapper[33572]: I1204 22:27:42.566748 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" event={"ID":"792a5b83-3fa6-4a57-a79e-d8a949d58903","Type":"ContainerDied","Data":"89d75f266ba2bfe3ba9e9e93d3604bfe0d50e57ac45321c3f4dd241d5ad6ff50"} Dec 04 22:27:42.570143 master-0 kubenswrapper[33572]: I1204 22:27:42.570092 33572 generic.go:334] "Generic (PLEG): container finished" podID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerID="a3ab3bad1a4e8ce9f924c4e8b7553c43ef5be7ddad196e9b57429c11af8c31c8" exitCode=0 Dec 04 22:27:42.570241 master-0 kubenswrapper[33572]: I1204 22:27:42.570157 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" event={"ID":"028dd693-a4d1-4b97-a882-9c4ab50a9ba0","Type":"ContainerDied","Data":"a3ab3bad1a4e8ce9f924c4e8b7553c43ef5be7ddad196e9b57429c11af8c31c8"} Dec 04 22:27:43.584099 master-0 kubenswrapper[33572]: I1204 22:27:43.583988 33572 generic.go:334] "Generic (PLEG): container finished" podID="47455326-9604-4582-bdec-78d65bf273ba" containerID="45528699e1118b2306536b68105cb59bbb33b1029cf8e3712c0b475bb8039595" exitCode=0 Dec 04 22:27:43.584099 master-0 
kubenswrapper[33572]: I1204 22:27:43.584057 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" event={"ID":"47455326-9604-4582-bdec-78d65bf273ba","Type":"ContainerDied","Data":"45528699e1118b2306536b68105cb59bbb33b1029cf8e3712c0b475bb8039595"} Dec 04 22:27:43.588706 master-0 kubenswrapper[33572]: I1204 22:27:43.588645 33572 generic.go:334] "Generic (PLEG): container finished" podID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerID="5c9931cafdcd2187d74f11c84f626ea1603a9902cdc603ef3f46bff1383e9e3c" exitCode=0 Dec 04 22:27:43.589028 master-0 kubenswrapper[33572]: I1204 22:27:43.588973 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" event={"ID":"792a5b83-3fa6-4a57-a79e-d8a949d58903","Type":"ContainerDied","Data":"5c9931cafdcd2187d74f11c84f626ea1603a9902cdc603ef3f46bff1383e9e3c"} Dec 04 22:27:44.020240 master-0 kubenswrapper[33572]: I1204 22:27:44.020130 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:44.101259 master-0 kubenswrapper[33572]: I1204 22:27:44.101166 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdsb7\" (UniqueName: \"kubernetes.io/projected/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-kube-api-access-cdsb7\") pod \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " Dec 04 22:27:44.101575 master-0 kubenswrapper[33572]: I1204 22:27:44.101414 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-bundle\") pod \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " Dec 04 22:27:44.101575 master-0 kubenswrapper[33572]: I1204 22:27:44.101483 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-util\") pod \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\" (UID: \"028dd693-a4d1-4b97-a882-9c4ab50a9ba0\") " Dec 04 22:27:44.104458 master-0 kubenswrapper[33572]: I1204 22:27:44.104385 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-bundle" (OuterVolumeSpecName: "bundle") pod "028dd693-a4d1-4b97-a882-9c4ab50a9ba0" (UID: "028dd693-a4d1-4b97-a882-9c4ab50a9ba0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:44.104924 master-0 kubenswrapper[33572]: I1204 22:27:44.104830 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-kube-api-access-cdsb7" (OuterVolumeSpecName: "kube-api-access-cdsb7") pod "028dd693-a4d1-4b97-a882-9c4ab50a9ba0" (UID: "028dd693-a4d1-4b97-a882-9c4ab50a9ba0"). InnerVolumeSpecName "kube-api-access-cdsb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:27:44.113632 master-0 kubenswrapper[33572]: I1204 22:27:44.113560 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-util" (OuterVolumeSpecName: "util") pod "028dd693-a4d1-4b97-a882-9c4ab50a9ba0" (UID: "028dd693-a4d1-4b97-a882-9c4ab50a9ba0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:44.204347 master-0 kubenswrapper[33572]: I1204 22:27:44.204150 33572 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:44.204347 master-0 kubenswrapper[33572]: I1204 22:27:44.204222 33572 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-util\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:44.204347 master-0 kubenswrapper[33572]: I1204 22:27:44.204246 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdsb7\" (UniqueName: \"kubernetes.io/projected/028dd693-a4d1-4b97-a882-9c4ab50a9ba0-kube-api-access-cdsb7\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:44.602270 master-0 kubenswrapper[33572]: I1204 22:27:44.602206 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" event={"ID":"028dd693-a4d1-4b97-a882-9c4ab50a9ba0","Type":"ContainerDied","Data":"0f794bc1446358c832c728a409b59885cfb06b9a7fbac8a09a2dd28051865397"} Dec 04 22:27:44.602270 master-0 kubenswrapper[33572]: I1204 22:27:44.602259 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f794bc1446358c832c728a409b59885cfb06b9a7fbac8a09a2dd28051865397" Dec 04 22:27:44.603237 master-0 kubenswrapper[33572]: I1204 22:27:44.602321 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a5p494" Dec 04 22:27:45.021723 master-0 kubenswrapper[33572]: I1204 22:27:45.021658 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:45.090767 master-0 kubenswrapper[33572]: I1204 22:27:45.090478 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:45.120622 master-0 kubenswrapper[33572]: I1204 22:27:45.120490 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-bundle\") pod \"792a5b83-3fa6-4a57-a79e-d8a949d58903\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " Dec 04 22:27:45.120622 master-0 kubenswrapper[33572]: I1204 22:27:45.120624 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-util\") pod \"47455326-9604-4582-bdec-78d65bf273ba\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " Dec 04 22:27:45.120942 master-0 kubenswrapper[33572]: I1204 22:27:45.120729 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfhrw\" (UniqueName: \"kubernetes.io/projected/47455326-9604-4582-bdec-78d65bf273ba-kube-api-access-rfhrw\") pod \"47455326-9604-4582-bdec-78d65bf273ba\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " Dec 04 22:27:45.120942 master-0 kubenswrapper[33572]: I1204 22:27:45.120797 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-bundle\") pod \"47455326-9604-4582-bdec-78d65bf273ba\" (UID: \"47455326-9604-4582-bdec-78d65bf273ba\") " Dec 04 22:27:45.120942 master-0 kubenswrapper[33572]: I1204 22:27:45.120838 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-util\") pod \"792a5b83-3fa6-4a57-a79e-d8a949d58903\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " Dec 04 22:27:45.120942 master-0 kubenswrapper[33572]: I1204 22:27:45.120918 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh98j\" (UniqueName: \"kubernetes.io/projected/792a5b83-3fa6-4a57-a79e-d8a949d58903-kube-api-access-fh98j\") pod \"792a5b83-3fa6-4a57-a79e-d8a949d58903\" (UID: \"792a5b83-3fa6-4a57-a79e-d8a949d58903\") " Dec 04 22:27:45.121919 master-0 kubenswrapper[33572]: I1204 22:27:45.121820 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-bundle" (OuterVolumeSpecName: "bundle") pod "792a5b83-3fa6-4a57-a79e-d8a949d58903" (UID: "792a5b83-3fa6-4a57-a79e-d8a949d58903"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:45.122640 master-0 kubenswrapper[33572]: I1204 22:27:45.122577 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-bundle" (OuterVolumeSpecName: "bundle") pod "47455326-9604-4582-bdec-78d65bf273ba" (UID: "47455326-9604-4582-bdec-78d65bf273ba"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:45.126618 master-0 kubenswrapper[33572]: I1204 22:27:45.126569 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47455326-9604-4582-bdec-78d65bf273ba-kube-api-access-rfhrw" (OuterVolumeSpecName: "kube-api-access-rfhrw") pod "47455326-9604-4582-bdec-78d65bf273ba" (UID: "47455326-9604-4582-bdec-78d65bf273ba"). 
InnerVolumeSpecName "kube-api-access-rfhrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:27:45.126832 master-0 kubenswrapper[33572]: I1204 22:27:45.126622 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792a5b83-3fa6-4a57-a79e-d8a949d58903-kube-api-access-fh98j" (OuterVolumeSpecName: "kube-api-access-fh98j") pod "792a5b83-3fa6-4a57-a79e-d8a949d58903" (UID: "792a5b83-3fa6-4a57-a79e-d8a949d58903"). InnerVolumeSpecName "kube-api-access-fh98j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:27:45.143947 master-0 kubenswrapper[33572]: I1204 22:27:45.143830 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-util" (OuterVolumeSpecName: "util") pod "47455326-9604-4582-bdec-78d65bf273ba" (UID: "47455326-9604-4582-bdec-78d65bf273ba"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:45.223533 master-0 kubenswrapper[33572]: I1204 22:27:45.223449 33572 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:45.223533 master-0 kubenswrapper[33572]: I1204 22:27:45.223536 33572 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-util\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:45.223868 master-0 kubenswrapper[33572]: I1204 22:27:45.223558 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfhrw\" (UniqueName: \"kubernetes.io/projected/47455326-9604-4582-bdec-78d65bf273ba-kube-api-access-rfhrw\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:45.223868 master-0 kubenswrapper[33572]: I1204 22:27:45.223583 33572 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/47455326-9604-4582-bdec-78d65bf273ba-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:45.223868 master-0 kubenswrapper[33572]: I1204 22:27:45.223602 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh98j\" (UniqueName: \"kubernetes.io/projected/792a5b83-3fa6-4a57-a79e-d8a949d58903-kube-api-access-fh98j\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:45.390605 master-0 kubenswrapper[33572]: I1204 22:27:45.390180 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-util" (OuterVolumeSpecName: "util") pod "792a5b83-3fa6-4a57-a79e-d8a949d58903" (UID: "792a5b83-3fa6-4a57-a79e-d8a949d58903"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:27:45.427565 master-0 kubenswrapper[33572]: I1204 22:27:45.427472 33572 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/792a5b83-3fa6-4a57-a79e-d8a949d58903-util\") on node \"master-0\" DevicePath \"\"" Dec 04 22:27:45.615931 master-0 kubenswrapper[33572]: I1204 22:27:45.615852 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" Dec 04 22:27:45.616574 master-0 kubenswrapper[33572]: I1204 22:27:45.615845 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ffgbkp" event={"ID":"792a5b83-3fa6-4a57-a79e-d8a949d58903","Type":"ContainerDied","Data":"e8375fa006751dbc4e9a9d53c0ec32f2469445da18af7e46b5318ba6dd0fa9f7"} Dec 04 22:27:45.616574 master-0 kubenswrapper[33572]: I1204 22:27:45.616331 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8375fa006751dbc4e9a9d53c0ec32f2469445da18af7e46b5318ba6dd0fa9f7" Dec 04 22:27:45.620710 master-0 kubenswrapper[33572]: I1204 22:27:45.620657 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" event={"ID":"47455326-9604-4582-bdec-78d65bf273ba","Type":"ContainerDied","Data":"1bbdbe4cf23eec10997fd4e9e6abcd963380d3e31fa28bc0eed63d46994fc037"} Dec 04 22:27:45.620793 master-0 kubenswrapper[33572]: I1204 22:27:45.620713 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbdbe4cf23eec10997fd4e9e6abcd963380d3e31fa28bc0eed63d46994fc037" Dec 04 22:27:45.620877 master-0 kubenswrapper[33572]: I1204 22:27:45.620842 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83wwk7v" Dec 04 22:27:49.860763 master-0 kubenswrapper[33572]: I1204 22:27:49.860704 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8"] Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861028 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerName="util" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861046 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerName="util" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861067 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerName="extract" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861075 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerName="extract" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861089 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerName="util" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861097 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerName="util" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861118 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47455326-9604-4582-bdec-78d65bf273ba" containerName="pull" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861126 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47455326-9604-4582-bdec-78d65bf273ba" containerName="pull" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861136 33572 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerName="extract" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861144 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerName="extract" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861155 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerName="pull" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861162 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerName="pull" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861175 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47455326-9604-4582-bdec-78d65bf273ba" containerName="util" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861183 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47455326-9604-4582-bdec-78d65bf273ba" containerName="util" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861206 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerName="pull" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861213 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerName="pull" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: E1204 22:27:49.861227 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47455326-9604-4582-bdec-78d65bf273ba" containerName="extract" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861234 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="47455326-9604-4582-bdec-78d65bf273ba" containerName="extract" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861408 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="47455326-9604-4582-bdec-78d65bf273ba" containerName="extract" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861437 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="792a5b83-3fa6-4a57-a79e-d8a949d58903" containerName="extract" Dec 04 22:27:49.861495 master-0 kubenswrapper[33572]: I1204 22:27:49.861456 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="028dd693-a4d1-4b97-a882-9c4ab50a9ba0" containerName="extract" Dec 04 22:27:49.862381 master-0 kubenswrapper[33572]: I1204 22:27:49.862136 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" Dec 04 22:27:49.864580 master-0 kubenswrapper[33572]: I1204 22:27:49.864547 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 04 22:27:49.865446 master-0 kubenswrapper[33572]: I1204 22:27:49.865397 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 04 22:27:49.884953 master-0 kubenswrapper[33572]: I1204 22:27:49.884894 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8"] Dec 04 22:27:49.919535 master-0 kubenswrapper[33572]: I1204 22:27:49.919468 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c285912d-f814-42a4-ab19-9f6a0ddae92d-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gxzw8\" (UID: \"c285912d-f814-42a4-ab19-9f6a0ddae92d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" Dec 04 22:27:49.919859 master-0 kubenswrapper[33572]: I1204 22:27:49.919829 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd68d\" (UniqueName: \"kubernetes.io/projected/c285912d-f814-42a4-ab19-9f6a0ddae92d-kube-api-access-dd68d\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gxzw8\" (UID: \"c285912d-f814-42a4-ab19-9f6a0ddae92d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" Dec 04 22:27:50.020969 master-0 kubenswrapper[33572]: I1204 22:27:50.020906 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c285912d-f814-42a4-ab19-9f6a0ddae92d-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gxzw8\" (UID: \"c285912d-f814-42a4-ab19-9f6a0ddae92d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" Dec 04 22:27:50.021174 master-0 kubenswrapper[33572]: I1204 22:27:50.021045 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd68d\" (UniqueName: \"kubernetes.io/projected/c285912d-f814-42a4-ab19-9f6a0ddae92d-kube-api-access-dd68d\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gxzw8\" (UID: \"c285912d-f814-42a4-ab19-9f6a0ddae92d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" Dec 04 22:27:50.022570 master-0 kubenswrapper[33572]: I1204 22:27:50.021603 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c285912d-f814-42a4-ab19-9f6a0ddae92d-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gxzw8\" (UID: \"c285912d-f814-42a4-ab19-9f6a0ddae92d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" Dec 04 22:27:50.046357 master-0 kubenswrapper[33572]: I1204 22:27:50.046318 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd68d\" (UniqueName: \"kubernetes.io/projected/c285912d-f814-42a4-ab19-9f6a0ddae92d-kube-api-access-dd68d\") pod \"cert-manager-operator-controller-manager-64cf6dff88-gxzw8\" (UID: \"c285912d-f814-42a4-ab19-9f6a0ddae92d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" 
Dec 04 22:27:50.188736 master-0 kubenswrapper[33572]: I1204 22:27:50.188617 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" Dec 04 22:27:52.977263 master-0 kubenswrapper[33572]: I1204 22:27:52.977191 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8"] Dec 04 22:27:52.997576 master-0 kubenswrapper[33572]: W1204 22:27:52.997478 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc285912d_f814_42a4_ab19_9f6a0ddae92d.slice/crio-3affa3f74386d9b5e87cd3a319c06a0758955033a9ed5de96b4e323ed4cd1b99 WatchSource:0}: Error finding container 3affa3f74386d9b5e87cd3a319c06a0758955033a9ed5de96b4e323ed4cd1b99: Status 404 returned error can't find the container with id 3affa3f74386d9b5e87cd3a319c06a0758955033a9ed5de96b4e323ed4cd1b99 Dec 04 22:27:53.685606 master-0 kubenswrapper[33572]: I1204 22:27:53.685523 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" event={"ID":"c285912d-f814-42a4-ab19-9f6a0ddae92d","Type":"ContainerStarted","Data":"3affa3f74386d9b5e87cd3a319c06a0758955033a9ed5de96b4e323ed4cd1b99"} Dec 04 22:27:54.012412 master-0 kubenswrapper[33572]: I1204 22:27:54.012281 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl"] Dec 04 22:27:54.014078 master-0 kubenswrapper[33572]: I1204 22:27:54.014038 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.020605 master-0 kubenswrapper[33572]: I1204 22:27:54.020547 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-fmctc" Dec 04 22:27:54.027419 master-0 kubenswrapper[33572]: I1204 22:27:54.027362 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl"] Dec 04 22:27:54.134416 master-0 kubenswrapper[33572]: I1204 22:27:54.134342 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.134655 master-0 kubenswrapper[33572]: I1204 22:27:54.134460 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.134655 master-0 kubenswrapper[33572]: I1204 22:27:54.134493 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n56g8\" (UniqueName: \"kubernetes.io/projected/c370c932-c544-4ccb-918c-1f87ae66d2d2-kube-api-access-n56g8\") pod 
\"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.236014 master-0 kubenswrapper[33572]: I1204 22:27:54.235932 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.236269 master-0 kubenswrapper[33572]: I1204 22:27:54.236038 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n56g8\" (UniqueName: \"kubernetes.io/projected/c370c932-c544-4ccb-918c-1f87ae66d2d2-kube-api-access-n56g8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.236269 master-0 kubenswrapper[33572]: I1204 22:27:54.236214 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.236744 master-0 kubenswrapper[33572]: I1204 22:27:54.236703 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.236919 master-0 kubenswrapper[33572]: I1204 22:27:54.236879 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.272538 master-0 kubenswrapper[33572]: I1204 22:27:54.271563 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n56g8\" (UniqueName: \"kubernetes.io/projected/c370c932-c544-4ccb-918c-1f87ae66d2d2-kube-api-access-n56g8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.334203 master-0 kubenswrapper[33572]: I1204 22:27:54.333786 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:27:54.759867 master-0 kubenswrapper[33572]: I1204 22:27:54.759794 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl"] Dec 04 22:27:54.763288 master-0 kubenswrapper[33572]: W1204 22:27:54.763212 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc370c932_c544_4ccb_918c_1f87ae66d2d2.slice/crio-a1f908e75fe17d58e54d88cc3b5fdd1f8795e0caebebc13a5a561b6b930b54e4 WatchSource:0}: Error finding container a1f908e75fe17d58e54d88cc3b5fdd1f8795e0caebebc13a5a561b6b930b54e4: Status 404 returned error can't find the container with id a1f908e75fe17d58e54d88cc3b5fdd1f8795e0caebebc13a5a561b6b930b54e4 Dec 04 22:27:55.083757 master-0 kubenswrapper[33572]: I1204 22:27:55.083701 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr"] Dec 04 22:27:55.085027 master-0 kubenswrapper[33572]: I1204 22:27:55.085000 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr" Dec 04 22:27:55.087241 master-0 kubenswrapper[33572]: I1204 22:27:55.087207 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 04 22:27:55.087431 master-0 kubenswrapper[33572]: I1204 22:27:55.087405 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 04 22:27:55.104401 master-0 kubenswrapper[33572]: I1204 22:27:55.104259 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr"] Dec 04 22:27:55.161258 master-0 kubenswrapper[33572]: I1204 22:27:55.161199 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlz7c\" (UniqueName: \"kubernetes.io/projected/103e82e6-9c8a-497b-9c46-bec48796d1a0-kube-api-access-hlz7c\") pod \"nmstate-operator-5b5b58f5c8-n77lr\" (UID: \"103e82e6-9c8a-497b-9c46-bec48796d1a0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr" Dec 04 22:27:55.262732 master-0 kubenswrapper[33572]: I1204 22:27:55.262624 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlz7c\" (UniqueName: \"kubernetes.io/projected/103e82e6-9c8a-497b-9c46-bec48796d1a0-kube-api-access-hlz7c\") pod \"nmstate-operator-5b5b58f5c8-n77lr\" (UID: \"103e82e6-9c8a-497b-9c46-bec48796d1a0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr" Dec 04 22:27:55.283626 master-0 kubenswrapper[33572]: I1204 22:27:55.283521 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlz7c\" (UniqueName: \"kubernetes.io/projected/103e82e6-9c8a-497b-9c46-bec48796d1a0-kube-api-access-hlz7c\") pod \"nmstate-operator-5b5b58f5c8-n77lr\" (UID: \"103e82e6-9c8a-497b-9c46-bec48796d1a0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr" Dec 04 22:27:55.449149 master-0 kubenswrapper[33572]: I1204 22:27:55.449073 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr" Dec 04 22:27:55.717927 master-0 kubenswrapper[33572]: I1204 22:27:55.717864 33572 generic.go:334] "Generic (PLEG): container finished" podID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerID="1237080962e8e9bfe0264c09fe300f98dad9679695ce190c74477f629326bd76" exitCode=0 Dec 04 22:27:55.717927 master-0 kubenswrapper[33572]: I1204 22:27:55.717928 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" event={"ID":"c370c932-c544-4ccb-918c-1f87ae66d2d2","Type":"ContainerDied","Data":"1237080962e8e9bfe0264c09fe300f98dad9679695ce190c74477f629326bd76"} Dec 04 22:27:55.718183 master-0 kubenswrapper[33572]: I1204 22:27:55.717966 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" event={"ID":"c370c932-c544-4ccb-918c-1f87ae66d2d2","Type":"ContainerStarted","Data":"a1f908e75fe17d58e54d88cc3b5fdd1f8795e0caebebc13a5a561b6b930b54e4"} Dec 04 22:27:55.988676 master-0 kubenswrapper[33572]: I1204 22:27:55.988605 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr"] Dec 04 22:27:55.992556 master-0 kubenswrapper[33572]: W1204 22:27:55.992483 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod103e82e6_9c8a_497b_9c46_bec48796d1a0.slice/crio-0f3800954a555f43b68f98c7482852b24072aea74f6d5f74ebe4105c493f9f6f WatchSource:0}: Error finding container 0f3800954a555f43b68f98c7482852b24072aea74f6d5f74ebe4105c493f9f6f: Status 404 returned error can't find the container with id 0f3800954a555f43b68f98c7482852b24072aea74f6d5f74ebe4105c493f9f6f Dec 04 22:27:56.728796 master-0 kubenswrapper[33572]: I1204 22:27:56.728643 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr" event={"ID":"103e82e6-9c8a-497b-9c46-bec48796d1a0","Type":"ContainerStarted","Data":"0f3800954a555f43b68f98c7482852b24072aea74f6d5f74ebe4105c493f9f6f"} Dec 04 22:27:57.279203 master-0 kubenswrapper[33572]: I1204 22:27:57.279118 33572 scope.go:117] "RemoveContainer" containerID="39aacc773fddb0383604f8a27ba1b199b302e4f4ede41fa8f08e464ed1607b81" Dec 04 22:28:02.793067 master-0 kubenswrapper[33572]: I1204 22:28:02.793004 33572 generic.go:334] "Generic (PLEG): container finished" podID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerID="51fe9ef9d715887f20034b25f6a34a34375af47b538f7f004b084b79d09be5e4" exitCode=0 Dec 04 22:28:02.793748 master-0 kubenswrapper[33572]: I1204 22:28:02.793076 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" event={"ID":"c370c932-c544-4ccb-918c-1f87ae66d2d2","Type":"ContainerDied","Data":"51fe9ef9d715887f20034b25f6a34a34375af47b538f7f004b084b79d09be5e4"} Dec 04 22:28:02.795192 master-0 kubenswrapper[33572]: I1204 22:28:02.795147 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" event={"ID":"c285912d-f814-42a4-ab19-9f6a0ddae92d","Type":"ContainerStarted","Data":"9f63f8bcb5247756147e275249127ed9c06bc949c106d84cd2e30ffd25c2d9f7"} Dec 04 22:28:02.801981 master-0 kubenswrapper[33572]: I1204 22:28:02.801945 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr" event={"ID":"103e82e6-9c8a-497b-9c46-bec48796d1a0","Type":"ContainerStarted","Data":"732e175c580f8e4fd3c88a17208b2fe7a6f75d8a5cc5c557b5bff1adfd7195ee"} Dec 04 22:28:02.850476 master-0 kubenswrapper[33572]: I1204 22:28:02.850285 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-n77lr" podStartSLOduration=1.530398651 podStartE2EDuration="7.850260796s" podCreationTimestamp="2025-12-04 22:27:55 +0000 UTC" firstStartedPulling="2025-12-04 22:27:55.995617736 +0000 UTC m=+539.723143385" lastFinishedPulling="2025-12-04 22:28:02.315479891 +0000 UTC m=+546.043005530" observedRunningTime="2025-12-04 22:28:02.840214377 +0000 UTC m=+546.567740026" watchObservedRunningTime="2025-12-04 22:28:02.850260796 +0000 UTC m=+546.577786445" Dec 04 22:28:02.894373 master-0 kubenswrapper[33572]: I1204 22:28:02.894288 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-gxzw8" podStartSLOduration=4.676544675 podStartE2EDuration="13.894263446s" podCreationTimestamp="2025-12-04 22:27:49 +0000 UTC" firstStartedPulling="2025-12-04 22:27:53.002669402 +0000 UTC m=+536.730195051" lastFinishedPulling="2025-12-04 22:28:02.220388173 +0000 UTC m=+545.947913822" observedRunningTime="2025-12-04 22:28:02.893215987 +0000 UTC m=+546.620741636" watchObservedRunningTime="2025-12-04 22:28:02.894263446 +0000 UTC m=+546.621789095" Dec 04 22:28:03.815341 master-0 kubenswrapper[33572]: I1204 22:28:03.815271 33572 generic.go:334] "Generic (PLEG): container finished" podID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerID="7f7250899c0728d00cb8d471baf39d955f2148cc0b3cc7689ab54d573ffd3d3c" exitCode=0 Dec 04 22:28:03.815341 master-0 kubenswrapper[33572]: I1204 22:28:03.815323 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" event={"ID":"c370c932-c544-4ccb-918c-1f87ae66d2d2","Type":"ContainerDied","Data":"7f7250899c0728d00cb8d471baf39d955f2148cc0b3cc7689ab54d573ffd3d3c"} Dec 04 22:28:05.161563 master-0 kubenswrapper[33572]: I1204 22:28:05.161490 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:28:05.287544 master-0 kubenswrapper[33572]: I1204 22:28:05.287211 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n56g8\" (UniqueName: \"kubernetes.io/projected/c370c932-c544-4ccb-918c-1f87ae66d2d2-kube-api-access-n56g8\") pod \"c370c932-c544-4ccb-918c-1f87ae66d2d2\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " Dec 04 22:28:05.287544 master-0 kubenswrapper[33572]: I1204 22:28:05.287390 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-bundle\") pod \"c370c932-c544-4ccb-918c-1f87ae66d2d2\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " Dec 04 22:28:05.287961 master-0 kubenswrapper[33572]: I1204 22:28:05.287590 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-util\") pod \"c370c932-c544-4ccb-918c-1f87ae66d2d2\" (UID: \"c370c932-c544-4ccb-918c-1f87ae66d2d2\") " Dec 04 22:28:05.296531 master-0 kubenswrapper[33572]: I1204 22:28:05.295746 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c370c932-c544-4ccb-918c-1f87ae66d2d2-kube-api-access-n56g8" (OuterVolumeSpecName: "kube-api-access-n56g8") pod "c370c932-c544-4ccb-918c-1f87ae66d2d2" (UID: "c370c932-c544-4ccb-918c-1f87ae66d2d2"). InnerVolumeSpecName "kube-api-access-n56g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:28:05.299960 master-0 kubenswrapper[33572]: I1204 22:28:05.298660 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-bundle" (OuterVolumeSpecName: "bundle") pod "c370c932-c544-4ccb-918c-1f87ae66d2d2" (UID: "c370c932-c544-4ccb-918c-1f87ae66d2d2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:28:05.299960 master-0 kubenswrapper[33572]: I1204 22:28:05.299165 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-util" (OuterVolumeSpecName: "util") pod "c370c932-c544-4ccb-918c-1f87ae66d2d2" (UID: "c370c932-c544-4ccb-918c-1f87ae66d2d2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:28:05.389802 master-0 kubenswrapper[33572]: I1204 22:28:05.389568 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n56g8\" (UniqueName: \"kubernetes.io/projected/c370c932-c544-4ccb-918c-1f87ae66d2d2-kube-api-access-n56g8\") on node \"master-0\" DevicePath \"\"" Dec 04 22:28:05.389802 master-0 kubenswrapper[33572]: I1204 22:28:05.389652 33572 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:28:05.389802 master-0 kubenswrapper[33572]: I1204 22:28:05.389670 33572 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c370c932-c544-4ccb-918c-1f87ae66d2d2-util\") on node \"master-0\" DevicePath \"\"" Dec 04 22:28:05.494118 master-0 kubenswrapper[33572]: I1204 22:28:05.494024 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-tgx98"] Dec 04 22:28:05.494654 master-0 kubenswrapper[33572]: E1204 22:28:05.494601 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerName="pull" Dec 04 22:28:05.494654 master-0 kubenswrapper[33572]: I1204 22:28:05.494659 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerName="pull" Dec 04 22:28:05.494815 master-0 kubenswrapper[33572]: E1204 22:28:05.494685 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerName="extract" Dec 04 22:28:05.494815 master-0 kubenswrapper[33572]: I1204 22:28:05.494693 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerName="extract" Dec 04 22:28:05.494815 master-0 kubenswrapper[33572]: E1204 22:28:05.494711 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerName="util" Dec 04 22:28:05.494815 master-0 kubenswrapper[33572]: I1204 22:28:05.494732 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerName="util" Dec 04 22:28:05.495069 master-0 kubenswrapper[33572]: I1204 22:28:05.494914 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c370c932-c544-4ccb-918c-1f87ae66d2d2" containerName="extract" Dec 04 22:28:05.495732 master-0 kubenswrapper[33572]: I1204 22:28:05.495700 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:05.499646 master-0 kubenswrapper[33572]: I1204 22:28:05.499611 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 04 22:28:05.511887 master-0 kubenswrapper[33572]: I1204 22:28:05.511826 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 04 22:28:05.519827 master-0 kubenswrapper[33572]: I1204 22:28:05.518523 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-tgx98"] Dec 04 22:28:05.609621 master-0 kubenswrapper[33572]: I1204 22:28:05.605416 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft8jn\" (UniqueName: \"kubernetes.io/projected/780dc1ba-097b-4417-a960-9c6e5e7a3d40-kube-api-access-ft8jn\") pod \"cert-manager-webhook-f4fb5df64-tgx98\" (UID: \"780dc1ba-097b-4417-a960-9c6e5e7a3d40\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:05.609621 master-0 kubenswrapper[33572]: I1204 22:28:05.605472 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/780dc1ba-097b-4417-a960-9c6e5e7a3d40-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-tgx98\" (UID: \"780dc1ba-097b-4417-a960-9c6e5e7a3d40\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:05.707723 master-0 kubenswrapper[33572]: I1204 22:28:05.707581 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/780dc1ba-097b-4417-a960-9c6e5e7a3d40-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-tgx98\" (UID: \"780dc1ba-097b-4417-a960-9c6e5e7a3d40\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:05.707933 master-0 kubenswrapper[33572]: I1204 22:28:05.707840 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft8jn\" (UniqueName: \"kubernetes.io/projected/780dc1ba-097b-4417-a960-9c6e5e7a3d40-kube-api-access-ft8jn\") pod \"cert-manager-webhook-f4fb5df64-tgx98\" (UID: \"780dc1ba-097b-4417-a960-9c6e5e7a3d40\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:05.736549 master-0 kubenswrapper[33572]: I1204 22:28:05.733643 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft8jn\" (UniqueName: \"kubernetes.io/projected/780dc1ba-097b-4417-a960-9c6e5e7a3d40-kube-api-access-ft8jn\") pod \"cert-manager-webhook-f4fb5df64-tgx98\" (UID: \"780dc1ba-097b-4417-a960-9c6e5e7a3d40\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:05.748534 master-0 kubenswrapper[33572]: I1204 22:28:05.747046 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/780dc1ba-097b-4417-a960-9c6e5e7a3d40-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-tgx98\" (UID: \"780dc1ba-097b-4417-a960-9c6e5e7a3d40\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:05.812369 master-0 kubenswrapper[33572]: I1204 22:28:05.812287 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:05.832295 master-0 kubenswrapper[33572]: I1204 22:28:05.832212 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" event={"ID":"c370c932-c544-4ccb-918c-1f87ae66d2d2","Type":"ContainerDied","Data":"a1f908e75fe17d58e54d88cc3b5fdd1f8795e0caebebc13a5a561b6b930b54e4"} Dec 04 22:28:05.832295 master-0 kubenswrapper[33572]: I1204 22:28:05.832290 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1f908e75fe17d58e54d88cc3b5fdd1f8795e0caebebc13a5a561b6b930b54e4" Dec 04 22:28:05.832597 master-0 kubenswrapper[33572]: I1204 22:28:05.832388 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102jbkl" Dec 04 22:28:06.252368 master-0 kubenswrapper[33572]: I1204 22:28:06.252310 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-tgx98"] Dec 04 22:28:06.255208 master-0 kubenswrapper[33572]: W1204 22:28:06.255143 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod780dc1ba_097b_4417_a960_9c6e5e7a3d40.slice/crio-863fb447474c23641a00d64ad0ad79891797b969669151a6cf04a48b28c437ec WatchSource:0}: Error finding container 863fb447474c23641a00d64ad0ad79891797b969669151a6cf04a48b28c437ec: Status 404 returned error can't find the container with id 863fb447474c23641a00d64ad0ad79891797b969669151a6cf04a48b28c437ec Dec 04 22:28:06.841831 master-0 kubenswrapper[33572]: I1204 22:28:06.841771 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" event={"ID":"780dc1ba-097b-4417-a960-9c6e5e7a3d40","Type":"ContainerStarted","Data":"863fb447474c23641a00d64ad0ad79891797b969669151a6cf04a48b28c437ec"} Dec 04 22:28:07.542361 master-0 kubenswrapper[33572]: I1204 22:28:07.542263 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vx58f"] Dec 04 22:28:07.547692 master-0 kubenswrapper[33572]: I1204 22:28:07.544922 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" Dec 04 22:28:07.565015 master-0 kubenswrapper[33572]: I1204 22:28:07.564239 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vx58f"] Dec 04 22:28:07.741693 master-0 kubenswrapper[33572]: I1204 22:28:07.741613 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa4c1453-1a2a-4f5c-a4f9-71189203fe94-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vx58f\" (UID: \"fa4c1453-1a2a-4f5c-a4f9-71189203fe94\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" Dec 04 22:28:07.741693 master-0 kubenswrapper[33572]: I1204 22:28:07.741682 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dfv\" (UniqueName: \"kubernetes.io/projected/fa4c1453-1a2a-4f5c-a4f9-71189203fe94-kube-api-access-d2dfv\") pod \"cert-manager-cainjector-855d9ccff4-vx58f\" (UID: \"fa4c1453-1a2a-4f5c-a4f9-71189203fe94\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" Dec 04 22:28:07.846996 master-0 kubenswrapper[33572]: I1204 22:28:07.844284 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa4c1453-1a2a-4f5c-a4f9-71189203fe94-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vx58f\" (UID: \"fa4c1453-1a2a-4f5c-a4f9-71189203fe94\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" Dec 04 22:28:07.846996 master-0 kubenswrapper[33572]: I1204 22:28:07.844362 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2dfv\" (UniqueName: \"kubernetes.io/projected/fa4c1453-1a2a-4f5c-a4f9-71189203fe94-kube-api-access-d2dfv\") pod \"cert-manager-cainjector-855d9ccff4-vx58f\" (UID: \"fa4c1453-1a2a-4f5c-a4f9-71189203fe94\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" Dec 04 22:28:07.869690 master-0 kubenswrapper[33572]: I1204 22:28:07.869625 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa4c1453-1a2a-4f5c-a4f9-71189203fe94-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vx58f\" (UID: \"fa4c1453-1a2a-4f5c-a4f9-71189203fe94\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" Dec 04 22:28:07.879527 master-0 kubenswrapper[33572]: I1204 22:28:07.879104 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2dfv\" (UniqueName: \"kubernetes.io/projected/fa4c1453-1a2a-4f5c-a4f9-71189203fe94-kube-api-access-d2dfv\") pod \"cert-manager-cainjector-855d9ccff4-vx58f\" (UID: \"fa4c1453-1a2a-4f5c-a4f9-71189203fe94\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" Dec 04 22:28:08.165728 master-0 kubenswrapper[33572]: I1204 22:28:08.165593 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" Dec 04 22:28:08.689461 master-0 kubenswrapper[33572]: I1204 22:28:08.689396 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vx58f"] Dec 04 22:28:08.885536 master-0 kubenswrapper[33572]: I1204 22:28:08.881937 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" event={"ID":"fa4c1453-1a2a-4f5c-a4f9-71189203fe94","Type":"ContainerStarted","Data":"2a36ef457cddd9f503aa021769b248d1c0ca0bfefb082f8b9eef857c4864206a"} Dec 04 22:28:10.777957 master-0 kubenswrapper[33572]: I1204 22:28:10.777885 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf"] Dec 04 22:28:10.780631 master-0 kubenswrapper[33572]: I1204 22:28:10.780076 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:10.784630 master-0 kubenswrapper[33572]: I1204 22:28:10.782850 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 04 22:28:10.784630 master-0 kubenswrapper[33572]: I1204 22:28:10.783051 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 04 22:28:10.784630 master-0 kubenswrapper[33572]: I1204 22:28:10.783208 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 04 22:28:10.784630 master-0 kubenswrapper[33572]: I1204 22:28:10.783384 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 04 22:28:10.816603 master-0 kubenswrapper[33572]: I1204 22:28:10.816515 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf"] Dec 04 22:28:10.908757 master-0 kubenswrapper[33572]: I1204 22:28:10.908671 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f4308db-85a3-40de-881b-b54463263359-webhook-cert\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:10.908757 master-0 kubenswrapper[33572]: I1204 22:28:10.908744 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jczxt\" (UniqueName: \"kubernetes.io/projected/0f4308db-85a3-40de-881b-b54463263359-kube-api-access-jczxt\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:10.909031 master-0 kubenswrapper[33572]: I1204 22:28:10.908793 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f4308db-85a3-40de-881b-b54463263359-apiservice-cert\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:11.013533 master-0 kubenswrapper[33572]: I1204 22:28:11.010155 
33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f4308db-85a3-40de-881b-b54463263359-apiservice-cert\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:11.013533 master-0 kubenswrapper[33572]: I1204 22:28:11.010336 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f4308db-85a3-40de-881b-b54463263359-webhook-cert\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:11.013533 master-0 kubenswrapper[33572]: I1204 22:28:11.010364 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jczxt\" (UniqueName: \"kubernetes.io/projected/0f4308db-85a3-40de-881b-b54463263359-kube-api-access-jczxt\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:11.029869 master-0 kubenswrapper[33572]: I1204 22:28:11.019760 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f4308db-85a3-40de-881b-b54463263359-apiservice-cert\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:11.034588 master-0 kubenswrapper[33572]: I1204 22:28:11.031581 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f4308db-85a3-40de-881b-b54463263359-webhook-cert\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:11.046651 master-0 kubenswrapper[33572]: I1204 22:28:11.040455 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jczxt\" (UniqueName: \"kubernetes.io/projected/0f4308db-85a3-40de-881b-b54463263359-kube-api-access-jczxt\") pod \"metallb-operator-controller-manager-85bc976bd6-scgdf\" (UID: \"0f4308db-85a3-40de-881b-b54463263359\") " pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:11.112647 master-0 kubenswrapper[33572]: I1204 22:28:11.110635 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:11.204214 master-0 kubenswrapper[33572]: I1204 22:28:11.204124 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl"] Dec 04 22:28:11.205293 master-0 kubenswrapper[33572]: I1204 22:28:11.205263 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.207245 master-0 kubenswrapper[33572]: I1204 22:28:11.207195 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 04 22:28:11.207987 master-0 kubenswrapper[33572]: I1204 22:28:11.207919 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 22:28:11.223755 master-0 kubenswrapper[33572]: I1204 22:28:11.223693 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl"] Dec 04 22:28:11.316409 master-0 kubenswrapper[33572]: I1204 22:28:11.314083 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c189000-5407-41c1-825a-3ad7708d6b67-webhook-cert\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.316409 master-0 kubenswrapper[33572]: I1204 22:28:11.314185 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb5fq\" (UniqueName: \"kubernetes.io/projected/3c189000-5407-41c1-825a-3ad7708d6b67-kube-api-access-fb5fq\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.316409 master-0 kubenswrapper[33572]: I1204 22:28:11.314212 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c189000-5407-41c1-825a-3ad7708d6b67-apiservice-cert\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.421714 master-0 kubenswrapper[33572]: I1204 22:28:11.420094 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c189000-5407-41c1-825a-3ad7708d6b67-webhook-cert\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.421714 master-0 kubenswrapper[33572]: I1204 22:28:11.420272 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb5fq\" (UniqueName: \"kubernetes.io/projected/3c189000-5407-41c1-825a-3ad7708d6b67-kube-api-access-fb5fq\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.421714 master-0 kubenswrapper[33572]: I1204 22:28:11.420306 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c189000-5407-41c1-825a-3ad7708d6b67-apiservice-cert\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.426941 master-0 kubenswrapper[33572]: I1204 22:28:11.425584 33572 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c189000-5407-41c1-825a-3ad7708d6b67-webhook-cert\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.426941 master-0 kubenswrapper[33572]: I1204 22:28:11.426405 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c189000-5407-41c1-825a-3ad7708d6b67-apiservice-cert\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.435666 master-0 kubenswrapper[33572]: I1204 22:28:11.435631 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb5fq\" (UniqueName: \"kubernetes.io/projected/3c189000-5407-41c1-825a-3ad7708d6b67-kube-api-access-fb5fq\") pod \"metallb-operator-webhook-server-5844777bf9-wp7bl\" (UID: \"3c189000-5407-41c1-825a-3ad7708d6b67\") " pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:11.539519 master-0 kubenswrapper[33572]: I1204 22:28:11.539387 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:16.210469 master-0 kubenswrapper[33572]: I1204 22:28:16.210416 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-gh5j2"] Dec 04 22:28:16.211451 master-0 kubenswrapper[33572]: I1204 22:28:16.211425 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-gh5j2" Dec 04 22:28:16.233013 master-0 kubenswrapper[33572]: I1204 22:28:16.232963 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-gh5j2"] Dec 04 22:28:16.312389 master-0 kubenswrapper[33572]: I1204 22:28:16.312327 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4bx\" (UniqueName: \"kubernetes.io/projected/dc5cee88-523c-4fae-af63-d18313c388cd-kube-api-access-xx4bx\") pod \"cert-manager-86cb77c54b-gh5j2\" (UID: \"dc5cee88-523c-4fae-af63-d18313c388cd\") " pod="cert-manager/cert-manager-86cb77c54b-gh5j2" Dec 04 22:28:16.312628 master-0 kubenswrapper[33572]: I1204 22:28:16.312550 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc5cee88-523c-4fae-af63-d18313c388cd-bound-sa-token\") pod \"cert-manager-86cb77c54b-gh5j2\" (UID: \"dc5cee88-523c-4fae-af63-d18313c388cd\") " pod="cert-manager/cert-manager-86cb77c54b-gh5j2" Dec 04 22:28:16.414262 master-0 kubenswrapper[33572]: I1204 22:28:16.414185 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc5cee88-523c-4fae-af63-d18313c388cd-bound-sa-token\") pod \"cert-manager-86cb77c54b-gh5j2\" (UID: \"dc5cee88-523c-4fae-af63-d18313c388cd\") " pod="cert-manager/cert-manager-86cb77c54b-gh5j2" Dec 04 22:28:16.414490 master-0 kubenswrapper[33572]: I1204 22:28:16.414430 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4bx\" (UniqueName: 
\"kubernetes.io/projected/dc5cee88-523c-4fae-af63-d18313c388cd-kube-api-access-xx4bx\") pod \"cert-manager-86cb77c54b-gh5j2\" (UID: \"dc5cee88-523c-4fae-af63-d18313c388cd\") " pod="cert-manager/cert-manager-86cb77c54b-gh5j2" Dec 04 22:28:16.442209 master-0 kubenswrapper[33572]: I1204 22:28:16.442140 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4bx\" (UniqueName: \"kubernetes.io/projected/dc5cee88-523c-4fae-af63-d18313c388cd-kube-api-access-xx4bx\") pod \"cert-manager-86cb77c54b-gh5j2\" (UID: \"dc5cee88-523c-4fae-af63-d18313c388cd\") " pod="cert-manager/cert-manager-86cb77c54b-gh5j2" Dec 04 22:28:16.447525 master-0 kubenswrapper[33572]: I1204 22:28:16.443027 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc5cee88-523c-4fae-af63-d18313c388cd-bound-sa-token\") pod \"cert-manager-86cb77c54b-gh5j2\" (UID: \"dc5cee88-523c-4fae-af63-d18313c388cd\") " pod="cert-manager/cert-manager-86cb77c54b-gh5j2" Dec 04 22:28:16.534205 master-0 kubenswrapper[33572]: I1204 22:28:16.534129 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-gh5j2" Dec 04 22:28:17.049307 master-0 kubenswrapper[33572]: I1204 22:28:17.049126 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" event={"ID":"fa4c1453-1a2a-4f5c-a4f9-71189203fe94","Type":"ContainerStarted","Data":"296ae3f0641f3ae8f71e72298cd63944e40143bf67edb1f98a7bc192e1da5619"} Dec 04 22:28:17.079723 master-0 kubenswrapper[33572]: I1204 22:28:17.078921 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" event={"ID":"780dc1ba-097b-4417-a960-9c6e5e7a3d40","Type":"ContainerStarted","Data":"efb52ae1897e0c4604747a1e7353079f9d0aae173d90969ace263412d5ee5a39"} Dec 04 22:28:17.081395 master-0 kubenswrapper[33572]: I1204 22:28:17.080229 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:17.097339 master-0 kubenswrapper[33572]: I1204 22:28:17.097040 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vx58f" podStartSLOduration=2.027283008 podStartE2EDuration="10.097014794s" podCreationTimestamp="2025-12-04 22:28:07 +0000 UTC" firstStartedPulling="2025-12-04 22:28:08.68632207 +0000 UTC m=+552.413847719" lastFinishedPulling="2025-12-04 22:28:16.756053846 +0000 UTC m=+560.483579505" observedRunningTime="2025-12-04 22:28:17.080973649 +0000 UTC m=+560.808499288" watchObservedRunningTime="2025-12-04 22:28:17.097014794 +0000 UTC m=+560.824540443" Dec 04 22:28:17.133609 master-0 kubenswrapper[33572]: I1204 22:28:17.131469 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" podStartSLOduration=1.5959751020000001 podStartE2EDuration="12.131444259s" podCreationTimestamp="2025-12-04 22:28:05 +0000 UTC" firstStartedPulling="2025-12-04 22:28:06.257794961 +0000 UTC m=+549.985320630" lastFinishedPulling="2025-12-04 22:28:16.793264128 +0000 UTC m=+560.520789787" observedRunningTime="2025-12-04 22:28:17.123125498 +0000 UTC m=+560.850651157" watchObservedRunningTime="2025-12-04 22:28:17.131444259 +0000 UTC m=+560.858969928" Dec 04 22:28:17.244765 master-0 kubenswrapper[33572]: I1204 22:28:17.244717 33572 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf"] Dec 04 22:28:17.249817 master-0 kubenswrapper[33572]: W1204 22:28:17.249750 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f4308db_85a3_40de_881b_b54463263359.slice/crio-582838ab76b4523ccad1015ded2edf753a895f8fe2e7506f02a532baf4ccc690 WatchSource:0}: Error finding container 582838ab76b4523ccad1015ded2edf753a895f8fe2e7506f02a532baf4ccc690: Status 404 returned error can't find the container with id 582838ab76b4523ccad1015ded2edf753a895f8fe2e7506f02a532baf4ccc690 Dec 04 22:28:17.406619 master-0 kubenswrapper[33572]: I1204 22:28:17.406556 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-gh5j2"] Dec 04 22:28:17.415623 master-0 kubenswrapper[33572]: I1204 22:28:17.415568 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl"] Dec 04 22:28:18.089410 master-0 kubenswrapper[33572]: I1204 22:28:18.089340 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" event={"ID":"0f4308db-85a3-40de-881b-b54463263359","Type":"ContainerStarted","Data":"582838ab76b4523ccad1015ded2edf753a895f8fe2e7506f02a532baf4ccc690"} Dec 04 22:28:18.091375 master-0 kubenswrapper[33572]: I1204 22:28:18.091307 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-gh5j2" event={"ID":"dc5cee88-523c-4fae-af63-d18313c388cd","Type":"ContainerStarted","Data":"19db811eafb2ab8ece4c6654ea2e569db76480ffc514d25ea40403f03821f87a"} Dec 04 22:28:18.091597 master-0 kubenswrapper[33572]: I1204 22:28:18.091396 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-gh5j2" event={"ID":"dc5cee88-523c-4fae-af63-d18313c388cd","Type":"ContainerStarted","Data":"3d73b73f1f48eca0111007dba223ac4ad0dd6dff80fffd8d7353a0227f9a7caa"} Dec 04 22:28:18.093471 master-0 kubenswrapper[33572]: I1204 22:28:18.093176 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" event={"ID":"3c189000-5407-41c1-825a-3ad7708d6b67","Type":"ContainerStarted","Data":"3893632e8dc04f5902693c663d54d0e74781e8484a1b35d68a732e9698a34a2a"} Dec 04 22:28:18.398081 master-0 kubenswrapper[33572]: I1204 22:28:18.397920 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-gh5j2" podStartSLOduration=2.397891451 podStartE2EDuration="2.397891451s" podCreationTimestamp="2025-12-04 22:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:28:18.39320424 +0000 UTC m=+562.120729889" watchObservedRunningTime="2025-12-04 22:28:18.397891451 +0000 UTC m=+562.125417100" Dec 04 22:28:23.375840 master-0 kubenswrapper[33572]: I1204 22:28:23.375679 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5"] Dec 04 22:28:23.377365 master-0 kubenswrapper[33572]: I1204 22:28:23.377334 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5" Dec 04 22:28:23.380220 master-0 kubenswrapper[33572]: I1204 22:28:23.379535 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 04 22:28:23.380220 master-0 kubenswrapper[33572]: I1204 22:28:23.379719 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 04 22:28:23.406759 master-0 kubenswrapper[33572]: I1204 22:28:23.406697 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5"] Dec 04 22:28:23.496524 master-0 kubenswrapper[33572]: I1204 22:28:23.496052 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgfq\" (UniqueName: \"kubernetes.io/projected/d46a38f0-9645-40af-a5e6-c5104d5189ba-kube-api-access-fdgfq\") pod \"obo-prometheus-operator-668cf9dfbb-vm5f5\" (UID: \"d46a38f0-9645-40af-a5e6-c5104d5189ba\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5" Dec 04 22:28:23.516539 master-0 kubenswrapper[33572]: I1204 22:28:23.515687 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2"] Dec 04 22:28:23.516955 master-0 kubenswrapper[33572]: I1204 22:28:23.516924 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" Dec 04 22:28:23.523721 master-0 kubenswrapper[33572]: I1204 22:28:23.521337 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 04 22:28:23.528811 master-0 kubenswrapper[33572]: I1204 22:28:23.528776 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5"] Dec 04 22:28:23.529827 master-0 kubenswrapper[33572]: I1204 22:28:23.529789 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" Dec 04 22:28:23.542194 master-0 kubenswrapper[33572]: I1204 22:28:23.542144 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2"] Dec 04 22:28:23.556470 master-0 kubenswrapper[33572]: I1204 22:28:23.556428 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5"] Dec 04 22:28:23.597319 master-0 kubenswrapper[33572]: I1204 22:28:23.597271 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17c3a99-1b56-4b32-8e51-cd160fb4062d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2\" (UID: \"c17c3a99-1b56-4b32-8e51-cd160fb4062d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" Dec 04 22:28:23.597627 master-0 kubenswrapper[33572]: I1204 22:28:23.597610 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17c3a99-1b56-4b32-8e51-cd160fb4062d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2\" (UID: \"c17c3a99-1b56-4b32-8e51-cd160fb4062d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" Dec 04 22:28:23.597730 master-0 kubenswrapper[33572]: I1204 22:28:23.597715 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgfq\" (UniqueName: \"kubernetes.io/projected/d46a38f0-9645-40af-a5e6-c5104d5189ba-kube-api-access-fdgfq\") pod \"obo-prometheus-operator-668cf9dfbb-vm5f5\" (UID: \"d46a38f0-9645-40af-a5e6-c5104d5189ba\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5" Dec 04 22:28:23.618890 master-0 kubenswrapper[33572]: I1204 22:28:23.618818 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgfq\" (UniqueName: \"kubernetes.io/projected/d46a38f0-9645-40af-a5e6-c5104d5189ba-kube-api-access-fdgfq\") pod \"obo-prometheus-operator-668cf9dfbb-vm5f5\" (UID: \"d46a38f0-9645-40af-a5e6-c5104d5189ba\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5" Dec 04 22:28:23.706760 master-0 kubenswrapper[33572]: I1204 22:28:23.703544 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5" Dec 04 22:28:23.706760 master-0 kubenswrapper[33572]: I1204 22:28:23.704543 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17c3a99-1b56-4b32-8e51-cd160fb4062d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2\" (UID: \"c17c3a99-1b56-4b32-8e51-cd160fb4062d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" Dec 04 22:28:23.706760 master-0 kubenswrapper[33572]: I1204 22:28:23.704650 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5eb5e75b-473b-4288-9ca8-4683ae8063ec-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5\" (UID: \"5eb5e75b-473b-4288-9ca8-4683ae8063ec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" Dec 04 22:28:23.706760 master-0 kubenswrapper[33572]: I1204 22:28:23.704689 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5eb5e75b-473b-4288-9ca8-4683ae8063ec-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5\" (UID: \"5eb5e75b-473b-4288-9ca8-4683ae8063ec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" Dec 04 22:28:23.706760 master-0 kubenswrapper[33572]: I1204 22:28:23.704770 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17c3a99-1b56-4b32-8e51-cd160fb4062d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2\" (UID: \"c17c3a99-1b56-4b32-8e51-cd160fb4062d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" Dec 04 22:28:23.712561 master-0 kubenswrapper[33572]: I1204 22:28:23.712492 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c17c3a99-1b56-4b32-8e51-cd160fb4062d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2\" (UID: \"c17c3a99-1b56-4b32-8e51-cd160fb4062d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" Dec 04 22:28:23.714587 master-0 kubenswrapper[33572]: I1204 22:28:23.713043 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c17c3a99-1b56-4b32-8e51-cd160fb4062d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2\" (UID: \"c17c3a99-1b56-4b32-8e51-cd160fb4062d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" Dec 04 22:28:23.740731 master-0 kubenswrapper[33572]: I1204 22:28:23.740683 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qsbhs"] Dec 04 22:28:23.742216 master-0 kubenswrapper[33572]: I1204 22:28:23.742173 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:23.744605 master-0 kubenswrapper[33572]: I1204 22:28:23.744579 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 04 22:28:23.766541 master-0 kubenswrapper[33572]: I1204 22:28:23.766492 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qsbhs"] Dec 04 22:28:23.806376 master-0 kubenswrapper[33572]: I1204 22:28:23.806302 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5eb5e75b-473b-4288-9ca8-4683ae8063ec-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5\" (UID: \"5eb5e75b-473b-4288-9ca8-4683ae8063ec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" Dec 04 22:28:23.807697 master-0 kubenswrapper[33572]: I1204 22:28:23.807661 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5eb5e75b-473b-4288-9ca8-4683ae8063ec-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5\" (UID: \"5eb5e75b-473b-4288-9ca8-4683ae8063ec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" Dec 04 22:28:23.809672 master-0 kubenswrapper[33572]: I1204 22:28:23.809645 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5eb5e75b-473b-4288-9ca8-4683ae8063ec-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5\" (UID: \"5eb5e75b-473b-4288-9ca8-4683ae8063ec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" Dec 04 22:28:23.810784 master-0 kubenswrapper[33572]: I1204 22:28:23.810761 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5eb5e75b-473b-4288-9ca8-4683ae8063ec-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5\" (UID: \"5eb5e75b-473b-4288-9ca8-4683ae8063ec\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" Dec 04 22:28:23.858232 master-0 kubenswrapper[33572]: I1204 22:28:23.858166 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" Dec 04 22:28:23.871454 master-0 kubenswrapper[33572]: I1204 22:28:23.871420 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" Dec 04 22:28:23.909064 master-0 kubenswrapper[33572]: I1204 22:28:23.909017 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b2e5d49-a0dc-454c-876f-2d46c7f68061-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qsbhs\" (UID: \"5b2e5d49-a0dc-454c-876f-2d46c7f68061\") " pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:23.909325 master-0 kubenswrapper[33572]: I1204 22:28:23.909310 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fjbd\" (UniqueName: \"kubernetes.io/projected/5b2e5d49-a0dc-454c-876f-2d46c7f68061-kube-api-access-8fjbd\") pod \"observability-operator-d8bb48f5d-qsbhs\" (UID: \"5b2e5d49-a0dc-454c-876f-2d46c7f68061\") " pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:24.011122 master-0 kubenswrapper[33572]: I1204 22:28:24.011010 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b2e5d49-a0dc-454c-876f-2d46c7f68061-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qsbhs\" (UID: \"5b2e5d49-a0dc-454c-876f-2d46c7f68061\") " pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:24.011955 master-0 kubenswrapper[33572]: I1204 22:28:24.011933 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fjbd\" (UniqueName: \"kubernetes.io/projected/5b2e5d49-a0dc-454c-876f-2d46c7f68061-kube-api-access-8fjbd\") pod \"observability-operator-d8bb48f5d-qsbhs\" (UID: \"5b2e5d49-a0dc-454c-876f-2d46c7f68061\") " pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:24.015654 master-0 kubenswrapper[33572]: I1204 22:28:24.015606 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b2e5d49-a0dc-454c-876f-2d46c7f68061-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-qsbhs\" (UID: \"5b2e5d49-a0dc-454c-876f-2d46c7f68061\") " pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:24.718731 master-0 kubenswrapper[33572]: I1204 22:28:24.718681 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fjbd\" (UniqueName: \"kubernetes.io/projected/5b2e5d49-a0dc-454c-876f-2d46c7f68061-kube-api-access-8fjbd\") pod \"observability-operator-d8bb48f5d-qsbhs\" (UID: \"5b2e5d49-a0dc-454c-876f-2d46c7f68061\") " pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:24.748213 master-0 kubenswrapper[33572]: I1204 22:28:24.748150 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5nnm4"] Dec 04 22:28:24.749132 master-0 kubenswrapper[33572]: I1204 22:28:24.749105 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:24.773127 master-0 kubenswrapper[33572]: I1204 22:28:24.773063 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:24.834524 master-0 kubenswrapper[33572]: I1204 22:28:24.833821 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zzq\" (UniqueName: \"kubernetes.io/projected/11fc2559-ac39-4feb-8046-5a83b3322af9-kube-api-access-m8zzq\") pod \"perses-operator-5446b9c989-5nnm4\" (UID: \"11fc2559-ac39-4feb-8046-5a83b3322af9\") " pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:24.834524 master-0 kubenswrapper[33572]: I1204 22:28:24.833961 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/11fc2559-ac39-4feb-8046-5a83b3322af9-openshift-service-ca\") pod \"perses-operator-5446b9c989-5nnm4\" (UID: \"11fc2559-ac39-4feb-8046-5a83b3322af9\") " pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:24.928118 master-0 kubenswrapper[33572]: I1204 22:28:24.928010 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5nnm4"] Dec 04 22:28:24.935602 master-0 kubenswrapper[33572]: I1204 22:28:24.935524 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zzq\" (UniqueName: \"kubernetes.io/projected/11fc2559-ac39-4feb-8046-5a83b3322af9-kube-api-access-m8zzq\") pod \"perses-operator-5446b9c989-5nnm4\" (UID: \"11fc2559-ac39-4feb-8046-5a83b3322af9\") " pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:24.935880 master-0 kubenswrapper[33572]: I1204 22:28:24.935695 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/11fc2559-ac39-4feb-8046-5a83b3322af9-openshift-service-ca\") pod \"perses-operator-5446b9c989-5nnm4\" (UID: \"11fc2559-ac39-4feb-8046-5a83b3322af9\") " pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:24.936679 master-0 kubenswrapper[33572]: I1204 22:28:24.936638 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/11fc2559-ac39-4feb-8046-5a83b3322af9-openshift-service-ca\") pod \"perses-operator-5446b9c989-5nnm4\" (UID: \"11fc2559-ac39-4feb-8046-5a83b3322af9\") " pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:24.981292 master-0 kubenswrapper[33572]: I1204 22:28:24.981187 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zzq\" (UniqueName: \"kubernetes.io/projected/11fc2559-ac39-4feb-8046-5a83b3322af9-kube-api-access-m8zzq\") pod \"perses-operator-5446b9c989-5nnm4\" (UID: \"11fc2559-ac39-4feb-8046-5a83b3322af9\") " pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:25.086901 master-0 kubenswrapper[33572]: I1204 22:28:25.086855 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:25.815792 master-0 kubenswrapper[33572]: I1204 22:28:25.815716 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-tgx98" Dec 04 22:28:26.207668 master-0 kubenswrapper[33572]: I1204 22:28:26.207617 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" event={"ID":"3c189000-5407-41c1-825a-3ad7708d6b67","Type":"ContainerStarted","Data":"498bcdc3f589fa46b76d14533e2d41e5b9f79ee7a02528dcb6a343495e87f1ce"} Dec 04 22:28:26.207741 master-0 kubenswrapper[33572]: I1204 22:28:26.207716 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:26.227529 master-0 kubenswrapper[33572]: I1204 22:28:26.223149 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5"] Dec 04 22:28:26.235918 master-0 kubenswrapper[33572]: W1204 22:28:26.231645 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb5e75b_473b_4288_9ca8_4683ae8063ec.slice/crio-84ca8d33aec3dd91d44ebc60d3f8681924930015e286f494f48c48585a05a353 WatchSource:0}: Error finding container 84ca8d33aec3dd91d44ebc60d3f8681924930015e286f494f48c48585a05a353: Status 404 returned error can't find the container with id 84ca8d33aec3dd91d44ebc60d3f8681924930015e286f494f48c48585a05a353 Dec 04 22:28:26.235918 master-0 kubenswrapper[33572]: I1204 22:28:26.234259 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" event={"ID":"0f4308db-85a3-40de-881b-b54463263359","Type":"ContainerStarted","Data":"78412d01f8528e07e26241349e0b2b83456e9d06aa7ebc82025954891b5e291e"} Dec 04 22:28:26.235918 master-0 kubenswrapper[33572]: I1204 22:28:26.234666 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5"] Dec 04 22:28:26.235918 master-0 kubenswrapper[33572]: I1204 22:28:26.234877 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:28:26.513793 master-0 kubenswrapper[33572]: W1204 22:28:26.512628 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11fc2559_ac39_4feb_8046_5a83b3322af9.slice/crio-6f0c1ed8f7fbfee760919a7232550f53df95c76a98838b9be5045a963071c3d0 WatchSource:0}: Error finding container 6f0c1ed8f7fbfee760919a7232550f53df95c76a98838b9be5045a963071c3d0: Status 404 returned error can't find the container with id 6f0c1ed8f7fbfee760919a7232550f53df95c76a98838b9be5045a963071c3d0 Dec 04 22:28:26.513793 master-0 kubenswrapper[33572]: I1204 22:28:26.512796 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5nnm4"] Dec 04 22:28:26.540271 master-0 kubenswrapper[33572]: I1204 22:28:26.540188 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" podStartSLOduration=7.452399163 podStartE2EDuration="15.540167279s" podCreationTimestamp="2025-12-04 22:28:11 +0000 UTC" firstStartedPulling="2025-12-04 22:28:17.460855267 +0000 
UTC m=+561.188380916" lastFinishedPulling="2025-12-04 22:28:25.548623383 +0000 UTC m=+569.276149032" observedRunningTime="2025-12-04 22:28:26.513237713 +0000 UTC m=+570.240763372" watchObservedRunningTime="2025-12-04 22:28:26.540167279 +0000 UTC m=+570.267692918" Dec 04 22:28:26.540545 master-0 kubenswrapper[33572]: I1204 22:28:26.540427 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2"] Dec 04 22:28:26.599535 master-0 kubenswrapper[33572]: I1204 22:28:26.594119 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-qsbhs"] Dec 04 22:28:26.757203 master-0 kubenswrapper[33572]: I1204 22:28:26.757050 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" podStartSLOduration=8.522306413 podStartE2EDuration="16.757030536s" podCreationTimestamp="2025-12-04 22:28:10 +0000 UTC" firstStartedPulling="2025-12-04 22:28:17.252373284 +0000 UTC m=+560.979898933" lastFinishedPulling="2025-12-04 22:28:25.487097387 +0000 UTC m=+569.214623056" observedRunningTime="2025-12-04 22:28:26.752723556 +0000 UTC m=+570.480249305" watchObservedRunningTime="2025-12-04 22:28:26.757030536 +0000 UTC m=+570.484556195" Dec 04 22:28:27.275541 master-0 kubenswrapper[33572]: I1204 22:28:27.272281 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5" event={"ID":"d46a38f0-9645-40af-a5e6-c5104d5189ba","Type":"ContainerStarted","Data":"08a0c8210d6db5ee28a7c437e3fb940ae9af475f28c4db4bf12c33ea6c1ab757"} Dec 04 22:28:27.275541 master-0 kubenswrapper[33572]: I1204 22:28:27.274123 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" event={"ID":"5b2e5d49-a0dc-454c-876f-2d46c7f68061","Type":"ContainerStarted","Data":"3435036e80da88770f425a87a3dc24cfa5533460bbbbe912ecdde320d2538884"} Dec 04 22:28:27.276233 master-0 kubenswrapper[33572]: I1204 22:28:27.275856 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" event={"ID":"5eb5e75b-473b-4288-9ca8-4683ae8063ec","Type":"ContainerStarted","Data":"84ca8d33aec3dd91d44ebc60d3f8681924930015e286f494f48c48585a05a353"} Dec 04 22:28:27.278601 master-0 kubenswrapper[33572]: I1204 22:28:27.277042 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-5nnm4" event={"ID":"11fc2559-ac39-4feb-8046-5a83b3322af9","Type":"ContainerStarted","Data":"6f0c1ed8f7fbfee760919a7232550f53df95c76a98838b9be5045a963071c3d0"} Dec 04 22:28:27.290737 master-0 kubenswrapper[33572]: I1204 22:28:27.288331 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" event={"ID":"c17c3a99-1b56-4b32-8e51-cd160fb4062d","Type":"ContainerStarted","Data":"5bdb8ac8aedb5a38257cc7e6b83119b1d98f474ac3fedf8852494a22ed43e8bb"} Dec 04 22:28:34.348869 master-0 kubenswrapper[33572]: I1204 22:28:34.348791 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-5nnm4" event={"ID":"11fc2559-ac39-4feb-8046-5a83b3322af9","Type":"ContainerStarted","Data":"c5bfe0c192ff9291a646e628c9a86677373c4dfd172cb0dea19ac148d42292f8"} Dec 04 22:28:34.348869 master-0 kubenswrapper[33572]: I1204 22:28:34.348880 33572 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:34.374147 master-0 kubenswrapper[33572]: I1204 22:28:34.374071 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-5nnm4" podStartSLOduration=5.425471189 podStartE2EDuration="10.373486167s" podCreationTimestamp="2025-12-04 22:28:24 +0000 UTC" firstStartedPulling="2025-12-04 22:28:26.516084822 +0000 UTC m=+570.243610511" lastFinishedPulling="2025-12-04 22:28:31.46409985 +0000 UTC m=+575.191625489" observedRunningTime="2025-12-04 22:28:34.369065935 +0000 UTC m=+578.096591584" watchObservedRunningTime="2025-12-04 22:28:34.373486167 +0000 UTC m=+578.101011816" Dec 04 22:28:35.357224 master-0 kubenswrapper[33572]: I1204 22:28:35.357178 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5" event={"ID":"d46a38f0-9645-40af-a5e6-c5104d5189ba","Type":"ContainerStarted","Data":"4970fea2d6d959db17cdf8f71d339d9596a6fe0099f427773a938d52a198e2dd"} Dec 04 22:28:35.359442 master-0 kubenswrapper[33572]: I1204 22:28:35.359112 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" event={"ID":"5b2e5d49-a0dc-454c-876f-2d46c7f68061","Type":"ContainerStarted","Data":"b532b2045ab356590e27c0e71530ac57332f0f0be589e69db732dc27308af982"} Dec 04 22:28:35.359548 master-0 kubenswrapper[33572]: I1204 22:28:35.359447 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:35.361180 master-0 kubenswrapper[33572]: I1204 22:28:35.361153 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" event={"ID":"5eb5e75b-473b-4288-9ca8-4683ae8063ec","Type":"ContainerStarted","Data":"7399c72dedb9d7616ed29bab54bfd2eaf0551f6fbf942a2ef811d9759662bc97"} Dec 04 22:28:35.362001 master-0 kubenswrapper[33572]: I1204 22:28:35.361976 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" Dec 04 22:28:35.363627 master-0 kubenswrapper[33572]: I1204 22:28:35.363588 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" event={"ID":"c17c3a99-1b56-4b32-8e51-cd160fb4062d","Type":"ContainerStarted","Data":"89efb3c18ead8bd7349577f7fb81eefeba97e4c8dee66454b1059b1919826849"} Dec 04 22:28:35.385329 master-0 kubenswrapper[33572]: I1204 22:28:35.385235 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vm5f5" podStartSLOduration=7.174704123 podStartE2EDuration="12.385215723s" podCreationTimestamp="2025-12-04 22:28:23 +0000 UTC" firstStartedPulling="2025-12-04 22:28:26.25468321 +0000 UTC m=+569.982208859" lastFinishedPulling="2025-12-04 22:28:31.46519481 +0000 UTC m=+575.192720459" observedRunningTime="2025-12-04 22:28:35.379119983 +0000 UTC m=+579.106645632" watchObservedRunningTime="2025-12-04 22:28:35.385215723 +0000 UTC m=+579.112741362" Dec 04 22:28:35.407120 master-0 kubenswrapper[33572]: I1204 22:28:35.407030 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-qsbhs" podStartSLOduration=4.7543202 
podStartE2EDuration="12.407007467s" podCreationTimestamp="2025-12-04 22:28:23 +0000 UTC" firstStartedPulling="2025-12-04 22:28:26.605018028 +0000 UTC m=+570.332543697" lastFinishedPulling="2025-12-04 22:28:34.257705315 +0000 UTC m=+577.985230964" observedRunningTime="2025-12-04 22:28:35.404726465 +0000 UTC m=+579.132252124" watchObservedRunningTime="2025-12-04 22:28:35.407007467 +0000 UTC m=+579.134533136" Dec 04 22:28:35.520557 master-0 kubenswrapper[33572]: I1204 22:28:35.519573 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5" podStartSLOduration=7.283574773 podStartE2EDuration="12.51955206s" podCreationTimestamp="2025-12-04 22:28:23 +0000 UTC" firstStartedPulling="2025-12-04 22:28:26.234701085 +0000 UTC m=+569.962226734" lastFinishedPulling="2025-12-04 22:28:31.470678372 +0000 UTC m=+575.198204021" observedRunningTime="2025-12-04 22:28:35.505835479 +0000 UTC m=+579.233361138" watchObservedRunningTime="2025-12-04 22:28:35.51955206 +0000 UTC m=+579.247077709" Dec 04 22:28:35.520557 master-0 kubenswrapper[33572]: I1204 22:28:35.519774 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2" podStartSLOduration=7.572085827 podStartE2EDuration="12.519770726s" podCreationTimestamp="2025-12-04 22:28:23 +0000 UTC" firstStartedPulling="2025-12-04 22:28:26.515688351 +0000 UTC m=+570.243214040" lastFinishedPulling="2025-12-04 22:28:31.46337329 +0000 UTC m=+575.190898939" observedRunningTime="2025-12-04 22:28:35.446223405 +0000 UTC m=+579.173749044" watchObservedRunningTime="2025-12-04 22:28:35.519770726 +0000 UTC m=+579.247296375" Dec 04 22:28:41.547549 master-0 kubenswrapper[33572]: I1204 22:28:41.547486 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5844777bf9-wp7bl" Dec 04 22:28:45.026933 master-0 kubenswrapper[33572]: I1204 22:28:45.026853 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8fngp"] Dec 04 22:28:45.029989 master-0 kubenswrapper[33572]: I1204 22:28:45.029945 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.077817 master-0 kubenswrapper[33572]: I1204 22:28:45.076992 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8fngp"] Dec 04 22:28:45.090689 master-0 kubenswrapper[33572]: I1204 22:28:45.090637 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-5nnm4" Dec 04 22:28:45.159525 master-0 kubenswrapper[33572]: I1204 22:28:45.159046 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-utilities\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.159525 master-0 kubenswrapper[33572]: I1204 22:28:45.159119 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjhx9\" (UniqueName: \"kubernetes.io/projected/5c2c85e1-5174-43f4-a469-f49173bc4c4b-kube-api-access-sjhx9\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.159525 master-0 kubenswrapper[33572]: I1204 22:28:45.159155 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-catalog-content\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.260848 master-0 kubenswrapper[33572]: I1204 22:28:45.260802 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-utilities\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.260848 master-0 kubenswrapper[33572]: I1204 22:28:45.260850 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjhx9\" (UniqueName: \"kubernetes.io/projected/5c2c85e1-5174-43f4-a469-f49173bc4c4b-kube-api-access-sjhx9\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.261127 master-0 kubenswrapper[33572]: I1204 22:28:45.260879 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-catalog-content\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.261578 master-0 kubenswrapper[33572]: I1204 22:28:45.261553 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-utilities\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.262257 master-0 kubenswrapper[33572]: I1204 22:28:45.262231 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-catalog-content\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.283197 master-0 kubenswrapper[33572]: I1204 22:28:45.283066 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjhx9\" (UniqueName: \"kubernetes.io/projected/5c2c85e1-5174-43f4-a469-f49173bc4c4b-kube-api-access-sjhx9\") pod \"community-operators-8fngp\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.377090 master-0 kubenswrapper[33572]: I1204 22:28:45.376976 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:45.852344 master-0 kubenswrapper[33572]: I1204 22:28:45.852274 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8fngp"] Dec 04 22:28:45.856199 master-0 kubenswrapper[33572]: W1204 22:28:45.856138 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c2c85e1_5174_43f4_a469_f49173bc4c4b.slice/crio-8bbc94802af29fb63db0fc84dc20ebbd3a1cecd8a859120533948cfdf83bb51f WatchSource:0}: Error finding container 8bbc94802af29fb63db0fc84dc20ebbd3a1cecd8a859120533948cfdf83bb51f: Status 404 returned error can't find the container with id 8bbc94802af29fb63db0fc84dc20ebbd3a1cecd8a859120533948cfdf83bb51f Dec 04 22:28:46.462676 master-0 kubenswrapper[33572]: I1204 22:28:46.462590 33572 generic.go:334] "Generic (PLEG): container finished" podID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerID="48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb" exitCode=0 Dec 04 22:28:46.463438 master-0 kubenswrapper[33572]: I1204 22:28:46.462684 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fngp" event={"ID":"5c2c85e1-5174-43f4-a469-f49173bc4c4b","Type":"ContainerDied","Data":"48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb"} Dec 04 22:28:46.463438 master-0 kubenswrapper[33572]: I1204 22:28:46.462736 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fngp" event={"ID":"5c2c85e1-5174-43f4-a469-f49173bc4c4b","Type":"ContainerStarted","Data":"8bbc94802af29fb63db0fc84dc20ebbd3a1cecd8a859120533948cfdf83bb51f"} Dec 04 22:28:47.476514 master-0 kubenswrapper[33572]: I1204 22:28:47.476438 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fngp" event={"ID":"5c2c85e1-5174-43f4-a469-f49173bc4c4b","Type":"ContainerStarted","Data":"bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69"} Dec 04 22:28:48.487683 master-0 kubenswrapper[33572]: I1204 22:28:48.487493 33572 generic.go:334] "Generic (PLEG): container finished" podID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerID="bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69" exitCode=0 Dec 04 22:28:48.487683 master-0 kubenswrapper[33572]: I1204 22:28:48.487554 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fngp" event={"ID":"5c2c85e1-5174-43f4-a469-f49173bc4c4b","Type":"ContainerDied","Data":"bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69"} Dec 04 22:28:53.556581 master-0 kubenswrapper[33572]: 
I1204 22:28:53.555994 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fngp" event={"ID":"5c2c85e1-5174-43f4-a469-f49173bc4c4b","Type":"ContainerStarted","Data":"6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468"} Dec 04 22:28:53.592827 master-0 kubenswrapper[33572]: I1204 22:28:53.592747 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8fngp" podStartSLOduration=2.928806245 podStartE2EDuration="9.592727023s" podCreationTimestamp="2025-12-04 22:28:44 +0000 UTC" firstStartedPulling="2025-12-04 22:28:46.465224565 +0000 UTC m=+590.192750214" lastFinishedPulling="2025-12-04 22:28:53.129145343 +0000 UTC m=+596.856670992" observedRunningTime="2025-12-04 22:28:53.586657494 +0000 UTC m=+597.314183143" watchObservedRunningTime="2025-12-04 22:28:53.592727023 +0000 UTC m=+597.320252662" Dec 04 22:28:55.378010 master-0 kubenswrapper[33572]: I1204 22:28:55.377955 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:55.378010 master-0 kubenswrapper[33572]: I1204 22:28:55.378017 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:28:56.421330 master-0 kubenswrapper[33572]: I1204 22:28:56.421164 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8fngp" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="registry-server" probeResult="failure" output=< Dec 04 22:28:56.421330 master-0 kubenswrapper[33572]: timeout: failed to connect service ":50051" within 1s Dec 04 22:28:56.421330 master-0 kubenswrapper[33572]: > Dec 04 22:29:01.115896 master-0 kubenswrapper[33572]: I1204 22:29:01.115803 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-85bc976bd6-scgdf" Dec 04 22:29:05.452637 master-0 kubenswrapper[33572]: I1204 22:29:05.452587 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:29:05.513117 master-0 kubenswrapper[33572]: I1204 22:29:05.513044 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:29:06.277399 master-0 kubenswrapper[33572]: I1204 22:29:06.277303 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8fngp"] Dec 04 22:29:06.717246 master-0 kubenswrapper[33572]: I1204 22:29:06.717075 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8fngp" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="registry-server" containerID="cri-o://6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468" gracePeriod=2 Dec 04 22:29:07.041530 master-0 kubenswrapper[33572]: I1204 22:29:07.034574 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2"] Dec 04 22:29:07.041530 master-0 kubenswrapper[33572]: I1204 22:29:07.035685 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:07.052067 master-0 kubenswrapper[33572]: I1204 22:29:07.051605 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 04 22:29:07.104531 master-0 kubenswrapper[33572]: I1204 22:29:07.088614 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-mbggv"] Dec 04 22:29:07.104531 master-0 kubenswrapper[33572]: I1204 22:29:07.091795 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.104531 master-0 kubenswrapper[33572]: I1204 22:29:07.100622 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2"] Dec 04 22:29:07.104531 master-0 kubenswrapper[33572]: I1204 22:29:07.102926 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 04 22:29:07.104531 master-0 kubenswrapper[33572]: I1204 22:29:07.103196 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 04 22:29:07.167574 master-0 kubenswrapper[33572]: I1204 22:29:07.156082 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s64v\" (UniqueName: \"kubernetes.io/projected/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-kube-api-access-6s64v\") pod \"frr-k8s-webhook-server-7fcb986d4-27xx2\" (UID: \"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:07.167574 master-0 kubenswrapper[33572]: I1204 22:29:07.156213 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-27xx2\" (UID: \"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:07.186829 master-0 kubenswrapper[33572]: I1204 22:29:07.185395 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-clzpp"] Dec 04 22:29:07.194576 master-0 kubenswrapper[33572]: I1204 22:29:07.191360 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.196005 master-0 kubenswrapper[33572]: I1204 22:29:07.195121 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 04 22:29:07.196005 master-0 kubenswrapper[33572]: I1204 22:29:07.195170 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 04 22:29:07.196005 master-0 kubenswrapper[33572]: I1204 22:29:07.195304 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.257811 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-frr-conf\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.257932 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92452d91-986b-42fc-9778-2f78ad4482a9-metrics-certs\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.257983 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fsf\" (UniqueName: \"kubernetes.io/projected/92452d91-986b-42fc-9778-2f78ad4482a9-kube-api-access-b5fsf\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.258040 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s64v\" (UniqueName: \"kubernetes.io/projected/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-kube-api-access-6s64v\") pod \"frr-k8s-webhook-server-7fcb986d4-27xx2\" (UID: \"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.258064 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-frr-sockets\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.258097 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-27xx2\" (UID: \"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.258126 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/92452d91-986b-42fc-9778-2f78ad4482a9-frr-startup\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.258156 33572 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-metrics\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.258320 master-0 kubenswrapper[33572]: I1204 22:29:07.258191 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-reloader\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.258770 master-0 kubenswrapper[33572]: E1204 22:29:07.258698 33572 secret.go:189] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 04 22:29:07.258770 master-0 kubenswrapper[33572]: E1204 22:29:07.258747 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-cert podName:f4b97d80-d72e-4c4c-92f0-4c9a983c5fca nodeName:}" failed. No retries permitted until 2025-12-04 22:29:07.758730439 +0000 UTC m=+611.486256088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-cert") pod "frr-k8s-webhook-server-7fcb986d4-27xx2" (UID: "f4b97d80-d72e-4c4c-92f0-4c9a983c5fca") : secret "frr-k8s-webhook-server-cert" not found Dec 04 22:29:07.273561 master-0 kubenswrapper[33572]: I1204 22:29:07.273233 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-v5nvt"] Dec 04 22:29:07.274946 master-0 kubenswrapper[33572]: I1204 22:29:07.274925 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.289493 master-0 kubenswrapper[33572]: I1204 22:29:07.280455 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 04 22:29:07.313590 master-0 kubenswrapper[33572]: I1204 22:29:07.309008 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-v5nvt"] Dec 04 22:29:07.313590 master-0 kubenswrapper[33572]: I1204 22:29:07.310656 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s64v\" (UniqueName: \"kubernetes.io/projected/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-kube-api-access-6s64v\") pod \"frr-k8s-webhook-server-7fcb986d4-27xx2\" (UID: \"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:07.360914 master-0 kubenswrapper[33572]: I1204 22:29:07.359557 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cc0da214-23b6-4e1e-aba6-bb88fe145246-metallb-excludel2\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.360914 master-0 kubenswrapper[33572]: I1204 22:29:07.359633 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvldn\" (UniqueName: \"kubernetes.io/projected/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-kube-api-access-gvldn\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.360914 master-0 kubenswrapper[33572]: I1204 22:29:07.359686 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.360914 master-0 kubenswrapper[33572]: I1204 22:29:07.359710 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92452d91-986b-42fc-9778-2f78ad4482a9-metrics-certs\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.360914 master-0 kubenswrapper[33572]: I1204 22:29:07.359746 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-metrics-certs\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.360914 master-0 kubenswrapper[33572]: I1204 22:29:07.360685 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgvm\" (UniqueName: \"kubernetes.io/projected/cc0da214-23b6-4e1e-aba6-bb88fe145246-kube-api-access-6hgvm\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.360914 master-0 kubenswrapper[33572]: I1204 22:29:07.360773 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fsf\" (UniqueName: 
\"kubernetes.io/projected/92452d91-986b-42fc-9778-2f78ad4482a9-kube-api-access-b5fsf\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.360914 master-0 kubenswrapper[33572]: I1204 22:29:07.360852 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-metrics-certs\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.361342 master-0 kubenswrapper[33572]: I1204 22:29:07.360996 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-frr-sockets\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.361342 master-0 kubenswrapper[33572]: I1204 22:29:07.361139 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/92452d91-986b-42fc-9778-2f78ad4482a9-frr-startup\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.361342 master-0 kubenswrapper[33572]: I1204 22:29:07.361196 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-metrics\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.361342 master-0 kubenswrapper[33572]: I1204 22:29:07.361308 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-reloader\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.361466 master-0 kubenswrapper[33572]: I1204 22:29:07.361350 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-frr-conf\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.361466 master-0 kubenswrapper[33572]: I1204 22:29:07.361383 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-cert\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.362319 master-0 kubenswrapper[33572]: I1204 22:29:07.362288 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-frr-sockets\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.362898 master-0 kubenswrapper[33572]: I1204 22:29:07.362849 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-reloader\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 
04 22:29:07.363145 master-0 kubenswrapper[33572]: I1204 22:29:07.363078 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-metrics\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.363289 master-0 kubenswrapper[33572]: I1204 22:29:07.363248 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92452d91-986b-42fc-9778-2f78ad4482a9-metrics-certs\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.363335 master-0 kubenswrapper[33572]: I1204 22:29:07.363264 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/92452d91-986b-42fc-9778-2f78ad4482a9-frr-conf\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.366098 master-0 kubenswrapper[33572]: I1204 22:29:07.366062 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/92452d91-986b-42fc-9778-2f78ad4482a9-frr-startup\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.395292 master-0 kubenswrapper[33572]: I1204 22:29:07.395249 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fsf\" (UniqueName: \"kubernetes.io/projected/92452d91-986b-42fc-9778-2f78ad4482a9-kube-api-access-b5fsf\") pod \"frr-k8s-mbggv\" (UID: \"92452d91-986b-42fc-9778-2f78ad4482a9\") " pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.437043 master-0 kubenswrapper[33572]: I1204 22:29:07.436988 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:29:07.466726 master-0 kubenswrapper[33572]: I1204 22:29:07.466658 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-cert\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.466943 master-0 kubenswrapper[33572]: I1204 22:29:07.466767 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cc0da214-23b6-4e1e-aba6-bb88fe145246-metallb-excludel2\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.466943 master-0 kubenswrapper[33572]: I1204 22:29:07.466813 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvldn\" (UniqueName: \"kubernetes.io/projected/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-kube-api-access-gvldn\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.466943 master-0 kubenswrapper[33572]: I1204 22:29:07.466856 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.467203 master-0 kubenswrapper[33572]: I1204 22:29:07.467136 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-metrics-certs\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.467328 master-0 kubenswrapper[33572]: I1204 22:29:07.467245 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgvm\" (UniqueName: \"kubernetes.io/projected/cc0da214-23b6-4e1e-aba6-bb88fe145246-kube-api-access-6hgvm\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.467328 master-0 kubenswrapper[33572]: E1204 22:29:07.467178 33572 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 22:29:07.467391 master-0 kubenswrapper[33572]: I1204 22:29:07.467337 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-metrics-certs\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.467427 master-0 kubenswrapper[33572]: E1204 22:29:07.467392 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist podName:cc0da214-23b6-4e1e-aba6-bb88fe145246 nodeName:}" failed. No retries permitted until 2025-12-04 22:29:07.967369928 +0000 UTC m=+611.694895757 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist") pod "speaker-clzpp" (UID: "cc0da214-23b6-4e1e-aba6-bb88fe145246") : secret "metallb-memberlist" not found Dec 04 22:29:07.468331 master-0 kubenswrapper[33572]: I1204 22:29:07.468253 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cc0da214-23b6-4e1e-aba6-bb88fe145246-metallb-excludel2\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.469191 master-0 kubenswrapper[33572]: I1204 22:29:07.469135 33572 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 22:29:07.470261 master-0 kubenswrapper[33572]: I1204 22:29:07.470217 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-metrics-certs\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.471181 master-0 kubenswrapper[33572]: I1204 22:29:07.471139 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-metrics-certs\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.473274 master-0 kubenswrapper[33572]: I1204 22:29:07.473232 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:07.486067 master-0 kubenswrapper[33572]: I1204 22:29:07.486013 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-cert\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.511226 master-0 kubenswrapper[33572]: I1204 22:29:07.511123 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvldn\" (UniqueName: \"kubernetes.io/projected/421b3d90-c030-4f07-bbfa-bcb40fcfef1f-kube-api-access-gvldn\") pod \"controller-f8648f98b-v5nvt\" (UID: \"421b3d90-c030-4f07-bbfa-bcb40fcfef1f\") " pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.513997 master-0 kubenswrapper[33572]: I1204 22:29:07.513940 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgvm\" (UniqueName: \"kubernetes.io/projected/cc0da214-23b6-4e1e-aba6-bb88fe145246-kube-api-access-6hgvm\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.569261 master-0 kubenswrapper[33572]: I1204 22:29:07.569146 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjhx9\" (UniqueName: \"kubernetes.io/projected/5c2c85e1-5174-43f4-a469-f49173bc4c4b-kube-api-access-sjhx9\") pod \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " Dec 04 22:29:07.569452 master-0 kubenswrapper[33572]: I1204 22:29:07.569312 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-utilities\") pod \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " Dec 04 22:29:07.569452 master-0 kubenswrapper[33572]: I1204 22:29:07.569359 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-catalog-content\") pod \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\" (UID: \"5c2c85e1-5174-43f4-a469-f49173bc4c4b\") " Dec 04 22:29:07.570593 master-0 kubenswrapper[33572]: I1204 22:29:07.570533 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-utilities" (OuterVolumeSpecName: "utilities") pod "5c2c85e1-5174-43f4-a469-f49173bc4c4b" (UID: "5c2c85e1-5174-43f4-a469-f49173bc4c4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:29:07.577042 master-0 kubenswrapper[33572]: I1204 22:29:07.576582 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2c85e1-5174-43f4-a469-f49173bc4c4b-kube-api-access-sjhx9" (OuterVolumeSpecName: "kube-api-access-sjhx9") pod "5c2c85e1-5174-43f4-a469-f49173bc4c4b" (UID: "5c2c85e1-5174-43f4-a469-f49173bc4c4b"). InnerVolumeSpecName "kube-api-access-sjhx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:29:07.619234 master-0 kubenswrapper[33572]: I1204 22:29:07.619155 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c2c85e1-5174-43f4-a469-f49173bc4c4b" (UID: "5c2c85e1-5174-43f4-a469-f49173bc4c4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:29:07.644930 master-0 kubenswrapper[33572]: I1204 22:29:07.644851 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:07.671128 master-0 kubenswrapper[33572]: I1204 22:29:07.671073 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjhx9\" (UniqueName: \"kubernetes.io/projected/5c2c85e1-5174-43f4-a469-f49173bc4c4b-kube-api-access-sjhx9\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:07.671128 master-0 kubenswrapper[33572]: I1204 22:29:07.671123 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:07.671348 master-0 kubenswrapper[33572]: I1204 22:29:07.671152 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c2c85e1-5174-43f4-a469-f49173bc4c4b-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:07.729062 master-0 kubenswrapper[33572]: I1204 22:29:07.728985 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerStarted","Data":"42b9f69a68b42f1df5ef84917293f2a1ab6af5d8317f898bf8883bacf1f55d66"} Dec 04 22:29:07.731394 master-0 kubenswrapper[33572]: I1204 22:29:07.731332 33572 generic.go:334] "Generic (PLEG): container finished" podID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerID="6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468" exitCode=0 Dec 04 22:29:07.731483 master-0 kubenswrapper[33572]: I1204 22:29:07.731390 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fngp" event={"ID":"5c2c85e1-5174-43f4-a469-f49173bc4c4b","Type":"ContainerDied","Data":"6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468"} Dec 04 22:29:07.731483 master-0 kubenswrapper[33572]: I1204 22:29:07.731429 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fngp" event={"ID":"5c2c85e1-5174-43f4-a469-f49173bc4c4b","Type":"ContainerDied","Data":"8bbc94802af29fb63db0fc84dc20ebbd3a1cecd8a859120533948cfdf83bb51f"} Dec 04 22:29:07.731483 master-0 kubenswrapper[33572]: I1204 22:29:07.731452 33572 scope.go:117] "RemoveContainer" containerID="6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468" Dec 04 22:29:07.731627 master-0 kubenswrapper[33572]: I1204 22:29:07.731538 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8fngp" Dec 04 22:29:07.764478 master-0 kubenswrapper[33572]: I1204 22:29:07.764418 33572 scope.go:117] "RemoveContainer" containerID="bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69" Dec 04 22:29:07.784658 master-0 kubenswrapper[33572]: I1204 22:29:07.773301 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-27xx2\" (UID: \"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:07.784658 master-0 kubenswrapper[33572]: I1204 22:29:07.781823 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f4b97d80-d72e-4c4c-92f0-4c9a983c5fca-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-27xx2\" (UID: \"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:07.791849 master-0 kubenswrapper[33572]: I1204 22:29:07.791784 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8fngp"] Dec 04 22:29:07.795832 master-0 kubenswrapper[33572]: I1204 22:29:07.795786 33572 scope.go:117] "RemoveContainer" containerID="48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb" Dec 04 22:29:07.797290 master-0 kubenswrapper[33572]: I1204 22:29:07.797237 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8fngp"] Dec 04 22:29:07.844476 master-0 kubenswrapper[33572]: I1204 22:29:07.844435 33572 scope.go:117] "RemoveContainer" containerID="6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468" Dec 04 22:29:07.844971 master-0 kubenswrapper[33572]: E1204 22:29:07.844935 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468\": container with ID starting with 6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468 not found: ID does not exist" containerID="6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468" Dec 04 22:29:07.845041 master-0 kubenswrapper[33572]: I1204 22:29:07.844970 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468"} err="failed to get container status \"6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468\": rpc error: code = NotFound desc = could not find container \"6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468\": container with ID starting with 6fd3886e39f46dac4f712b525175b50b11504f208d4923296ce5ea70c2a3e468 not found: ID does not exist" Dec 04 22:29:07.845041 master-0 kubenswrapper[33572]: I1204 22:29:07.844998 33572 scope.go:117] "RemoveContainer" containerID="bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69" Dec 04 22:29:07.845436 master-0 kubenswrapper[33572]: E1204 22:29:07.845387 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69\": container with ID starting with bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69 not found: ID does not exist" 
containerID="bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69" Dec 04 22:29:07.845496 master-0 kubenswrapper[33572]: I1204 22:29:07.845436 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69"} err="failed to get container status \"bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69\": rpc error: code = NotFound desc = could not find container \"bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69\": container with ID starting with bbfa70a20200480c3f757ebc377f0b1df5a73b57c6759a9982f441c371a4cb69 not found: ID does not exist" Dec 04 22:29:07.845496 master-0 kubenswrapper[33572]: I1204 22:29:07.845466 33572 scope.go:117] "RemoveContainer" containerID="48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb" Dec 04 22:29:07.846019 master-0 kubenswrapper[33572]: E1204 22:29:07.845988 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb\": container with ID starting with 48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb not found: ID does not exist" containerID="48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb" Dec 04 22:29:07.846066 master-0 kubenswrapper[33572]: I1204 22:29:07.846015 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb"} err="failed to get container status \"48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb\": rpc error: code = NotFound desc = could not find container \"48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb\": container with ID starting with 48e41b7946939be9ff204d19151e1b03177cd36e17d1c6b4e61c23985e0a3acb not found: ID does not exist" Dec 04 22:29:07.976186 master-0 kubenswrapper[33572]: I1204 22:29:07.976124 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:07.976423 master-0 kubenswrapper[33572]: E1204 22:29:07.976292 33572 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 22:29:07.976423 master-0 kubenswrapper[33572]: E1204 22:29:07.976397 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist podName:cc0da214-23b6-4e1e-aba6-bb88fe145246 nodeName:}" failed. No retries permitted until 2025-12-04 22:29:08.976376278 +0000 UTC m=+612.703901927 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist") pod "speaker-clzpp" (UID: "cc0da214-23b6-4e1e-aba6-bb88fe145246") : secret "metallb-memberlist" not found Dec 04 22:29:08.026217 master-0 kubenswrapper[33572]: I1204 22:29:08.026178 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:08.103710 master-0 kubenswrapper[33572]: I1204 22:29:08.103646 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-v5nvt"] Dec 04 22:29:08.109328 master-0 kubenswrapper[33572]: W1204 22:29:08.109249 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421b3d90_c030_4f07_bbfa_bcb40fcfef1f.slice/crio-710fc860388259e2445037569ebcb0d9916616df9299acfe739fe96705f3c5bf WatchSource:0}: Error finding container 710fc860388259e2445037569ebcb0d9916616df9299acfe739fe96705f3c5bf: Status 404 returned error can't find the container with id 710fc860388259e2445037569ebcb0d9916616df9299acfe739fe96705f3c5bf Dec 04 22:29:08.460518 master-0 kubenswrapper[33572]: I1204 22:29:08.460440 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2"] Dec 04 22:29:08.469059 master-0 kubenswrapper[33572]: W1204 22:29:08.469006 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4b97d80_d72e_4c4c_92f0_4c9a983c5fca.slice/crio-32b096948966fe18312f311e11ed5dd081a25d010a70f9cbc7493657dc81cdcd WatchSource:0}: Error finding container 32b096948966fe18312f311e11ed5dd081a25d010a70f9cbc7493657dc81cdcd: Status 404 returned error can't find the container with id 32b096948966fe18312f311e11ed5dd081a25d010a70f9cbc7493657dc81cdcd Dec 04 22:29:08.510516 master-0 kubenswrapper[33572]: I1204 22:29:08.510427 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-59s5q"] Dec 04 22:29:08.511143 master-0 kubenswrapper[33572]: E1204 22:29:08.511095 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="extract-utilities" Dec 04 22:29:08.511233 master-0 kubenswrapper[33572]: I1204 22:29:08.511142 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="extract-utilities" Dec 04 22:29:08.511233 master-0 kubenswrapper[33572]: E1204 22:29:08.511166 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="registry-server" Dec 04 22:29:08.511233 master-0 kubenswrapper[33572]: I1204 22:29:08.511181 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="registry-server" Dec 04 22:29:08.511381 master-0 kubenswrapper[33572]: E1204 22:29:08.511247 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="extract-content" Dec 04 22:29:08.511381 master-0 kubenswrapper[33572]: I1204 22:29:08.511262 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="extract-content" Dec 04 22:29:08.511666 master-0 kubenswrapper[33572]: I1204 22:29:08.511627 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" containerName="registry-server" Dec 04 22:29:08.514115 master-0 kubenswrapper[33572]: I1204 22:29:08.514076 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.539760 master-0 kubenswrapper[33572]: I1204 22:29:08.539698 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2c85e1-5174-43f4-a469-f49173bc4c4b" path="/var/lib/kubelet/pods/5c2c85e1-5174-43f4-a469-f49173bc4c4b/volumes" Dec 04 22:29:08.540373 master-0 kubenswrapper[33572]: I1204 22:29:08.540298 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59s5q"] Dec 04 22:29:08.586910 master-0 kubenswrapper[33572]: I1204 22:29:08.586849 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-utilities\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.587130 master-0 kubenswrapper[33572]: I1204 22:29:08.586943 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-catalog-content\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.587130 master-0 kubenswrapper[33572]: I1204 22:29:08.587013 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw5nd\" (UniqueName: \"kubernetes.io/projected/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-kube-api-access-fw5nd\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.688646 master-0 kubenswrapper[33572]: I1204 22:29:08.688557 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-utilities\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.688646 master-0 kubenswrapper[33572]: I1204 22:29:08.688652 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-catalog-content\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.688961 master-0 kubenswrapper[33572]: I1204 22:29:08.688938 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw5nd\" (UniqueName: \"kubernetes.io/projected/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-kube-api-access-fw5nd\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.689856 master-0 kubenswrapper[33572]: I1204 22:29:08.689800 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-utilities\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.690274 master-0 kubenswrapper[33572]: I1204 22:29:08.690223 33572 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-catalog-content\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.712797 master-0 kubenswrapper[33572]: I1204 22:29:08.712668 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw5nd\" (UniqueName: \"kubernetes.io/projected/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-kube-api-access-fw5nd\") pod \"certified-operators-59s5q\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.742523 master-0 kubenswrapper[33572]: I1204 22:29:08.742217 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" event={"ID":"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca","Type":"ContainerStarted","Data":"32b096948966fe18312f311e11ed5dd081a25d010a70f9cbc7493657dc81cdcd"} Dec 04 22:29:08.745124 master-0 kubenswrapper[33572]: I1204 22:29:08.745089 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v5nvt" event={"ID":"421b3d90-c030-4f07-bbfa-bcb40fcfef1f","Type":"ContainerStarted","Data":"9c28fb613586385b53a3dae4883d759b1c86aa1ad845df59c8518f348b847b80"} Dec 04 22:29:08.745251 master-0 kubenswrapper[33572]: I1204 22:29:08.745125 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v5nvt" event={"ID":"421b3d90-c030-4f07-bbfa-bcb40fcfef1f","Type":"ContainerStarted","Data":"710fc860388259e2445037569ebcb0d9916616df9299acfe739fe96705f3c5bf"} Dec 04 22:29:08.832086 master-0 kubenswrapper[33572]: I1204 22:29:08.832031 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:08.998278 master-0 kubenswrapper[33572]: I1204 22:29:08.997451 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:09.009073 master-0 kubenswrapper[33572]: I1204 22:29:09.008856 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cc0da214-23b6-4e1e-aba6-bb88fe145246-memberlist\") pod \"speaker-clzpp\" (UID: \"cc0da214-23b6-4e1e-aba6-bb88fe145246\") " pod="metallb-system/speaker-clzpp" Dec 04 22:29:09.316480 master-0 kubenswrapper[33572]: I1204 22:29:09.312975 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-clzpp" Dec 04 22:29:09.360556 master-0 kubenswrapper[33572]: I1204 22:29:09.360189 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs"] Dec 04 22:29:09.362111 master-0 kubenswrapper[33572]: I1204 22:29:09.361584 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:09.363779 master-0 kubenswrapper[33572]: I1204 22:29:09.363751 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 04 22:29:09.383027 master-0 kubenswrapper[33572]: I1204 22:29:09.382975 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp"] Dec 04 22:29:09.384411 master-0 kubenswrapper[33572]: I1204 22:29:09.384378 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" Dec 04 22:29:09.419891 master-0 kubenswrapper[33572]: I1204 22:29:09.419536 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mcmbn"] Dec 04 22:29:09.438524 master-0 kubenswrapper[33572]: I1204 22:29:09.422167 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.438524 master-0 kubenswrapper[33572]: I1204 22:29:09.427990 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59s5q"] Dec 04 22:29:09.438524 master-0 kubenswrapper[33572]: I1204 22:29:09.430753 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs"] Dec 04 22:29:09.438524 master-0 kubenswrapper[33572]: I1204 22:29:09.437579 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp"] Dec 04 22:29:09.520155 master-0 kubenswrapper[33572]: I1204 22:29:09.513101 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm5l8\" (UniqueName: \"kubernetes.io/projected/0a44f088-be39-47f3-8e34-9cade0076325-kube-api-access-lm5l8\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.520155 master-0 kubenswrapper[33572]: I1204 22:29:09.513317 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-ovs-socket\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.520155 master-0 kubenswrapper[33572]: I1204 22:29:09.513411 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-nmstate-lock\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.520155 master-0 kubenswrapper[33572]: I1204 22:29:09.513454 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khm2n\" (UniqueName: \"kubernetes.io/projected/ed12588b-8304-4d07-9055-6944c540d15f-kube-api-access-khm2n\") pod \"nmstate-metrics-7f946cbc9-8rwmp\" (UID: \"ed12588b-8304-4d07-9055-6944c540d15f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" Dec 04 22:29:09.520155 master-0 kubenswrapper[33572]: I1204 22:29:09.513492 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-dbus-socket\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.520155 master-0 kubenswrapper[33572]: I1204 22:29:09.513538 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvn2\" (UniqueName: \"kubernetes.io/projected/4e7a211d-0841-473d-87c2-953375146be8-kube-api-access-kfvn2\") pod \"nmstate-webhook-5f6d4c5ccb-265zs\" (UID: \"4e7a211d-0841-473d-87c2-953375146be8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:09.520155 master-0 kubenswrapper[33572]: I1204 22:29:09.513561 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4e7a211d-0841-473d-87c2-953375146be8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-265zs\" (UID: \"4e7a211d-0841-473d-87c2-953375146be8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:09.593172 master-0 kubenswrapper[33572]: I1204 22:29:09.586372 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb"] Dec 04 22:29:09.593172 master-0 kubenswrapper[33572]: I1204 22:29:09.587294 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.593172 master-0 kubenswrapper[33572]: I1204 22:29:09.589798 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 04 22:29:09.593172 master-0 kubenswrapper[33572]: I1204 22:29:09.590021 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 04 22:29:09.621549 master-0 kubenswrapper[33572]: I1204 22:29:09.614527 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khm2n\" (UniqueName: \"kubernetes.io/projected/ed12588b-8304-4d07-9055-6944c540d15f-kube-api-access-khm2n\") pod \"nmstate-metrics-7f946cbc9-8rwmp\" (UID: \"ed12588b-8304-4d07-9055-6944c540d15f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" Dec 04 22:29:09.621549 master-0 kubenswrapper[33572]: I1204 22:29:09.614636 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-dbus-socket\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.621549 master-0 kubenswrapper[33572]: I1204 22:29:09.614665 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvn2\" (UniqueName: \"kubernetes.io/projected/4e7a211d-0841-473d-87c2-953375146be8-kube-api-access-kfvn2\") pod \"nmstate-webhook-5f6d4c5ccb-265zs\" (UID: \"4e7a211d-0841-473d-87c2-953375146be8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:09.621549 master-0 kubenswrapper[33572]: I1204 22:29:09.614688 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4e7a211d-0841-473d-87c2-953375146be8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-265zs\" (UID: \"4e7a211d-0841-473d-87c2-953375146be8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:09.621549 
master-0 kubenswrapper[33572]: I1204 22:29:09.614721 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm5l8\" (UniqueName: \"kubernetes.io/projected/0a44f088-be39-47f3-8e34-9cade0076325-kube-api-access-lm5l8\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.621549 master-0 kubenswrapper[33572]: I1204 22:29:09.614819 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-ovs-socket\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.621549 master-0 kubenswrapper[33572]: I1204 22:29:09.614838 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-nmstate-lock\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.621549 master-0 kubenswrapper[33572]: I1204 22:29:09.614910 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-nmstate-lock\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.621549 master-0 kubenswrapper[33572]: I1204 22:29:09.615686 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-dbus-socket\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.628073 master-0 kubenswrapper[33572]: I1204 22:29:09.622920 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0a44f088-be39-47f3-8e34-9cade0076325-ovs-socket\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.632582 master-0 kubenswrapper[33572]: I1204 22:29:09.630536 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4e7a211d-0841-473d-87c2-953375146be8-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-265zs\" (UID: \"4e7a211d-0841-473d-87c2-953375146be8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:09.632582 master-0 kubenswrapper[33572]: I1204 22:29:09.630625 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb"] Dec 04 22:29:09.641161 master-0 kubenswrapper[33572]: I1204 22:29:09.641129 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khm2n\" (UniqueName: \"kubernetes.io/projected/ed12588b-8304-4d07-9055-6944c540d15f-kube-api-access-khm2n\") pod \"nmstate-metrics-7f946cbc9-8rwmp\" (UID: \"ed12588b-8304-4d07-9055-6944c540d15f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" Dec 04 22:29:09.643309 master-0 kubenswrapper[33572]: I1204 22:29:09.643281 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvn2\" (UniqueName: 
\"kubernetes.io/projected/4e7a211d-0841-473d-87c2-953375146be8-kube-api-access-kfvn2\") pod \"nmstate-webhook-5f6d4c5ccb-265zs\" (UID: \"4e7a211d-0841-473d-87c2-953375146be8\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:09.645984 master-0 kubenswrapper[33572]: I1204 22:29:09.643688 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm5l8\" (UniqueName: \"kubernetes.io/projected/0a44f088-be39-47f3-8e34-9cade0076325-kube-api-access-lm5l8\") pod \"nmstate-handler-mcmbn\" (UID: \"0a44f088-be39-47f3-8e34-9cade0076325\") " pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.716562 master-0 kubenswrapper[33572]: I1204 22:29:09.716518 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.716562 master-0 kubenswrapper[33572]: I1204 22:29:09.716562 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.716846 master-0 kubenswrapper[33572]: I1204 22:29:09.716586 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkhkx\" (UniqueName: \"kubernetes.io/projected/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-kube-api-access-wkhkx\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.757558 master-0 kubenswrapper[33572]: I1204 22:29:09.757485 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:09.791736 master-0 kubenswrapper[33572]: I1204 22:29:09.791665 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-795b68ff6d-p7dxw"] Dec 04 22:29:09.793314 master-0 kubenswrapper[33572]: I1204 22:29:09.793098 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:09.794357 master-0 kubenswrapper[33572]: I1204 22:29:09.794323 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" Dec 04 22:29:09.801251 master-0 kubenswrapper[33572]: I1204 22:29:09.800786 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-v5nvt" event={"ID":"421b3d90-c030-4f07-bbfa-bcb40fcfef1f","Type":"ContainerStarted","Data":"68ca97b14b65a66ea13263c1afff33bef9805e9824176b95aafd474cac850b35"} Dec 04 22:29:09.801251 master-0 kubenswrapper[33572]: I1204 22:29:09.800841 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:09.801960 master-0 kubenswrapper[33572]: I1204 22:29:09.801843 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-795b68ff6d-p7dxw"] Dec 04 22:29:09.804837 master-0 kubenswrapper[33572]: I1204 22:29:09.804781 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-clzpp" event={"ID":"cc0da214-23b6-4e1e-aba6-bb88fe145246","Type":"ContainerStarted","Data":"467d97a82fb50d2465b883af3b6508f2fca78b0ad6648248330d192cf7dbb846"} Dec 04 22:29:09.810980 master-0 kubenswrapper[33572]: I1204 22:29:09.808597 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:09.812027 master-0 kubenswrapper[33572]: I1204 22:29:09.811184 33572 generic.go:334] "Generic (PLEG): container finished" podID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerID="7812d950dc2a5238df5e790ede80d928a2f3777e9a6203d7fd80ddc09d5bafae" exitCode=0 Dec 04 22:29:09.812027 master-0 kubenswrapper[33572]: I1204 22:29:09.811232 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59s5q" event={"ID":"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610","Type":"ContainerDied","Data":"7812d950dc2a5238df5e790ede80d928a2f3777e9a6203d7fd80ddc09d5bafae"} Dec 04 22:29:09.812027 master-0 kubenswrapper[33572]: I1204 22:29:09.811259 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59s5q" event={"ID":"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610","Type":"ContainerStarted","Data":"7877a22b1ebf679b42f1a5f3610580ff7029831eabe8712e80d8e973ec2b0137"} Dec 04 22:29:09.831302 master-0 kubenswrapper[33572]: I1204 22:29:09.820694 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.831302 master-0 kubenswrapper[33572]: I1204 22:29:09.820748 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.831302 master-0 kubenswrapper[33572]: I1204 22:29:09.820926 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkhkx\" (UniqueName: \"kubernetes.io/projected/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-kube-api-access-wkhkx\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.831302 master-0 kubenswrapper[33572]: I1204 22:29:09.822014 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.836371 master-0 kubenswrapper[33572]: I1204 22:29:09.836331 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.852010 master-0 kubenswrapper[33572]: I1204 22:29:09.845285 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkhkx\" (UniqueName: \"kubernetes.io/projected/22dd80ed-ccaf-40b1-8837-a8aeb3c28140-kube-api-access-wkhkx\") pod \"nmstate-console-plugin-7fbb5f6569-twslb\" (UID: \"22dd80ed-ccaf-40b1-8837-a8aeb3c28140\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:09.855381 master-0 kubenswrapper[33572]: I1204 22:29:09.855304 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-v5nvt" podStartSLOduration=1.828645981 podStartE2EDuration="2.855286849s" podCreationTimestamp="2025-12-04 22:29:07 +0000 UTC" firstStartedPulling="2025-12-04 22:29:08.289990338 +0000 UTC m=+612.017515997" lastFinishedPulling="2025-12-04 22:29:09.316631216 +0000 UTC m=+613.044156865" observedRunningTime="2025-12-04 22:29:09.849390585 +0000 UTC m=+613.576916234" watchObservedRunningTime="2025-12-04 22:29:09.855286849 +0000 UTC m=+613.582812498" Dec 04 22:29:09.911851 master-0 kubenswrapper[33572]: I1204 22:29:09.911791 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" Dec 04 22:29:10.027905 master-0 kubenswrapper[33572]: I1204 22:29:10.025256 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-service-ca\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.027905 master-0 kubenswrapper[33572]: I1204 22:29:10.025318 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-oauth-serving-cert\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.027905 master-0 kubenswrapper[33572]: I1204 22:29:10.025349 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-console-config\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.027905 master-0 kubenswrapper[33572]: I1204 22:29:10.025383 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-trusted-ca-bundle\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.027905 master-0 kubenswrapper[33572]: I1204 22:29:10.025414 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntp2b\" (UniqueName: \"kubernetes.io/projected/800cb931-1357-490b-855e-d7b30b4a5593-kube-api-access-ntp2b\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.027905 master-0 kubenswrapper[33572]: I1204 22:29:10.025473 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/800cb931-1357-490b-855e-d7b30b4a5593-console-serving-cert\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.027905 master-0 kubenswrapper[33572]: I1204 22:29:10.025611 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/800cb931-1357-490b-855e-d7b30b4a5593-console-oauth-config\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.167840 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/800cb931-1357-490b-855e-d7b30b4a5593-console-serving-cert\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 
22:29:10.167978 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/800cb931-1357-490b-855e-d7b30b4a5593-console-oauth-config\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.168133 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-service-ca\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.168152 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-oauth-serving-cert\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.168170 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-console-config\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.168195 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-trusted-ca-bundle\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.168226 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntp2b\" (UniqueName: \"kubernetes.io/projected/800cb931-1357-490b-855e-d7b30b4a5593-kube-api-access-ntp2b\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.169309 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-service-ca\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.177266 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/800cb931-1357-490b-855e-d7b30b4a5593-console-oauth-config\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.178011 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-console-config\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " 
pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.179187 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/800cb931-1357-490b-855e-d7b30b4a5593-console-serving-cert\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.182531 master-0 kubenswrapper[33572]: I1204 22:29:10.179337 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-trusted-ca-bundle\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.198524 master-0 kubenswrapper[33572]: I1204 22:29:10.192983 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/800cb931-1357-490b-855e-d7b30b4a5593-oauth-serving-cert\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.203471 master-0 kubenswrapper[33572]: I1204 22:29:10.203426 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntp2b\" (UniqueName: \"kubernetes.io/projected/800cb931-1357-490b-855e-d7b30b4a5593-kube-api-access-ntp2b\") pod \"console-795b68ff6d-p7dxw\" (UID: \"800cb931-1357-490b-855e-d7b30b4a5593\") " pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.300442 master-0 kubenswrapper[33572]: I1204 22:29:10.300002 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs"] Dec 04 22:29:10.314270 master-0 kubenswrapper[33572]: I1204 22:29:10.314223 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb"] Dec 04 22:29:10.319248 master-0 kubenswrapper[33572]: W1204 22:29:10.319180 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7a211d_0841_473d_87c2_953375146be8.slice/crio-7c535ac3862552895175646796f51490d736e0d4d2f7b3b1fe2a199c1153bfc2 WatchSource:0}: Error finding container 7c535ac3862552895175646796f51490d736e0d4d2f7b3b1fe2a199c1153bfc2: Status 404 returned error can't find the container with id 7c535ac3862552895175646796f51490d736e0d4d2f7b3b1fe2a199c1153bfc2 Dec 04 22:29:10.345693 master-0 kubenswrapper[33572]: I1204 22:29:10.344680 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp"] Dec 04 22:29:10.424599 master-0 kubenswrapper[33572]: I1204 22:29:10.424497 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:10.829312 master-0 kubenswrapper[33572]: I1204 22:29:10.829272 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" event={"ID":"ed12588b-8304-4d07-9055-6944c540d15f","Type":"ContainerStarted","Data":"7cfd7062d5a643a042a71268913074ebc6af3bab11630bf8e1760085e6ccd65a"} Dec 04 22:29:10.830904 master-0 kubenswrapper[33572]: I1204 22:29:10.830830 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mcmbn" event={"ID":"0a44f088-be39-47f3-8e34-9cade0076325","Type":"ContainerStarted","Data":"dc02c9b44ad3a034e0e479432418cbcc79bc5419cb4c9c54b595faf26c92910f"} Dec 04 22:29:10.832838 master-0 kubenswrapper[33572]: I1204 22:29:10.832797 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-795b68ff6d-p7dxw"] Dec 04 22:29:10.833609 master-0 kubenswrapper[33572]: I1204 22:29:10.833577 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" event={"ID":"22dd80ed-ccaf-40b1-8837-a8aeb3c28140","Type":"ContainerStarted","Data":"b355b44cedb6bf8e7e96045e9526b0c3d68a102ba5b31b98b7a9787fe5c7efe5"} Dec 04 22:29:10.834805 master-0 kubenswrapper[33572]: W1204 22:29:10.834698 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800cb931_1357_490b_855e_d7b30b4a5593.slice/crio-84032c9254fc4855cd4320a034099d5bd1894ad49cf394d5197f12af79751918 WatchSource:0}: Error finding container 84032c9254fc4855cd4320a034099d5bd1894ad49cf394d5197f12af79751918: Status 404 returned error can't find the container with id 84032c9254fc4855cd4320a034099d5bd1894ad49cf394d5197f12af79751918 Dec 04 22:29:10.835851 master-0 kubenswrapper[33572]: I1204 22:29:10.835724 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-clzpp" event={"ID":"cc0da214-23b6-4e1e-aba6-bb88fe145246","Type":"ContainerStarted","Data":"ab256ce21de56d80429b6dcb86d9a9566b3fb3fcaa3f8bf123437a5fa38cfe55"} Dec 04 22:29:10.835851 master-0 kubenswrapper[33572]: I1204 22:29:10.835749 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-clzpp" event={"ID":"cc0da214-23b6-4e1e-aba6-bb88fe145246","Type":"ContainerStarted","Data":"b420d7770f283df52b1a828ee31fc0b0c53f0808c880029e2587951333e4b148"} Dec 04 22:29:10.835851 master-0 kubenswrapper[33572]: I1204 22:29:10.835802 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-clzpp" Dec 04 22:29:10.837208 master-0 kubenswrapper[33572]: I1204 22:29:10.837117 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" event={"ID":"4e7a211d-0841-473d-87c2-953375146be8","Type":"ContainerStarted","Data":"7c535ac3862552895175646796f51490d736e0d4d2f7b3b1fe2a199c1153bfc2"} Dec 04 22:29:10.839350 master-0 kubenswrapper[33572]: I1204 22:29:10.839310 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59s5q" event={"ID":"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610","Type":"ContainerStarted","Data":"7bce740548513a35b1fba24a420883b2a7523f5a54ed583acb5349a57f7217f4"} Dec 04 22:29:10.865339 master-0 kubenswrapper[33572]: I1204 22:29:10.864737 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-clzpp" podStartSLOduration=3.864714211 
podStartE2EDuration="3.864714211s" podCreationTimestamp="2025-12-04 22:29:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:29:10.856340009 +0000 UTC m=+614.583865658" watchObservedRunningTime="2025-12-04 22:29:10.864714211 +0000 UTC m=+614.592239870" Dec 04 22:29:11.850137 master-0 kubenswrapper[33572]: I1204 22:29:11.850084 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-795b68ff6d-p7dxw" event={"ID":"800cb931-1357-490b-855e-d7b30b4a5593","Type":"ContainerStarted","Data":"fbbf8963225973f6517e4c9d743f072ece53ba942981e984a6e98416fbf989dc"} Dec 04 22:29:11.850137 master-0 kubenswrapper[33572]: I1204 22:29:11.850135 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-795b68ff6d-p7dxw" event={"ID":"800cb931-1357-490b-855e-d7b30b4a5593","Type":"ContainerStarted","Data":"84032c9254fc4855cd4320a034099d5bd1894ad49cf394d5197f12af79751918"} Dec 04 22:29:11.853676 master-0 kubenswrapper[33572]: I1204 22:29:11.853580 33572 generic.go:334] "Generic (PLEG): container finished" podID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerID="7bce740548513a35b1fba24a420883b2a7523f5a54ed583acb5349a57f7217f4" exitCode=0 Dec 04 22:29:11.853676 master-0 kubenswrapper[33572]: I1204 22:29:11.853648 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59s5q" event={"ID":"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610","Type":"ContainerDied","Data":"7bce740548513a35b1fba24a420883b2a7523f5a54ed583acb5349a57f7217f4"} Dec 04 22:29:11.902996 master-0 kubenswrapper[33572]: I1204 22:29:11.902852 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-795b68ff6d-p7dxw" podStartSLOduration=2.9028328180000003 podStartE2EDuration="2.902832818s" podCreationTimestamp="2025-12-04 22:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:29:11.902491459 +0000 UTC m=+615.630017108" watchObservedRunningTime="2025-12-04 22:29:11.902832818 +0000 UTC m=+615.630358467" Dec 04 22:29:12.701351 master-0 kubenswrapper[33572]: I1204 22:29:12.701163 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-msm58"] Dec 04 22:29:12.704245 master-0 kubenswrapper[33572]: I1204 22:29:12.704202 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.718684 master-0 kubenswrapper[33572]: I1204 22:29:12.716007 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-msm58"] Dec 04 22:29:12.830269 master-0 kubenswrapper[33572]: I1204 22:29:12.830178 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltwcl\" (UniqueName: \"kubernetes.io/projected/43b9152b-94ea-4be0-a892-d0d7bc5f299e-kube-api-access-ltwcl\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.830269 master-0 kubenswrapper[33572]: I1204 22:29:12.830276 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-utilities\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.830714 master-0 kubenswrapper[33572]: I1204 22:29:12.830330 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-catalog-content\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.931684 master-0 kubenswrapper[33572]: I1204 22:29:12.931636 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltwcl\" (UniqueName: \"kubernetes.io/projected/43b9152b-94ea-4be0-a892-d0d7bc5f299e-kube-api-access-ltwcl\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.931684 master-0 kubenswrapper[33572]: I1204 22:29:12.931688 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-utilities\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.932539 master-0 kubenswrapper[33572]: I1204 22:29:12.932491 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-catalog-content\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.932618 master-0 kubenswrapper[33572]: I1204 22:29:12.932582 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-utilities\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.933050 master-0 kubenswrapper[33572]: I1204 22:29:12.933026 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-catalog-content\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " 
pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:12.951700 master-0 kubenswrapper[33572]: I1204 22:29:12.951597 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltwcl\" (UniqueName: \"kubernetes.io/projected/43b9152b-94ea-4be0-a892-d0d7bc5f299e-kube-api-access-ltwcl\") pod \"redhat-marketplace-msm58\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:13.035596 master-0 kubenswrapper[33572]: I1204 22:29:13.034570 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:15.499158 master-0 kubenswrapper[33572]: I1204 22:29:15.499096 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-msm58"] Dec 04 22:29:15.505955 master-0 kubenswrapper[33572]: W1204 22:29:15.505913 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43b9152b_94ea_4be0_a892_d0d7bc5f299e.slice/crio-c34d11449648c919d4f57fd73a9fac0be0b0a9754f6a32391ff1a1e1418021ce WatchSource:0}: Error finding container c34d11449648c919d4f57fd73a9fac0be0b0a9754f6a32391ff1a1e1418021ce: Status 404 returned error can't find the container with id c34d11449648c919d4f57fd73a9fac0be0b0a9754f6a32391ff1a1e1418021ce Dec 04 22:29:15.896683 master-0 kubenswrapper[33572]: I1204 22:29:15.896487 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" event={"ID":"4e7a211d-0841-473d-87c2-953375146be8","Type":"ContainerStarted","Data":"ee053ef97b75cef9c19ef2f3a45d89c3f4d0846c2bb3083fd6e9d2d14dfc8798"} Dec 04 22:29:15.899588 master-0 kubenswrapper[33572]: I1204 22:29:15.899525 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59s5q" event={"ID":"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610","Type":"ContainerStarted","Data":"272f41cd51619be85d71ef666cf16c84cd73ede27da9011d69f041d21915ac3c"} Dec 04 22:29:15.907474 master-0 kubenswrapper[33572]: I1204 22:29:15.907403 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" event={"ID":"f4b97d80-d72e-4c4c-92f0-4c9a983c5fca","Type":"ContainerStarted","Data":"5267ad03663ab8284c8e68b1774642b8c1b18a05519b280e9a736c79719e8dc4"} Dec 04 22:29:15.908134 master-0 kubenswrapper[33572]: I1204 22:29:15.908092 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:15.918399 master-0 kubenswrapper[33572]: I1204 22:29:15.918338 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" event={"ID":"ed12588b-8304-4d07-9055-6944c540d15f","Type":"ContainerStarted","Data":"249f94d3ef6821395c0bca33bc9ed2da15fcb12fcfd32fcae555c00a94bdea03"} Dec 04 22:29:15.918399 master-0 kubenswrapper[33572]: I1204 22:29:15.918400 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" event={"ID":"ed12588b-8304-4d07-9055-6944c540d15f","Type":"ContainerStarted","Data":"12ce6715700155559a397c1082268083d8c09c2419d1abb3b508058fbaaa88de"} Dec 04 22:29:15.919997 master-0 kubenswrapper[33572]: I1204 22:29:15.919964 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mcmbn" 
event={"ID":"0a44f088-be39-47f3-8e34-9cade0076325","Type":"ContainerStarted","Data":"4a189bdfc93aee6e4e5adf7f34796aebd130fe4e8d1bfce56bae698f8c0d4504"} Dec 04 22:29:15.920111 master-0 kubenswrapper[33572]: I1204 22:29:15.920092 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:15.921174 master-0 kubenswrapper[33572]: I1204 22:29:15.921148 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" event={"ID":"22dd80ed-ccaf-40b1-8837-a8aeb3c28140","Type":"ContainerStarted","Data":"6059505dd13cd48ec6852f8bf8d8fa4507794efda53be7ab99897f28807384e7"} Dec 04 22:29:15.922596 master-0 kubenswrapper[33572]: I1204 22:29:15.922560 33572 generic.go:334] "Generic (PLEG): container finished" podID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerID="b6ecac440dd0cbf4d787a7ad369053fc03f940003202de4b08de26c281b0098e" exitCode=0 Dec 04 22:29:15.922677 master-0 kubenswrapper[33572]: I1204 22:29:15.922618 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msm58" event={"ID":"43b9152b-94ea-4be0-a892-d0d7bc5f299e","Type":"ContainerDied","Data":"b6ecac440dd0cbf4d787a7ad369053fc03f940003202de4b08de26c281b0098e"} Dec 04 22:29:15.922677 master-0 kubenswrapper[33572]: I1204 22:29:15.922640 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msm58" event={"ID":"43b9152b-94ea-4be0-a892-d0d7bc5f299e","Type":"ContainerStarted","Data":"c34d11449648c919d4f57fd73a9fac0be0b0a9754f6a32391ff1a1e1418021ce"} Dec 04 22:29:15.925451 master-0 kubenswrapper[33572]: I1204 22:29:15.925424 33572 generic.go:334] "Generic (PLEG): container finished" podID="92452d91-986b-42fc-9778-2f78ad4482a9" containerID="1c3621c0f157d758a096a4e3e1d718d271920c7e6bd9c7429828a38582de0ff7" exitCode=0 Dec 04 22:29:15.925625 master-0 kubenswrapper[33572]: I1204 22:29:15.925481 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerDied","Data":"1c3621c0f157d758a096a4e3e1d718d271920c7e6bd9c7429828a38582de0ff7"} Dec 04 22:29:16.389441 master-0 kubenswrapper[33572]: I1204 22:29:16.389360 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" podStartSLOduration=2.66752321 podStartE2EDuration="7.389334585s" podCreationTimestamp="2025-12-04 22:29:09 +0000 UTC" firstStartedPulling="2025-12-04 22:29:10.325586075 +0000 UTC m=+614.053111724" lastFinishedPulling="2025-12-04 22:29:15.04739745 +0000 UTC m=+618.774923099" observedRunningTime="2025-12-04 22:29:16.378149075 +0000 UTC m=+620.105674724" watchObservedRunningTime="2025-12-04 22:29:16.389334585 +0000 UTC m=+620.116860244" Dec 04 22:29:16.414116 master-0 kubenswrapper[33572]: I1204 22:29:16.413156 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8rwmp" podStartSLOduration=2.72194215 podStartE2EDuration="7.413136785s" podCreationTimestamp="2025-12-04 22:29:09 +0000 UTC" firstStartedPulling="2025-12-04 22:29:10.359759513 +0000 UTC m=+614.087285162" lastFinishedPulling="2025-12-04 22:29:15.050954148 +0000 UTC m=+618.778479797" observedRunningTime="2025-12-04 22:29:16.40283727 +0000 UTC m=+620.130362939" watchObservedRunningTime="2025-12-04 22:29:16.413136785 +0000 UTC m=+620.140662434" Dec 04 22:29:16.443954 master-0 
kubenswrapper[33572]: I1204 22:29:16.443450 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-59s5q" podStartSLOduration=3.188124622 podStartE2EDuration="8.443423595s" podCreationTimestamp="2025-12-04 22:29:08 +0000 UTC" firstStartedPulling="2025-12-04 22:29:09.821883782 +0000 UTC m=+613.549409421" lastFinishedPulling="2025-12-04 22:29:15.077182735 +0000 UTC m=+618.804708394" observedRunningTime="2025-12-04 22:29:16.433226642 +0000 UTC m=+620.160752291" watchObservedRunningTime="2025-12-04 22:29:16.443423595 +0000 UTC m=+620.170949264" Dec 04 22:29:16.455571 master-0 kubenswrapper[33572]: I1204 22:29:16.454267 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mcmbn" podStartSLOduration=2.247624883 podStartE2EDuration="7.454246726s" podCreationTimestamp="2025-12-04 22:29:09 +0000 UTC" firstStartedPulling="2025-12-04 22:29:09.841156937 +0000 UTC m=+613.568682586" lastFinishedPulling="2025-12-04 22:29:15.04777874 +0000 UTC m=+618.775304429" observedRunningTime="2025-12-04 22:29:16.451248473 +0000 UTC m=+620.178774122" watchObservedRunningTime="2025-12-04 22:29:16.454246726 +0000 UTC m=+620.181772375" Dec 04 22:29:16.481910 master-0 kubenswrapper[33572]: I1204 22:29:16.481839 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" podStartSLOduration=3.856820772 podStartE2EDuration="10.481816021s" podCreationTimestamp="2025-12-04 22:29:06 +0000 UTC" firstStartedPulling="2025-12-04 22:29:08.472809799 +0000 UTC m=+612.200335458" lastFinishedPulling="2025-12-04 22:29:15.097805048 +0000 UTC m=+618.825330707" observedRunningTime="2025-12-04 22:29:16.471780492 +0000 UTC m=+620.199306141" watchObservedRunningTime="2025-12-04 22:29:16.481816021 +0000 UTC m=+620.209341670" Dec 04 22:29:16.516637 master-0 kubenswrapper[33572]: I1204 22:29:16.516544 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-twslb" podStartSLOduration=2.795993573 podStartE2EDuration="7.516522023s" podCreationTimestamp="2025-12-04 22:29:09 +0000 UTC" firstStartedPulling="2025-12-04 22:29:10.329131183 +0000 UTC m=+614.056656822" lastFinishedPulling="2025-12-04 22:29:15.049659583 +0000 UTC m=+618.777185272" observedRunningTime="2025-12-04 22:29:16.509698384 +0000 UTC m=+620.237224043" watchObservedRunningTime="2025-12-04 22:29:16.516522023 +0000 UTC m=+620.244047692" Dec 04 22:29:16.949014 master-0 kubenswrapper[33572]: I1204 22:29:16.945738 33572 generic.go:334] "Generic (PLEG): container finished" podID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerID="46334dfa5af5ca0b09b1add51f4321980abf06d5e01063ed5bf9ac4e1901fc8f" exitCode=0 Dec 04 22:29:16.949014 master-0 kubenswrapper[33572]: I1204 22:29:16.945821 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msm58" event={"ID":"43b9152b-94ea-4be0-a892-d0d7bc5f299e","Type":"ContainerDied","Data":"46334dfa5af5ca0b09b1add51f4321980abf06d5e01063ed5bf9ac4e1901fc8f"} Dec 04 22:29:16.949014 master-0 kubenswrapper[33572]: I1204 22:29:16.947390 33572 generic.go:334] "Generic (PLEG): container finished" podID="92452d91-986b-42fc-9778-2f78ad4482a9" containerID="6ce421e1017dc0921d814e331ca202be9a3bc97cdc7a4bf534e9db13d7d694fa" exitCode=0 Dec 04 22:29:16.949014 master-0 kubenswrapper[33572]: I1204 22:29:16.947414 33572 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerDied","Data":"6ce421e1017dc0921d814e331ca202be9a3bc97cdc7a4bf534e9db13d7d694fa"} Dec 04 22:29:16.949014 master-0 kubenswrapper[33572]: I1204 22:29:16.947944 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:17.967793 master-0 kubenswrapper[33572]: I1204 22:29:17.964280 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msm58" event={"ID":"43b9152b-94ea-4be0-a892-d0d7bc5f299e","Type":"ContainerStarted","Data":"1467409dcba690b9c1ca1d9206405b941288cbefbf84a9bd51cbc1884de3b4da"} Dec 04 22:29:17.969991 master-0 kubenswrapper[33572]: I1204 22:29:17.969935 33572 generic.go:334] "Generic (PLEG): container finished" podID="92452d91-986b-42fc-9778-2f78ad4482a9" containerID="4b83354fc406c3c4507e873ed67b03da07de8b7370dbb4e6e6474004f3775191" exitCode=0 Dec 04 22:29:17.970171 master-0 kubenswrapper[33572]: I1204 22:29:17.970101 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerDied","Data":"4b83354fc406c3c4507e873ed67b03da07de8b7370dbb4e6e6474004f3775191"} Dec 04 22:29:17.984369 master-0 kubenswrapper[33572]: I1204 22:29:17.984299 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-msm58" podStartSLOduration=4.5500138329999995 podStartE2EDuration="5.984240528s" podCreationTimestamp="2025-12-04 22:29:12 +0000 UTC" firstStartedPulling="2025-12-04 22:29:15.924214153 +0000 UTC m=+619.651739792" lastFinishedPulling="2025-12-04 22:29:17.358440838 +0000 UTC m=+621.085966487" observedRunningTime="2025-12-04 22:29:17.983158828 +0000 UTC m=+621.710684467" watchObservedRunningTime="2025-12-04 22:29:17.984240528 +0000 UTC m=+621.711766177" Dec 04 22:29:18.833617 master-0 kubenswrapper[33572]: I1204 22:29:18.833521 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:18.835674 master-0 kubenswrapper[33572]: I1204 22:29:18.835601 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:18.888098 master-0 kubenswrapper[33572]: I1204 22:29:18.888026 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:18.985952 master-0 kubenswrapper[33572]: I1204 22:29:18.985813 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerStarted","Data":"09024c0cf53bb39eb20094727e0a27b676a15fb86123b1ab8d72e5c7326e744f"} Dec 04 22:29:19.318321 master-0 kubenswrapper[33572]: I1204 22:29:19.318210 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-clzpp" Dec 04 22:29:19.997031 master-0 kubenswrapper[33572]: I1204 22:29:19.996970 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerStarted","Data":"476e7347e6e2880d9c069a2be98c333a92eb7e6890c5dfcabab30b63ff0a421e"} Dec 04 22:29:19.997031 master-0 kubenswrapper[33572]: I1204 22:29:19.997036 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerStarted","Data":"a71614d8596e2b458c47b26f594c187af155a0ed0c27b7e10f0fa891ae853053"} Dec 04 22:29:20.424701 master-0 kubenswrapper[33572]: I1204 22:29:20.424637 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:20.424701 master-0 kubenswrapper[33572]: I1204 22:29:20.424688 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:20.428858 master-0 kubenswrapper[33572]: I1204 22:29:20.428761 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:21.431398 master-0 kubenswrapper[33572]: I1204 22:29:21.431332 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerStarted","Data":"ab4ec12d4f1fa0e4293382da931ee3d3eae43e8fd87d87c16def70f3ce13c78d"} Dec 04 22:29:21.437037 master-0 kubenswrapper[33572]: I1204 22:29:21.436969 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-795b68ff6d-p7dxw" Dec 04 22:29:23.035946 master-0 kubenswrapper[33572]: I1204 22:29:23.035854 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:23.036620 master-0 kubenswrapper[33572]: I1204 22:29:23.035978 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:23.089353 master-0 kubenswrapper[33572]: I1204 22:29:23.089307 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:23.258879 master-0 kubenswrapper[33572]: I1204 22:29:23.258411 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64b5bcd658-ztwxm"] Dec 04 22:29:23.458451 master-0 kubenswrapper[33572]: I1204 22:29:23.458286 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerStarted","Data":"f0f8b8e79a8215e98b4878261c67ebe1a04d81db5f32f64fd2f11bfe7c49d567"} Dec 04 22:29:23.458451 master-0 kubenswrapper[33572]: I1204 22:29:23.458344 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mbggv" event={"ID":"92452d91-986b-42fc-9778-2f78ad4482a9","Type":"ContainerStarted","Data":"038db6bcc5b713b0bb1c051b693cfb4e36ea22635c8710e1b0acd7929c7a0568"} Dec 04 22:29:23.458767 master-0 kubenswrapper[33572]: I1204 22:29:23.458476 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:23.540876 master-0 kubenswrapper[33572]: I1204 22:29:23.540802 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:23.761324 master-0 kubenswrapper[33572]: I1204 22:29:23.761081 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mbggv" podStartSLOduration=10.262120426 podStartE2EDuration="17.761055118s" podCreationTimestamp="2025-12-04 22:29:06 +0000 UTC" firstStartedPulling="2025-12-04 22:29:07.597637201 +0000 UTC m=+611.325162870" lastFinishedPulling="2025-12-04 22:29:15.096571863 
+0000 UTC m=+618.824097562" observedRunningTime="2025-12-04 22:29:23.752273704 +0000 UTC m=+627.479799403" watchObservedRunningTime="2025-12-04 22:29:23.761055118 +0000 UTC m=+627.488580787" Dec 04 22:29:24.275945 master-0 kubenswrapper[33572]: I1204 22:29:24.275861 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-msm58"] Dec 04 22:29:24.846737 master-0 kubenswrapper[33572]: I1204 22:29:24.846617 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mcmbn" Dec 04 22:29:25.481354 master-0 kubenswrapper[33572]: I1204 22:29:25.481226 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-msm58" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerName="registry-server" containerID="cri-o://1467409dcba690b9c1ca1d9206405b941288cbefbf84a9bd51cbc1884de3b4da" gracePeriod=2 Dec 04 22:29:27.474711 master-0 kubenswrapper[33572]: I1204 22:29:27.474602 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:27.545821 master-0 kubenswrapper[33572]: I1204 22:29:27.545723 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:27.650453 master-0 kubenswrapper[33572]: I1204 22:29:27.650374 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-v5nvt" Dec 04 22:29:28.033631 master-0 kubenswrapper[33572]: I1204 22:29:28.033542 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-27xx2" Dec 04 22:29:28.516444 master-0 kubenswrapper[33572]: I1204 22:29:28.516329 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-msm58_43b9152b-94ea-4be0-a892-d0d7bc5f299e/registry-server/0.log" Dec 04 22:29:28.518222 master-0 kubenswrapper[33572]: I1204 22:29:28.518153 33572 generic.go:334] "Generic (PLEG): container finished" podID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerID="1467409dcba690b9c1ca1d9206405b941288cbefbf84a9bd51cbc1884de3b4da" exitCode=137 Dec 04 22:29:28.518360 master-0 kubenswrapper[33572]: I1204 22:29:28.518245 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msm58" event={"ID":"43b9152b-94ea-4be0-a892-d0d7bc5f299e","Type":"ContainerDied","Data":"1467409dcba690b9c1ca1d9206405b941288cbefbf84a9bd51cbc1884de3b4da"} Dec 04 22:29:28.912912 master-0 kubenswrapper[33572]: I1204 22:29:28.912744 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:29.767852 master-0 kubenswrapper[33572]: I1204 22:29:29.767773 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-265zs" Dec 04 22:29:29.799577 master-0 kubenswrapper[33572]: I1204 22:29:29.799466 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-msm58_43b9152b-94ea-4be0-a892-d0d7bc5f299e/registry-server/0.log" Dec 04 22:29:29.800593 master-0 kubenswrapper[33572]: I1204 22:29:29.800548 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:29.987948 master-0 kubenswrapper[33572]: I1204 22:29:29.987817 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-utilities\") pod \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " Dec 04 22:29:29.987948 master-0 kubenswrapper[33572]: I1204 22:29:29.987915 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-catalog-content\") pod \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " Dec 04 22:29:29.987948 master-0 kubenswrapper[33572]: I1204 22:29:29.987945 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltwcl\" (UniqueName: \"kubernetes.io/projected/43b9152b-94ea-4be0-a892-d0d7bc5f299e-kube-api-access-ltwcl\") pod \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\" (UID: \"43b9152b-94ea-4be0-a892-d0d7bc5f299e\") " Dec 04 22:29:29.989341 master-0 kubenswrapper[33572]: I1204 22:29:29.989290 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-utilities" (OuterVolumeSpecName: "utilities") pod "43b9152b-94ea-4be0-a892-d0d7bc5f299e" (UID: "43b9152b-94ea-4be0-a892-d0d7bc5f299e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:29:29.992354 master-0 kubenswrapper[33572]: I1204 22:29:29.991680 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b9152b-94ea-4be0-a892-d0d7bc5f299e-kube-api-access-ltwcl" (OuterVolumeSpecName: "kube-api-access-ltwcl") pod "43b9152b-94ea-4be0-a892-d0d7bc5f299e" (UID: "43b9152b-94ea-4be0-a892-d0d7bc5f299e"). InnerVolumeSpecName "kube-api-access-ltwcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:29:30.017473 master-0 kubenswrapper[33572]: I1204 22:29:30.017432 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43b9152b-94ea-4be0-a892-d0d7bc5f299e" (UID: "43b9152b-94ea-4be0-a892-d0d7bc5f299e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:29:30.089931 master-0 kubenswrapper[33572]: I1204 22:29:30.089659 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:30.089931 master-0 kubenswrapper[33572]: I1204 22:29:30.089703 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43b9152b-94ea-4be0-a892-d0d7bc5f299e-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:30.089931 master-0 kubenswrapper[33572]: I1204 22:29:30.089715 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltwcl\" (UniqueName: \"kubernetes.io/projected/43b9152b-94ea-4be0-a892-d0d7bc5f299e-kube-api-access-ltwcl\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:30.536596 master-0 kubenswrapper[33572]: I1204 22:29:30.536517 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-msm58_43b9152b-94ea-4be0-a892-d0d7bc5f299e/registry-server/0.log" Dec 04 22:29:30.537122 master-0 kubenswrapper[33572]: I1204 22:29:30.537080 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-msm58" event={"ID":"43b9152b-94ea-4be0-a892-d0d7bc5f299e","Type":"ContainerDied","Data":"c34d11449648c919d4f57fd73a9fac0be0b0a9754f6a32391ff1a1e1418021ce"} Dec 04 22:29:30.537122 master-0 kubenswrapper[33572]: I1204 22:29:30.537126 33572 scope.go:117] "RemoveContainer" containerID="1467409dcba690b9c1ca1d9206405b941288cbefbf84a9bd51cbc1884de3b4da" Dec 04 22:29:30.537387 master-0 kubenswrapper[33572]: I1204 22:29:30.537135 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-msm58" Dec 04 22:29:30.572445 master-0 kubenswrapper[33572]: I1204 22:29:30.568979 33572 scope.go:117] "RemoveContainer" containerID="46334dfa5af5ca0b09b1add51f4321980abf06d5e01063ed5bf9ac4e1901fc8f" Dec 04 22:29:30.577995 master-0 kubenswrapper[33572]: I1204 22:29:30.577946 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-msm58"] Dec 04 22:29:30.585487 master-0 kubenswrapper[33572]: I1204 22:29:30.585419 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-msm58"] Dec 04 22:29:30.592652 master-0 kubenswrapper[33572]: I1204 22:29:30.592608 33572 scope.go:117] "RemoveContainer" containerID="b6ecac440dd0cbf4d787a7ad369053fc03f940003202de4b08de26c281b0098e" Dec 04 22:29:31.128622 master-0 kubenswrapper[33572]: I1204 22:29:31.128489 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59s5q"] Dec 04 22:29:31.129927 master-0 kubenswrapper[33572]: I1204 22:29:31.129430 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-59s5q" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerName="registry-server" containerID="cri-o://272f41cd51619be85d71ef666cf16c84cd73ede27da9011d69f041d21915ac3c" gracePeriod=2 Dec 04 22:29:31.562153 master-0 kubenswrapper[33572]: I1204 22:29:31.561996 33572 generic.go:334] "Generic (PLEG): container finished" podID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerID="272f41cd51619be85d71ef666cf16c84cd73ede27da9011d69f041d21915ac3c" exitCode=0 Dec 04 22:29:31.562153 master-0 kubenswrapper[33572]: I1204 22:29:31.562050 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59s5q" event={"ID":"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610","Type":"ContainerDied","Data":"272f41cd51619be85d71ef666cf16c84cd73ede27da9011d69f041d21915ac3c"} Dec 04 22:29:31.562153 master-0 kubenswrapper[33572]: I1204 22:29:31.562090 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59s5q" event={"ID":"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610","Type":"ContainerDied","Data":"7877a22b1ebf679b42f1a5f3610580ff7029831eabe8712e80d8e973ec2b0137"} Dec 04 22:29:31.562153 master-0 kubenswrapper[33572]: I1204 22:29:31.562104 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7877a22b1ebf679b42f1a5f3610580ff7029831eabe8712e80d8e973ec2b0137" Dec 04 22:29:31.566320 master-0 kubenswrapper[33572]: I1204 22:29:31.565862 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:31.715890 master-0 kubenswrapper[33572]: I1204 22:29:31.715822 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-catalog-content\") pod \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " Dec 04 22:29:31.715890 master-0 kubenswrapper[33572]: I1204 22:29:31.715899 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw5nd\" (UniqueName: \"kubernetes.io/projected/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-kube-api-access-fw5nd\") pod \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " Dec 04 22:29:31.716202 master-0 kubenswrapper[33572]: I1204 22:29:31.715924 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-utilities\") pod \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\" (UID: \"0ca2ec70-b3b4-4ee5-8ad1-dc115e221610\") " Dec 04 22:29:31.717652 master-0 kubenswrapper[33572]: I1204 22:29:31.717618 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-utilities" (OuterVolumeSpecName: "utilities") pod "0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" (UID: "0ca2ec70-b3b4-4ee5-8ad1-dc115e221610"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:29:31.718977 master-0 kubenswrapper[33572]: I1204 22:29:31.718959 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-kube-api-access-fw5nd" (OuterVolumeSpecName: "kube-api-access-fw5nd") pod "0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" (UID: "0ca2ec70-b3b4-4ee5-8ad1-dc115e221610"). InnerVolumeSpecName "kube-api-access-fw5nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:29:31.772933 master-0 kubenswrapper[33572]: I1204 22:29:31.772815 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" (UID: "0ca2ec70-b3b4-4ee5-8ad1-dc115e221610"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:29:31.818875 master-0 kubenswrapper[33572]: I1204 22:29:31.818821 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:31.818875 master-0 kubenswrapper[33572]: I1204 22:29:31.818872 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw5nd\" (UniqueName: \"kubernetes.io/projected/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-kube-api-access-fw5nd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:31.818875 master-0 kubenswrapper[33572]: I1204 22:29:31.818888 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:32.536998 master-0 kubenswrapper[33572]: I1204 22:29:32.536927 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" path="/var/lib/kubelet/pods/43b9152b-94ea-4be0-a892-d0d7bc5f299e/volumes" Dec 04 22:29:32.572352 master-0 kubenswrapper[33572]: I1204 22:29:32.572274 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59s5q" Dec 04 22:29:32.631337 master-0 kubenswrapper[33572]: I1204 22:29:32.631256 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59s5q"] Dec 04 22:29:32.850948 master-0 kubenswrapper[33572]: I1204 22:29:32.850813 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-59s5q"] Dec 04 22:29:34.547087 master-0 kubenswrapper[33572]: I1204 22:29:34.546994 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" path="/var/lib/kubelet/pods/0ca2ec70-b3b4-4ee5-8ad1-dc115e221610/volumes" Dec 04 22:29:35.355358 master-0 kubenswrapper[33572]: I1204 22:29:35.355260 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-7m9pd"] Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: E1204 22:29:35.355612 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerName="extract-content" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: I1204 22:29:35.355626 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerName="extract-content" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: E1204 22:29:35.355640 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerName="extract-content" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: I1204 22:29:35.355647 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerName="extract-content" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: E1204 22:29:35.355661 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerName="registry-server" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: I1204 22:29:35.355667 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerName="registry-server" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: E1204 
22:29:35.355681 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerName="extract-utilities" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: I1204 22:29:35.355687 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerName="extract-utilities" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: E1204 22:29:35.355700 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerName="extract-utilities" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: I1204 22:29:35.355706 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerName="extract-utilities" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: E1204 22:29:35.355727 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerName="registry-server" Dec 04 22:29:35.355821 master-0 kubenswrapper[33572]: I1204 22:29:35.355733 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerName="registry-server" Dec 04 22:29:35.356664 master-0 kubenswrapper[33572]: I1204 22:29:35.355894 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b9152b-94ea-4be0-a892-d0d7bc5f299e" containerName="registry-server" Dec 04 22:29:35.356664 master-0 kubenswrapper[33572]: I1204 22:29:35.355931 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca2ec70-b3b4-4ee5-8ad1-dc115e221610" containerName="registry-server" Dec 04 22:29:35.356664 master-0 kubenswrapper[33572]: I1204 22:29:35.356427 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.358942 master-0 kubenswrapper[33572]: I1204 22:29:35.358876 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Dec 04 22:29:35.430366 master-0 kubenswrapper[33572]: I1204 22:29:35.430202 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-pod-volumes-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.430627 master-0 kubenswrapper[33572]: I1204 22:29:35.430380 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-file-lock-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.430627 master-0 kubenswrapper[33572]: I1204 22:29:35.430423 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-registration-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.430627 master-0 kubenswrapper[33572]: I1204 22:29:35.430540 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkqdz\" (UniqueName: \"kubernetes.io/projected/7ad816db-4466-4e46-bc8d-34ee253f2fe8-kube-api-access-rkqdz\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.430627 master-0 kubenswrapper[33572]: I1204 22:29:35.430603 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ad816db-4466-4e46-bc8d-34ee253f2fe8-metrics-cert\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.430770 master-0 kubenswrapper[33572]: I1204 22:29:35.430707 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-run-udev\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.430804 master-0 kubenswrapper[33572]: I1204 22:29:35.430788 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-lvmd-config\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.430838 master-0 kubenswrapper[33572]: I1204 22:29:35.430821 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-device-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.431029 master-0 kubenswrapper[33572]: I1204 
22:29:35.430998 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-sys\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.431077 master-0 kubenswrapper[33572]: I1204 22:29:35.431045 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-node-plugin-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.431176 master-0 kubenswrapper[33572]: I1204 22:29:35.431144 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-csi-plugin-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.448812 master-0 kubenswrapper[33572]: I1204 22:29:35.448732 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-7m9pd"] Dec 04 22:29:35.533205 master-0 kubenswrapper[33572]: I1204 22:29:35.533108 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-sys\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533205 master-0 kubenswrapper[33572]: I1204 22:29:35.533193 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-node-plugin-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533229 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-csi-plugin-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533258 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-pod-volumes-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533279 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-file-lock-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533300 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-registration-dir\") pod \"vg-manager-7m9pd\" (UID: 
\"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533322 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkqdz\" (UniqueName: \"kubernetes.io/projected/7ad816db-4466-4e46-bc8d-34ee253f2fe8-kube-api-access-rkqdz\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533346 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ad816db-4466-4e46-bc8d-34ee253f2fe8-metrics-cert\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533374 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-run-udev\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533398 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-lvmd-config\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533418 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-device-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.533596 master-0 kubenswrapper[33572]: I1204 22:29:35.533561 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-device-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.534133 master-0 kubenswrapper[33572]: I1204 22:29:35.533632 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-registration-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.534133 master-0 kubenswrapper[33572]: I1204 22:29:35.534013 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-file-lock-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.534207 master-0 kubenswrapper[33572]: I1204 22:29:35.534155 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-sys\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.534427 master-0 kubenswrapper[33572]: I1204 
22:29:35.534387 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-node-plugin-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.534488 master-0 kubenswrapper[33572]: I1204 22:29:35.534409 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-run-udev\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.534574 master-0 kubenswrapper[33572]: I1204 22:29:35.534490 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-pod-volumes-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.534574 master-0 kubenswrapper[33572]: I1204 22:29:35.534550 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-lvmd-config\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.534636 master-0 kubenswrapper[33572]: I1204 22:29:35.534559 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/7ad816db-4466-4e46-bc8d-34ee253f2fe8-csi-plugin-dir\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.537197 master-0 kubenswrapper[33572]: I1204 22:29:35.537162 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/7ad816db-4466-4e46-bc8d-34ee253f2fe8-metrics-cert\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.551262 master-0 kubenswrapper[33572]: I1204 22:29:35.551199 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkqdz\" (UniqueName: \"kubernetes.io/projected/7ad816db-4466-4e46-bc8d-34ee253f2fe8-kube-api-access-rkqdz\") pod \"vg-manager-7m9pd\" (UID: \"7ad816db-4466-4e46-bc8d-34ee253f2fe8\") " pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:35.699195 master-0 kubenswrapper[33572]: I1204 22:29:35.699074 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:36.204716 master-0 kubenswrapper[33572]: W1204 22:29:36.204649 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad816db_4466_4e46_bc8d_34ee253f2fe8.slice/crio-eb3866e69d403ff6639cd49531204821d0b16c076b76eec5823489edc600beaa WatchSource:0}: Error finding container eb3866e69d403ff6639cd49531204821d0b16c076b76eec5823489edc600beaa: Status 404 returned error can't find the container with id eb3866e69d403ff6639cd49531204821d0b16c076b76eec5823489edc600beaa Dec 04 22:29:36.208646 master-0 kubenswrapper[33572]: I1204 22:29:36.208572 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-7m9pd"] Dec 04 22:29:36.615044 master-0 kubenswrapper[33572]: I1204 22:29:36.614958 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-7m9pd" event={"ID":"7ad816db-4466-4e46-bc8d-34ee253f2fe8","Type":"ContainerStarted","Data":"f5630eb77892f4acf64fb11ce2cab80b98f5070d86efe8450c4b3f975c1d8afe"} Dec 04 22:29:36.615044 master-0 kubenswrapper[33572]: I1204 22:29:36.615008 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-7m9pd" event={"ID":"7ad816db-4466-4e46-bc8d-34ee253f2fe8","Type":"ContainerStarted","Data":"eb3866e69d403ff6639cd49531204821d0b16c076b76eec5823489edc600beaa"} Dec 04 22:29:36.650326 master-0 kubenswrapper[33572]: I1204 22:29:36.649829 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-7m9pd" podStartSLOduration=1.649805483 podStartE2EDuration="1.649805483s" podCreationTimestamp="2025-12-04 22:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:29:36.646865901 +0000 UTC m=+640.374391550" watchObservedRunningTime="2025-12-04 22:29:36.649805483 +0000 UTC m=+640.377331142" Dec 04 22:29:37.479547 master-0 kubenswrapper[33572]: I1204 22:29:37.479459 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mbggv" Dec 04 22:29:38.636413 master-0 kubenswrapper[33572]: I1204 22:29:38.636234 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-7m9pd_7ad816db-4466-4e46-bc8d-34ee253f2fe8/vg-manager/0.log" Dec 04 22:29:38.636413 master-0 kubenswrapper[33572]: I1204 22:29:38.636332 33572 generic.go:334] "Generic (PLEG): container finished" podID="7ad816db-4466-4e46-bc8d-34ee253f2fe8" containerID="f5630eb77892f4acf64fb11ce2cab80b98f5070d86efe8450c4b3f975c1d8afe" exitCode=1 Dec 04 22:29:38.637326 master-0 kubenswrapper[33572]: I1204 22:29:38.636811 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-7m9pd" event={"ID":"7ad816db-4466-4e46-bc8d-34ee253f2fe8","Type":"ContainerDied","Data":"f5630eb77892f4acf64fb11ce2cab80b98f5070d86efe8450c4b3f975c1d8afe"} Dec 04 22:29:38.637411 master-0 kubenswrapper[33572]: I1204 22:29:38.637358 33572 scope.go:117] "RemoveContainer" containerID="f5630eb77892f4acf64fb11ce2cab80b98f5070d86efe8450c4b3f975c1d8afe" Dec 04 22:29:38.990412 master-0 kubenswrapper[33572]: I1204 22:29:38.990303 33572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Dec 04 22:29:39.668843 master-0 kubenswrapper[33572]: I1204 22:29:39.668752 33572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-7m9pd_7ad816db-4466-4e46-bc8d-34ee253f2fe8/vg-manager/0.log" Dec 04 22:29:39.671828 master-0 kubenswrapper[33572]: I1204 22:29:39.668861 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-7m9pd" event={"ID":"7ad816db-4466-4e46-bc8d-34ee253f2fe8","Type":"ContainerStarted","Data":"5230b770b2da2a2de3a98e7473bfe74878a78e378c7304dc1e488eb449522e89"} Dec 04 22:29:39.877726 master-0 kubenswrapper[33572]: I1204 22:29:39.877552 33572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2025-12-04T22:29:38.990627868Z","Handler":null,"Name":""} Dec 04 22:29:39.880204 master-0 kubenswrapper[33572]: I1204 22:29:39.880150 33572 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Dec 04 22:29:39.880204 master-0 kubenswrapper[33572]: I1204 22:29:39.880206 33572 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Dec 04 22:29:45.702855 master-0 kubenswrapper[33572]: I1204 22:29:45.702715 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:45.704849 master-0 kubenswrapper[33572]: I1204 22:29:45.704812 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:45.743576 master-0 kubenswrapper[33572]: I1204 22:29:45.743493 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:45.756382 master-0 kubenswrapper[33572]: I1204 22:29:45.756321 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-7m9pd" Dec 04 22:29:48.116119 master-0 kubenswrapper[33572]: I1204 22:29:48.116041 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mlm9f"] Dec 04 22:29:48.117422 master-0 kubenswrapper[33572]: I1204 22:29:48.117387 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mlm9f" Dec 04 22:29:48.119332 master-0 kubenswrapper[33572]: I1204 22:29:48.119295 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 04 22:29:48.119618 master-0 kubenswrapper[33572]: I1204 22:29:48.119582 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 04 22:29:48.141520 master-0 kubenswrapper[33572]: I1204 22:29:48.141425 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mlm9f"] Dec 04 22:29:48.224232 master-0 kubenswrapper[33572]: I1204 22:29:48.224151 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27ng\" (UniqueName: \"kubernetes.io/projected/a7ff0244-14a7-41f4-b881-5de6342e3dfa-kube-api-access-c27ng\") pod \"openstack-operator-index-mlm9f\" (UID: \"a7ff0244-14a7-41f4-b881-5de6342e3dfa\") " pod="openstack-operators/openstack-operator-index-mlm9f" Dec 04 22:29:48.311865 master-0 kubenswrapper[33572]: I1204 22:29:48.311800 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-64b5bcd658-ztwxm" podUID="da9b6cb4-13db-408a-8a53-c03f64ccea5a" containerName="console" containerID="cri-o://4c5f8fe348cfd361d4cff444c1a27d696284b954e7c06243153cba9f36ad5161" gracePeriod=15 Dec 04 22:29:48.331335 master-0 kubenswrapper[33572]: I1204 22:29:48.331246 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c27ng\" (UniqueName: \"kubernetes.io/projected/a7ff0244-14a7-41f4-b881-5de6342e3dfa-kube-api-access-c27ng\") pod \"openstack-operator-index-mlm9f\" (UID: \"a7ff0244-14a7-41f4-b881-5de6342e3dfa\") " pod="openstack-operators/openstack-operator-index-mlm9f" Dec 04 22:29:48.348347 master-0 kubenswrapper[33572]: I1204 22:29:48.348281 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27ng\" (UniqueName: \"kubernetes.io/projected/a7ff0244-14a7-41f4-b881-5de6342e3dfa-kube-api-access-c27ng\") pod \"openstack-operator-index-mlm9f\" (UID: \"a7ff0244-14a7-41f4-b881-5de6342e3dfa\") " pod="openstack-operators/openstack-operator-index-mlm9f" Dec 04 22:29:48.447402 master-0 kubenswrapper[33572]: I1204 22:29:48.447331 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mlm9f" Dec 04 22:29:48.804202 master-0 kubenswrapper[33572]: I1204 22:29:48.804154 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b5bcd658-ztwxm_da9b6cb4-13db-408a-8a53-c03f64ccea5a/console/0.log" Dec 04 22:29:48.804309 master-0 kubenswrapper[33572]: I1204 22:29:48.804212 33572 generic.go:334] "Generic (PLEG): container finished" podID="da9b6cb4-13db-408a-8a53-c03f64ccea5a" containerID="4c5f8fe348cfd361d4cff444c1a27d696284b954e7c06243153cba9f36ad5161" exitCode=2 Dec 04 22:29:48.804309 master-0 kubenswrapper[33572]: I1204 22:29:48.804245 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b5bcd658-ztwxm" event={"ID":"da9b6cb4-13db-408a-8a53-c03f64ccea5a","Type":"ContainerDied","Data":"4c5f8fe348cfd361d4cff444c1a27d696284b954e7c06243153cba9f36ad5161"} Dec 04 22:29:48.804309 master-0 kubenswrapper[33572]: I1204 22:29:48.804272 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b5bcd658-ztwxm" event={"ID":"da9b6cb4-13db-408a-8a53-c03f64ccea5a","Type":"ContainerDied","Data":"6084e4ae3f1babbf58d4e534213427d4af231f66bbf8a2790288527947b2adda"} Dec 04 22:29:48.804309 master-0 kubenswrapper[33572]: I1204 22:29:48.804283 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6084e4ae3f1babbf58d4e534213427d4af231f66bbf8a2790288527947b2adda" Dec 04 22:29:48.806007 master-0 kubenswrapper[33572]: I1204 22:29:48.805988 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b5bcd658-ztwxm_da9b6cb4-13db-408a-8a53-c03f64ccea5a/console/0.log" Dec 04 22:29:48.806072 master-0 kubenswrapper[33572]: I1204 22:29:48.806045 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:29:48.946188 master-0 kubenswrapper[33572]: I1204 22:29:48.946113 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7cg2\" (UniqueName: \"kubernetes.io/projected/da9b6cb4-13db-408a-8a53-c03f64ccea5a-kube-api-access-t7cg2\") pod \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " Dec 04 22:29:48.946188 master-0 kubenswrapper[33572]: I1204 22:29:48.946177 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-config\") pod \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " Dec 04 22:29:48.946473 master-0 kubenswrapper[33572]: I1204 22:29:48.946282 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-trusted-ca-bundle\") pod \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " Dec 04 22:29:48.946967 master-0 kubenswrapper[33572]: I1204 22:29:48.946763 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-config" (OuterVolumeSpecName: "console-config") pod "da9b6cb4-13db-408a-8a53-c03f64ccea5a" (UID: "da9b6cb4-13db-408a-8a53-c03f64ccea5a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:29:48.946967 master-0 kubenswrapper[33572]: I1204 22:29:48.946832 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "da9b6cb4-13db-408a-8a53-c03f64ccea5a" (UID: "da9b6cb4-13db-408a-8a53-c03f64ccea5a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:29:48.947169 master-0 kubenswrapper[33572]: I1204 22:29:48.947139 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-oauth-serving-cert\") pod \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " Dec 04 22:29:48.947280 master-0 kubenswrapper[33572]: I1204 22:29:48.947255 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-service-ca\") pod \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " Dec 04 22:29:48.947594 master-0 kubenswrapper[33572]: I1204 22:29:48.947568 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "da9b6cb4-13db-408a-8a53-c03f64ccea5a" (UID: "da9b6cb4-13db-408a-8a53-c03f64ccea5a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:29:48.947926 master-0 kubenswrapper[33572]: I1204 22:29:48.947881 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-service-ca" (OuterVolumeSpecName: "service-ca") pod "da9b6cb4-13db-408a-8a53-c03f64ccea5a" (UID: "da9b6cb4-13db-408a-8a53-c03f64ccea5a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:29:48.947985 master-0 kubenswrapper[33572]: I1204 22:29:48.947968 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-oauth-config\") pod \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " Dec 04 22:29:48.948586 master-0 kubenswrapper[33572]: I1204 22:29:48.948522 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-serving-cert\") pod \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\" (UID: \"da9b6cb4-13db-408a-8a53-c03f64ccea5a\") " Dec 04 22:29:48.949220 master-0 kubenswrapper[33572]: I1204 22:29:48.949176 33572 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:48.949220 master-0 kubenswrapper[33572]: I1204 22:29:48.949213 33572 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:48.949320 master-0 kubenswrapper[33572]: I1204 22:29:48.949231 33572 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:48.949320 master-0 kubenswrapper[33572]: I1204 22:29:48.949251 33572 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da9b6cb4-13db-408a-8a53-c03f64ccea5a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:48.950617 master-0 kubenswrapper[33572]: I1204 22:29:48.950568 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "da9b6cb4-13db-408a-8a53-c03f64ccea5a" (UID: "da9b6cb4-13db-408a-8a53-c03f64ccea5a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:29:48.952683 master-0 kubenswrapper[33572]: I1204 22:29:48.952611 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9b6cb4-13db-408a-8a53-c03f64ccea5a-kube-api-access-t7cg2" (OuterVolumeSpecName: "kube-api-access-t7cg2") pod "da9b6cb4-13db-408a-8a53-c03f64ccea5a" (UID: "da9b6cb4-13db-408a-8a53-c03f64ccea5a"). InnerVolumeSpecName "kube-api-access-t7cg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:29:48.955780 master-0 kubenswrapper[33572]: I1204 22:29:48.955731 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "da9b6cb4-13db-408a-8a53-c03f64ccea5a" (UID: "da9b6cb4-13db-408a-8a53-c03f64ccea5a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:29:49.051236 master-0 kubenswrapper[33572]: I1204 22:29:49.050844 33572 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:49.051236 master-0 kubenswrapper[33572]: I1204 22:29:49.050917 33572 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/da9b6cb4-13db-408a-8a53-c03f64ccea5a-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:49.051236 master-0 kubenswrapper[33572]: I1204 22:29:49.050937 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7cg2\" (UniqueName: \"kubernetes.io/projected/da9b6cb4-13db-408a-8a53-c03f64ccea5a-kube-api-access-t7cg2\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:49.063622 master-0 kubenswrapper[33572]: I1204 22:29:49.063544 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mlm9f"] Dec 04 22:29:49.815016 master-0 kubenswrapper[33572]: I1204 22:29:49.814974 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b5bcd658-ztwxm" Dec 04 22:29:49.815639 master-0 kubenswrapper[33572]: I1204 22:29:49.814960 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mlm9f" event={"ID":"a7ff0244-14a7-41f4-b881-5de6342e3dfa","Type":"ContainerStarted","Data":"e351bfe9b6ab48ecab3e230a94ca461a761017491b2a866a891dd9f75ae14b51"} Dec 04 22:29:49.864740 master-0 kubenswrapper[33572]: I1204 22:29:49.864649 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64b5bcd658-ztwxm"] Dec 04 22:29:49.874099 master-0 kubenswrapper[33572]: I1204 22:29:49.874045 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64b5bcd658-ztwxm"] Dec 04 22:29:50.544623 master-0 kubenswrapper[33572]: I1204 22:29:50.544485 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9b6cb4-13db-408a-8a53-c03f64ccea5a" path="/var/lib/kubelet/pods/da9b6cb4-13db-408a-8a53-c03f64ccea5a/volumes" Dec 04 22:29:50.830130 master-0 kubenswrapper[33572]: I1204 22:29:50.829943 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mlm9f" event={"ID":"a7ff0244-14a7-41f4-b881-5de6342e3dfa","Type":"ContainerStarted","Data":"8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14"} Dec 04 22:29:50.873294 master-0 kubenswrapper[33572]: I1204 22:29:50.873093 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mlm9f" podStartSLOduration=1.948196563 podStartE2EDuration="2.873055489s" podCreationTimestamp="2025-12-04 22:29:48 +0000 UTC" firstStartedPulling="2025-12-04 22:29:49.068814789 +0000 UTC m=+652.796340478" lastFinishedPulling="2025-12-04 22:29:49.993673745 +0000 UTC m=+653.721199404" observedRunningTime="2025-12-04 22:29:50.853686161 +0000 UTC m=+654.581211880" watchObservedRunningTime="2025-12-04 22:29:50.873055489 +0000 UTC m=+654.600581168" Dec 04 22:29:52.283046 master-0 kubenswrapper[33572]: I1204 22:29:52.282905 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mlm9f"] Dec 04 22:29:52.871260 master-0 kubenswrapper[33572]: I1204 
22:29:52.871052 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mlm9f" podUID="a7ff0244-14a7-41f4-b881-5de6342e3dfa" containerName="registry-server" containerID="cri-o://8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14" gracePeriod=2 Dec 04 22:29:52.899498 master-0 kubenswrapper[33572]: I1204 22:29:52.899369 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zbrtw"] Dec 04 22:29:52.900138 master-0 kubenswrapper[33572]: E1204 22:29:52.900052 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9b6cb4-13db-408a-8a53-c03f64ccea5a" containerName="console" Dec 04 22:29:52.900138 master-0 kubenswrapper[33572]: I1204 22:29:52.900085 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9b6cb4-13db-408a-8a53-c03f64ccea5a" containerName="console" Dec 04 22:29:52.900604 master-0 kubenswrapper[33572]: I1204 22:29:52.900484 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9b6cb4-13db-408a-8a53-c03f64ccea5a" containerName="console" Dec 04 22:29:52.901560 master-0 kubenswrapper[33572]: I1204 22:29:52.901466 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:29:52.905877 master-0 kubenswrapper[33572]: I1204 22:29:52.905769 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zbrtw"] Dec 04 22:29:53.028575 master-0 kubenswrapper[33572]: I1204 22:29:53.028486 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxjx9\" (UniqueName: \"kubernetes.io/projected/a08580e2-9965-4142-91cf-0c09d82a50b7-kube-api-access-pxjx9\") pod \"openstack-operator-index-zbrtw\" (UID: \"a08580e2-9965-4142-91cf-0c09d82a50b7\") " pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:29:53.131406 master-0 kubenswrapper[33572]: I1204 22:29:53.131204 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxjx9\" (UniqueName: \"kubernetes.io/projected/a08580e2-9965-4142-91cf-0c09d82a50b7-kube-api-access-pxjx9\") pod \"openstack-operator-index-zbrtw\" (UID: \"a08580e2-9965-4142-91cf-0c09d82a50b7\") " pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:29:53.156056 master-0 kubenswrapper[33572]: I1204 22:29:53.155986 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxjx9\" (UniqueName: \"kubernetes.io/projected/a08580e2-9965-4142-91cf-0c09d82a50b7-kube-api-access-pxjx9\") pod \"openstack-operator-index-zbrtw\" (UID: \"a08580e2-9965-4142-91cf-0c09d82a50b7\") " pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:29:53.353256 master-0 kubenswrapper[33572]: I1204 22:29:53.353190 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:29:53.464194 master-0 kubenswrapper[33572]: I1204 22:29:53.464078 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mlm9f" Dec 04 22:29:53.640562 master-0 kubenswrapper[33572]: I1204 22:29:53.640476 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c27ng\" (UniqueName: \"kubernetes.io/projected/a7ff0244-14a7-41f4-b881-5de6342e3dfa-kube-api-access-c27ng\") pod \"a7ff0244-14a7-41f4-b881-5de6342e3dfa\" (UID: \"a7ff0244-14a7-41f4-b881-5de6342e3dfa\") " Dec 04 22:29:53.644204 master-0 kubenswrapper[33572]: I1204 22:29:53.644152 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ff0244-14a7-41f4-b881-5de6342e3dfa-kube-api-access-c27ng" (OuterVolumeSpecName: "kube-api-access-c27ng") pod "a7ff0244-14a7-41f4-b881-5de6342e3dfa" (UID: "a7ff0244-14a7-41f4-b881-5de6342e3dfa"). InnerVolumeSpecName "kube-api-access-c27ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:29:53.744163 master-0 kubenswrapper[33572]: I1204 22:29:53.744025 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c27ng\" (UniqueName: \"kubernetes.io/projected/a7ff0244-14a7-41f4-b881-5de6342e3dfa-kube-api-access-c27ng\") on node \"master-0\" DevicePath \"\"" Dec 04 22:29:53.855941 master-0 kubenswrapper[33572]: I1204 22:29:53.855885 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zbrtw"] Dec 04 22:29:53.862267 master-0 kubenswrapper[33572]: W1204 22:29:53.862208 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08580e2_9965_4142_91cf_0c09d82a50b7.slice/crio-41981d0851447c3e7b2cda4af5aa0752414ddf78c2b1d18156abe6da55f522cd WatchSource:0}: Error finding container 41981d0851447c3e7b2cda4af5aa0752414ddf78c2b1d18156abe6da55f522cd: Status 404 returned error can't find the container with id 41981d0851447c3e7b2cda4af5aa0752414ddf78c2b1d18156abe6da55f522cd Dec 04 22:29:53.880442 master-0 kubenswrapper[33572]: I1204 22:29:53.880396 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zbrtw" event={"ID":"a08580e2-9965-4142-91cf-0c09d82a50b7","Type":"ContainerStarted","Data":"41981d0851447c3e7b2cda4af5aa0752414ddf78c2b1d18156abe6da55f522cd"} Dec 04 22:29:53.882664 master-0 kubenswrapper[33572]: I1204 22:29:53.881946 33572 generic.go:334] "Generic (PLEG): container finished" podID="a7ff0244-14a7-41f4-b881-5de6342e3dfa" containerID="8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14" exitCode=0 Dec 04 22:29:53.882664 master-0 kubenswrapper[33572]: I1204 22:29:53.881986 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mlm9f" event={"ID":"a7ff0244-14a7-41f4-b881-5de6342e3dfa","Type":"ContainerDied","Data":"8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14"} Dec 04 22:29:53.882664 master-0 kubenswrapper[33572]: I1204 22:29:53.882037 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mlm9f" Dec 04 22:29:53.882664 master-0 kubenswrapper[33572]: I1204 22:29:53.882057 33572 scope.go:117] "RemoveContainer" containerID="8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14" Dec 04 22:29:53.882664 master-0 kubenswrapper[33572]: I1204 22:29:53.882039 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mlm9f" event={"ID":"a7ff0244-14a7-41f4-b881-5de6342e3dfa","Type":"ContainerDied","Data":"e351bfe9b6ab48ecab3e230a94ca461a761017491b2a866a891dd9f75ae14b51"} Dec 04 22:29:53.899256 master-0 kubenswrapper[33572]: I1204 22:29:53.899196 33572 scope.go:117] "RemoveContainer" containerID="8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14" Dec 04 22:29:53.899773 master-0 kubenswrapper[33572]: E1204 22:29:53.899695 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14\": container with ID starting with 8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14 not found: ID does not exist" containerID="8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14" Dec 04 22:29:53.899913 master-0 kubenswrapper[33572]: I1204 22:29:53.899772 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14"} err="failed to get container status \"8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14\": rpc error: code = NotFound desc = could not find container \"8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14\": container with ID starting with 8ebe93a98e6c1aafb3f6805fbc23219d79de7a8580b6e116970af312bf776a14 not found: ID does not exist" Dec 04 22:29:53.923221 master-0 kubenswrapper[33572]: I1204 22:29:53.923112 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mlm9f"] Dec 04 22:29:53.931230 master-0 kubenswrapper[33572]: I1204 22:29:53.931154 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mlm9f"] Dec 04 22:29:54.547386 master-0 kubenswrapper[33572]: I1204 22:29:54.547307 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ff0244-14a7-41f4-b881-5de6342e3dfa" path="/var/lib/kubelet/pods/a7ff0244-14a7-41f4-b881-5de6342e3dfa/volumes" Dec 04 22:29:54.892449 master-0 kubenswrapper[33572]: I1204 22:29:54.892380 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zbrtw" event={"ID":"a08580e2-9965-4142-91cf-0c09d82a50b7","Type":"ContainerStarted","Data":"17cfddb573580b1e54cdbd0be243daddea38cfbe4ebecb34a9adb4af911ec2d0"} Dec 04 22:29:54.923412 master-0 kubenswrapper[33572]: I1204 22:29:54.922427 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zbrtw" podStartSLOduration=2.4779336 podStartE2EDuration="2.922396719s" podCreationTimestamp="2025-12-04 22:29:52 +0000 UTC" firstStartedPulling="2025-12-04 22:29:53.868910135 +0000 UTC m=+657.596435784" lastFinishedPulling="2025-12-04 22:29:54.313373224 +0000 UTC m=+658.040898903" observedRunningTime="2025-12-04 22:29:54.91952591 +0000 UTC m=+658.647051579" watchObservedRunningTime="2025-12-04 22:29:54.922396719 +0000 UTC m=+658.649922408" Dec 04 22:30:00.171423 master-0 
kubenswrapper[33572]: I1204 22:30:00.171371 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx"] Dec 04 22:30:00.172469 master-0 kubenswrapper[33572]: E1204 22:30:00.172453 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ff0244-14a7-41f4-b881-5de6342e3dfa" containerName="registry-server" Dec 04 22:30:00.172557 master-0 kubenswrapper[33572]: I1204 22:30:00.172546 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ff0244-14a7-41f4-b881-5de6342e3dfa" containerName="registry-server" Dec 04 22:30:00.172806 master-0 kubenswrapper[33572]: I1204 22:30:00.172793 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ff0244-14a7-41f4-b881-5de6342e3dfa" containerName="registry-server" Dec 04 22:30:00.173435 master-0 kubenswrapper[33572]: I1204 22:30:00.173419 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.175723 master-0 kubenswrapper[33572]: I1204 22:30:00.175707 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-njflt" Dec 04 22:30:00.175964 master-0 kubenswrapper[33572]: I1204 22:30:00.175743 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 22:30:00.179705 master-0 kubenswrapper[33572]: I1204 22:30:00.179680 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9df51f1-321e-4597-b301-f588964aadec-config-volume\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.179848 master-0 kubenswrapper[33572]: I1204 22:30:00.179834 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5wz7\" (UniqueName: \"kubernetes.io/projected/b9df51f1-321e-4597-b301-f588964aadec-kube-api-access-v5wz7\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.179932 master-0 kubenswrapper[33572]: I1204 22:30:00.179919 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9df51f1-321e-4597-b301-f588964aadec-secret-volume\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.183882 master-0 kubenswrapper[33572]: I1204 22:30:00.183807 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx"] Dec 04 22:30:00.281903 master-0 kubenswrapper[33572]: I1204 22:30:00.281865 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9df51f1-321e-4597-b301-f588964aadec-config-volume\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.282193 master-0 kubenswrapper[33572]: I1204 
22:30:00.282175 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5wz7\" (UniqueName: \"kubernetes.io/projected/b9df51f1-321e-4597-b301-f588964aadec-kube-api-access-v5wz7\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.282321 master-0 kubenswrapper[33572]: I1204 22:30:00.282306 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9df51f1-321e-4597-b301-f588964aadec-secret-volume\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.283238 master-0 kubenswrapper[33572]: I1204 22:30:00.283175 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9df51f1-321e-4597-b301-f588964aadec-config-volume\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.286600 master-0 kubenswrapper[33572]: I1204 22:30:00.286544 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9df51f1-321e-4597-b301-f588964aadec-secret-volume\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.307672 master-0 kubenswrapper[33572]: I1204 22:30:00.307587 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5wz7\" (UniqueName: \"kubernetes.io/projected/b9df51f1-321e-4597-b301-f588964aadec-kube-api-access-v5wz7\") pod \"collect-profiles-29414790-h7jwx\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:00.511555 master-0 kubenswrapper[33572]: I1204 22:30:00.511323 33572 util.go:30] "No sandbox for pod can be found. 
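The reconciler and operation_generator entries above record the kubelet attaching and mounting three volume kinds for the collect-profiles pod: a ConfigMap volume (config-volume), a Secret volume (secret-volume), and a projected service-account token (the kube-api-access-v5wz7 volume). A minimal Go sketch of a pod spec declaring the same three kinds follows; the image, mount paths, token lifetime, and the backing Secret's name are illustrative assumptions, since the log records only volume names, not the manifest.

package main

import (
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

// examplePod sketches a pod with the same three volume kinds the kubelet is
// mounting in the log: ConfigMap, Secret, and a projected service-account token.
func examplePod() *corev1.Pod {
	tokenTTL := int64(3600) // illustrative token lifetime in seconds
	return &corev1.Pod{
		ObjectMeta: metav1.ObjectMeta{
			Name:      "collect-profiles-example", // hypothetical name
			Namespace: "openshift-operator-lifecycle-manager",
		},
		Spec: corev1.PodSpec{
			Containers: []corev1.Container{{
				Name:  "collect-profiles",
				Image: "example.invalid/collect-profiles:latest", // placeholder image
				VolumeMounts: []corev1.VolumeMount{
					{Name: "config-volume", MountPath: "/etc/config"}, // mount paths are assumptions
					{Name: "secret-volume", MountPath: "/etc/secret"},
				},
			}},
			Volumes: []corev1.Volume{
				{Name: "config-volume", VolumeSource: corev1.VolumeSource{
					ConfigMap: &corev1.ConfigMapVolumeSource{
						LocalObjectReference: corev1.LocalObjectReference{Name: "collect-profiles-config"},
					},
				}},
				{Name: "secret-volume", VolumeSource: corev1.VolumeSource{
					// The backing Secret's name is not in the log; this one is hypothetical.
					Secret: &corev1.SecretVolumeSource{SecretName: "collect-profiles-secret"},
				}},
				// kube-api-access-* volumes are projected service-account tokens that the
				// API server normally injects automatically; declared here only to show the shape.
				{Name: "kube-api-access-example", VolumeSource: corev1.VolumeSource{
					Projected: &corev1.ProjectedVolumeSource{
						Sources: []corev1.VolumeProjection{{
							ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
								Path:              "token",
								ExpirationSeconds: &tokenTTL,
							},
						}},
					},
				}},
			},
		},
	}
}

func main() { _ = examplePod() }

Once such a spec is admitted, the VerifyControllerAttachedVolume, MountVolume, and MountVolume.SetUp lines above are the kubelet's per-volume reconcile steps for it.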
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:01.055564 master-0 kubenswrapper[33572]: I1204 22:30:01.055476 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx"] Dec 04 22:30:01.063564 master-0 kubenswrapper[33572]: W1204 22:30:01.063519 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9df51f1_321e_4597_b301_f588964aadec.slice/crio-4157f103d41d39881dedf8516f7d27ecb3d14b268158f5ad13a36d926bfc4a1a WatchSource:0}: Error finding container 4157f103d41d39881dedf8516f7d27ecb3d14b268158f5ad13a36d926bfc4a1a: Status 404 returned error can't find the container with id 4157f103d41d39881dedf8516f7d27ecb3d14b268158f5ad13a36d926bfc4a1a Dec 04 22:30:01.975264 master-0 kubenswrapper[33572]: I1204 22:30:01.975173 33572 generic.go:334] "Generic (PLEG): container finished" podID="b9df51f1-321e-4597-b301-f588964aadec" containerID="54743383680cf2cebbe84250192d96c5164c491a04b382d1aacfad82180bf690" exitCode=0 Dec 04 22:30:01.975264 master-0 kubenswrapper[33572]: I1204 22:30:01.975249 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" event={"ID":"b9df51f1-321e-4597-b301-f588964aadec","Type":"ContainerDied","Data":"54743383680cf2cebbe84250192d96c5164c491a04b382d1aacfad82180bf690"} Dec 04 22:30:01.976222 master-0 kubenswrapper[33572]: I1204 22:30:01.975291 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" event={"ID":"b9df51f1-321e-4597-b301-f588964aadec","Type":"ContainerStarted","Data":"4157f103d41d39881dedf8516f7d27ecb3d14b268158f5ad13a36d926bfc4a1a"} Dec 04 22:30:03.353923 master-0 kubenswrapper[33572]: I1204 22:30:03.353863 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:30:03.353923 master-0 kubenswrapper[33572]: I1204 22:30:03.353927 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:30:03.397425 master-0 kubenswrapper[33572]: I1204 22:30:03.397359 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:30:03.407552 master-0 kubenswrapper[33572]: I1204 22:30:03.405978 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:03.541837 master-0 kubenswrapper[33572]: I1204 22:30:03.541753 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5wz7\" (UniqueName: \"kubernetes.io/projected/b9df51f1-321e-4597-b301-f588964aadec-kube-api-access-v5wz7\") pod \"b9df51f1-321e-4597-b301-f588964aadec\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " Dec 04 22:30:03.542110 master-0 kubenswrapper[33572]: I1204 22:30:03.541995 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9df51f1-321e-4597-b301-f588964aadec-secret-volume\") pod \"b9df51f1-321e-4597-b301-f588964aadec\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " Dec 04 22:30:03.542156 master-0 kubenswrapper[33572]: I1204 22:30:03.542119 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9df51f1-321e-4597-b301-f588964aadec-config-volume\") pod \"b9df51f1-321e-4597-b301-f588964aadec\" (UID: \"b9df51f1-321e-4597-b301-f588964aadec\") " Dec 04 22:30:03.544523 master-0 kubenswrapper[33572]: I1204 22:30:03.544421 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9df51f1-321e-4597-b301-f588964aadec-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9df51f1-321e-4597-b301-f588964aadec" (UID: "b9df51f1-321e-4597-b301-f588964aadec"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:30:03.547141 master-0 kubenswrapper[33572]: I1204 22:30:03.547069 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9df51f1-321e-4597-b301-f588964aadec-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9df51f1-321e-4597-b301-f588964aadec" (UID: "b9df51f1-321e-4597-b301-f588964aadec"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:30:03.554093 master-0 kubenswrapper[33572]: I1204 22:30:03.554014 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9df51f1-321e-4597-b301-f588964aadec-kube-api-access-v5wz7" (OuterVolumeSpecName: "kube-api-access-v5wz7") pod "b9df51f1-321e-4597-b301-f588964aadec" (UID: "b9df51f1-321e-4597-b301-f588964aadec"). InnerVolumeSpecName "kube-api-access-v5wz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:30:03.645527 master-0 kubenswrapper[33572]: I1204 22:30:03.645432 33572 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9df51f1-321e-4597-b301-f588964aadec-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 22:30:03.646135 master-0 kubenswrapper[33572]: I1204 22:30:03.646088 33572 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9df51f1-321e-4597-b301-f588964aadec-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 22:30:03.646135 master-0 kubenswrapper[33572]: I1204 22:30:03.646127 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5wz7\" (UniqueName: \"kubernetes.io/projected/b9df51f1-321e-4597-b301-f588964aadec-kube-api-access-v5wz7\") on node \"master-0\" DevicePath \"\"" Dec 04 22:30:04.001903 master-0 kubenswrapper[33572]: I1204 22:30:04.001707 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" event={"ID":"b9df51f1-321e-4597-b301-f588964aadec","Type":"ContainerDied","Data":"4157f103d41d39881dedf8516f7d27ecb3d14b268158f5ad13a36d926bfc4a1a"} Dec 04 22:30:04.001903 master-0 kubenswrapper[33572]: I1204 22:30:04.001797 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4157f103d41d39881dedf8516f7d27ecb3d14b268158f5ad13a36d926bfc4a1a" Dec 04 22:30:04.001903 master-0 kubenswrapper[33572]: I1204 22:30:04.001749 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414790-h7jwx" Dec 04 22:30:04.050471 master-0 kubenswrapper[33572]: I1204 22:30:04.050396 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zbrtw" Dec 04 22:30:10.111538 master-0 kubenswrapper[33572]: I1204 22:30:10.110796 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25"] Dec 04 22:30:10.111538 master-0 kubenswrapper[33572]: E1204 22:30:10.111425 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9df51f1-321e-4597-b301-f588964aadec" containerName="collect-profiles" Dec 04 22:30:10.111538 master-0 kubenswrapper[33572]: I1204 22:30:10.111451 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9df51f1-321e-4597-b301-f588964aadec" containerName="collect-profiles" Dec 04 22:30:10.112578 master-0 kubenswrapper[33572]: I1204 22:30:10.111826 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9df51f1-321e-4597-b301-f588964aadec" containerName="collect-profiles" Dec 04 22:30:10.117004 master-0 kubenswrapper[33572]: I1204 22:30:10.113812 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.122645 master-0 kubenswrapper[33572]: I1204 22:30:10.122577 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25"] Dec 04 22:30:10.277640 master-0 kubenswrapper[33572]: I1204 22:30:10.277560 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.277927 master-0 kubenswrapper[33572]: I1204 22:30:10.277711 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9crr\" (UniqueName: \"kubernetes.io/projected/9157b62a-a6a9-4203-b23b-1e4657e06d49-kube-api-access-v9crr\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.277927 master-0 kubenswrapper[33572]: I1204 22:30:10.277791 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.380456 master-0 kubenswrapper[33572]: I1204 22:30:10.380271 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.380456 master-0 kubenswrapper[33572]: I1204 22:30:10.380426 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9crr\" (UniqueName: \"kubernetes.io/projected/9157b62a-a6a9-4203-b23b-1e4657e06d49-kube-api-access-v9crr\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.380909 master-0 kubenswrapper[33572]: I1204 22:30:10.380534 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.381065 master-0 kubenswrapper[33572]: I1204 22:30:10.380991 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-util\") pod 
\"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.381367 master-0 kubenswrapper[33572]: I1204 22:30:10.381308 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.402462 master-0 kubenswrapper[33572]: I1204 22:30:10.402411 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9crr\" (UniqueName: \"kubernetes.io/projected/9157b62a-a6a9-4203-b23b-1e4657e06d49-kube-api-access-v9crr\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.463789 master-0 kubenswrapper[33572]: I1204 22:30:10.463679 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:10.953241 master-0 kubenswrapper[33572]: I1204 22:30:10.953176 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25"] Dec 04 22:30:10.961299 master-0 kubenswrapper[33572]: W1204 22:30:10.961214 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9157b62a_a6a9_4203_b23b_1e4657e06d49.slice/crio-78710fd0a04dbdaeb9c5d9d65eddd66b6b87bbfdd1c67e2f18279a3af565b1df WatchSource:0}: Error finding container 78710fd0a04dbdaeb9c5d9d65eddd66b6b87bbfdd1c67e2f18279a3af565b1df: Status 404 returned error can't find the container with id 78710fd0a04dbdaeb9c5d9d65eddd66b6b87bbfdd1c67e2f18279a3af565b1df Dec 04 22:30:11.098327 master-0 kubenswrapper[33572]: I1204 22:30:11.098233 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" event={"ID":"9157b62a-a6a9-4203-b23b-1e4657e06d49","Type":"ContainerStarted","Data":"78710fd0a04dbdaeb9c5d9d65eddd66b6b87bbfdd1c67e2f18279a3af565b1df"} Dec 04 22:30:12.114901 master-0 kubenswrapper[33572]: I1204 22:30:12.114754 33572 generic.go:334] "Generic (PLEG): container finished" podID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerID="386b8ac6a98d6423cd4f5beb9bf0336a7aefc8708cbddbaa95dff1051a460279" exitCode=0 Dec 04 22:30:12.114901 master-0 kubenswrapper[33572]: I1204 22:30:12.114851 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" event={"ID":"9157b62a-a6a9-4203-b23b-1e4657e06d49","Type":"ContainerDied","Data":"386b8ac6a98d6423cd4f5beb9bf0336a7aefc8708cbddbaa95dff1051a460279"} Dec 04 22:30:12.119199 master-0 kubenswrapper[33572]: I1204 22:30:12.119131 33572 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 22:30:13.129078 master-0 kubenswrapper[33572]: I1204 22:30:13.128959 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" event={"ID":"9157b62a-a6a9-4203-b23b-1e4657e06d49","Type":"ContainerStarted","Data":"ecc4b9ca909850ca3e2f8aa29eef8607f842e64124a5b2459b2d019e6d2a068c"} Dec 04 22:30:14.140706 master-0 kubenswrapper[33572]: I1204 22:30:14.140604 33572 generic.go:334] "Generic (PLEG): container finished" podID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerID="ecc4b9ca909850ca3e2f8aa29eef8607f842e64124a5b2459b2d019e6d2a068c" exitCode=0 Dec 04 22:30:14.140706 master-0 kubenswrapper[33572]: I1204 22:30:14.140671 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" event={"ID":"9157b62a-a6a9-4203-b23b-1e4657e06d49","Type":"ContainerDied","Data":"ecc4b9ca909850ca3e2f8aa29eef8607f842e64124a5b2459b2d019e6d2a068c"} Dec 04 22:30:15.158070 master-0 kubenswrapper[33572]: I1204 22:30:15.157953 33572 generic.go:334] "Generic (PLEG): container finished" podID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerID="6c0f8117fd5e38312f490d3147f62d01f164da02050129b6c6a1743dd527bcfa" exitCode=0 Dec 04 22:30:15.158070 master-0 kubenswrapper[33572]: I1204 22:30:15.158035 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" event={"ID":"9157b62a-a6a9-4203-b23b-1e4657e06d49","Type":"ContainerDied","Data":"6c0f8117fd5e38312f490d3147f62d01f164da02050129b6c6a1743dd527bcfa"} Dec 04 22:30:16.620302 master-0 kubenswrapper[33572]: I1204 22:30:16.620247 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:16.854553 master-0 kubenswrapper[33572]: I1204 22:30:16.791423 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-util\") pod \"9157b62a-a6a9-4203-b23b-1e4657e06d49\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " Dec 04 22:30:16.854553 master-0 kubenswrapper[33572]: I1204 22:30:16.791788 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-bundle\") pod \"9157b62a-a6a9-4203-b23b-1e4657e06d49\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " Dec 04 22:30:16.854553 master-0 kubenswrapper[33572]: I1204 22:30:16.791939 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9crr\" (UniqueName: \"kubernetes.io/projected/9157b62a-a6a9-4203-b23b-1e4657e06d49-kube-api-access-v9crr\") pod \"9157b62a-a6a9-4203-b23b-1e4657e06d49\" (UID: \"9157b62a-a6a9-4203-b23b-1e4657e06d49\") " Dec 04 22:30:16.854553 master-0 kubenswrapper[33572]: I1204 22:30:16.792720 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-bundle" (OuterVolumeSpecName: "bundle") pod "9157b62a-a6a9-4203-b23b-1e4657e06d49" (UID: "9157b62a-a6a9-4203-b23b-1e4657e06d49"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:30:16.854553 master-0 kubenswrapper[33572]: I1204 22:30:16.819569 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-util" (OuterVolumeSpecName: "util") pod "9157b62a-a6a9-4203-b23b-1e4657e06d49" (UID: "9157b62a-a6a9-4203-b23b-1e4657e06d49"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:30:16.860576 master-0 kubenswrapper[33572]: I1204 22:30:16.859074 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9157b62a-a6a9-4203-b23b-1e4657e06d49-kube-api-access-v9crr" (OuterVolumeSpecName: "kube-api-access-v9crr") pod "9157b62a-a6a9-4203-b23b-1e4657e06d49" (UID: "9157b62a-a6a9-4203-b23b-1e4657e06d49"). InnerVolumeSpecName "kube-api-access-v9crr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:30:16.893998 master-0 kubenswrapper[33572]: I1204 22:30:16.893916 33572 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:30:16.893998 master-0 kubenswrapper[33572]: I1204 22:30:16.893962 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9crr\" (UniqueName: \"kubernetes.io/projected/9157b62a-a6a9-4203-b23b-1e4657e06d49-kube-api-access-v9crr\") on node \"master-0\" DevicePath \"\"" Dec 04 22:30:16.893998 master-0 kubenswrapper[33572]: I1204 22:30:16.893975 33572 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9157b62a-a6a9-4203-b23b-1e4657e06d49-util\") on node \"master-0\" DevicePath \"\"" Dec 04 22:30:17.188894 master-0 kubenswrapper[33572]: I1204 22:30:17.188730 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" event={"ID":"9157b62a-a6a9-4203-b23b-1e4657e06d49","Type":"ContainerDied","Data":"78710fd0a04dbdaeb9c5d9d65eddd66b6b87bbfdd1c67e2f18279a3af565b1df"} Dec 04 22:30:17.188894 master-0 kubenswrapper[33572]: I1204 22:30:17.188793 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78710fd0a04dbdaeb9c5d9d65eddd66b6b87bbfdd1c67e2f18279a3af565b1df" Dec 04 22:30:17.188894 master-0 kubenswrapper[33572]: I1204 22:30:17.188791 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25" Dec 04 22:30:22.501853 master-0 kubenswrapper[33572]: I1204 22:30:22.501718 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj"] Dec 04 22:30:22.502488 master-0 kubenswrapper[33572]: E1204 22:30:22.502144 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerName="util" Dec 04 22:30:22.502488 master-0 kubenswrapper[33572]: I1204 22:30:22.502163 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerName="util" Dec 04 22:30:22.502488 master-0 kubenswrapper[33572]: E1204 22:30:22.502201 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerName="pull" Dec 04 22:30:22.502488 master-0 kubenswrapper[33572]: I1204 22:30:22.502211 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerName="pull" Dec 04 22:30:22.502488 master-0 kubenswrapper[33572]: E1204 22:30:22.502250 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerName="extract" Dec 04 22:30:22.502488 master-0 kubenswrapper[33572]: I1204 22:30:22.502258 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerName="extract" Dec 04 22:30:22.502488 master-0 kubenswrapper[33572]: I1204 22:30:22.502469 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9157b62a-a6a9-4203-b23b-1e4657e06d49" containerName="extract" Dec 04 22:30:22.503240 master-0 kubenswrapper[33572]: I1204 22:30:22.503214 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:22.534658 master-0 kubenswrapper[33572]: I1204 22:30:22.534601 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj"] Dec 04 22:30:22.609731 master-0 kubenswrapper[33572]: I1204 22:30:22.609670 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpncj\" (UniqueName: \"kubernetes.io/projected/a70e6ff3-ffc3-4937-98e8-3ce53f402969-kube-api-access-qpncj\") pod \"openstack-operator-controller-operator-55b6fb9447-qsvnj\" (UID: \"a70e6ff3-ffc3-4937-98e8-3ce53f402969\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:22.711537 master-0 kubenswrapper[33572]: I1204 22:30:22.711466 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpncj\" (UniqueName: \"kubernetes.io/projected/a70e6ff3-ffc3-4937-98e8-3ce53f402969-kube-api-access-qpncj\") pod \"openstack-operator-controller-operator-55b6fb9447-qsvnj\" (UID: \"a70e6ff3-ffc3-4937-98e8-3ce53f402969\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:22.728423 master-0 kubenswrapper[33572]: I1204 22:30:22.728361 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpncj\" (UniqueName: \"kubernetes.io/projected/a70e6ff3-ffc3-4937-98e8-3ce53f402969-kube-api-access-qpncj\") pod \"openstack-operator-controller-operator-55b6fb9447-qsvnj\" (UID: \"a70e6ff3-ffc3-4937-98e8-3ce53f402969\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:22.823648 master-0 kubenswrapper[33572]: I1204 22:30:22.823594 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:23.369624 master-0 kubenswrapper[33572]: I1204 22:30:23.369563 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj"] Dec 04 22:30:23.387300 master-0 kubenswrapper[33572]: W1204 22:30:23.387234 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda70e6ff3_ffc3_4937_98e8_3ce53f402969.slice/crio-d7eaa1e7388d0a902f5d534a1cdbc216ceab0d229fcf02e57f5f77fd4d3305c0 WatchSource:0}: Error finding container d7eaa1e7388d0a902f5d534a1cdbc216ceab0d229fcf02e57f5f77fd4d3305c0: Status 404 returned error can't find the container with id d7eaa1e7388d0a902f5d534a1cdbc216ceab0d229fcf02e57f5f77fd4d3305c0 Dec 04 22:30:24.281202 master-0 kubenswrapper[33572]: I1204 22:30:24.280776 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" event={"ID":"a70e6ff3-ffc3-4937-98e8-3ce53f402969","Type":"ContainerStarted","Data":"d7eaa1e7388d0a902f5d534a1cdbc216ceab0d229fcf02e57f5f77fd4d3305c0"} Dec 04 22:30:28.326975 master-0 kubenswrapper[33572]: I1204 22:30:28.326881 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" event={"ID":"a70e6ff3-ffc3-4937-98e8-3ce53f402969","Type":"ContainerStarted","Data":"beebbb3424a2ae30001a0a74e7f2ec1a09277ca357bae7b4ffaa5a2da478197f"} Dec 04 22:30:28.327625 master-0 kubenswrapper[33572]: I1204 22:30:28.327102 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:28.542838 master-0 kubenswrapper[33572]: I1204 22:30:28.540366 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" podStartSLOduration=2.53983224 podStartE2EDuration="6.540345485s" podCreationTimestamp="2025-12-04 22:30:22 +0000 UTC" firstStartedPulling="2025-12-04 22:30:23.391578177 +0000 UTC m=+687.119103846" lastFinishedPulling="2025-12-04 22:30:27.392091442 +0000 UTC m=+691.119617091" observedRunningTime="2025-12-04 22:30:28.536064166 +0000 UTC m=+692.263589835" watchObservedRunningTime="2025-12-04 22:30:28.540345485 +0000 UTC m=+692.267871144" Dec 04 22:30:32.828691 master-0 kubenswrapper[33572]: I1204 22:30:32.828603 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:36.869221 master-0 kubenswrapper[33572]: I1204 22:30:36.869152 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst"] Dec 04 22:30:36.870551 master-0 kubenswrapper[33572]: I1204 22:30:36.870497 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" Dec 04 22:30:36.901266 master-0 kubenswrapper[33572]: I1204 22:30:36.901198 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst"] Dec 04 22:30:36.912067 master-0 kubenswrapper[33572]: I1204 22:30:36.912001 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl9bp\" (UniqueName: \"kubernetes.io/projected/339808d7-8bed-4362-a4d0-c367d186332d-kube-api-access-bl9bp\") pod \"openstack-operator-controller-operator-589d7b4556-6vpst\" (UID: \"339808d7-8bed-4362-a4d0-c367d186332d\") " pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" Dec 04 22:30:37.014175 master-0 kubenswrapper[33572]: I1204 22:30:37.014095 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl9bp\" (UniqueName: \"kubernetes.io/projected/339808d7-8bed-4362-a4d0-c367d186332d-kube-api-access-bl9bp\") pod \"openstack-operator-controller-operator-589d7b4556-6vpst\" (UID: \"339808d7-8bed-4362-a4d0-c367d186332d\") " pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" Dec 04 22:30:37.037171 master-0 kubenswrapper[33572]: I1204 22:30:37.037129 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl9bp\" (UniqueName: \"kubernetes.io/projected/339808d7-8bed-4362-a4d0-c367d186332d-kube-api-access-bl9bp\") pod \"openstack-operator-controller-operator-589d7b4556-6vpst\" (UID: \"339808d7-8bed-4362-a4d0-c367d186332d\") " pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" Dec 04 22:30:37.189932 master-0 kubenswrapper[33572]: I1204 22:30:37.189742 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" Dec 04 22:30:37.727397 master-0 kubenswrapper[33572]: I1204 22:30:37.727327 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst"] Dec 04 22:30:37.737884 master-0 kubenswrapper[33572]: W1204 22:30:37.737786 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod339808d7_8bed_4362_a4d0_c367d186332d.slice/crio-02cedc61f25fdd85975243db4069e1367aebc90fcb956ed391db924320aabc05 WatchSource:0}: Error finding container 02cedc61f25fdd85975243db4069e1367aebc90fcb956ed391db924320aabc05: Status 404 returned error can't find the container with id 02cedc61f25fdd85975243db4069e1367aebc90fcb956ed391db924320aabc05 Dec 04 22:30:38.516922 master-0 kubenswrapper[33572]: I1204 22:30:38.516834 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" event={"ID":"339808d7-8bed-4362-a4d0-c367d186332d","Type":"ContainerStarted","Data":"caa6a23ec2378295717bf8859f7bc90d16b84bddb55c6546ace8c927ec969b37"} Dec 04 22:30:38.516922 master-0 kubenswrapper[33572]: I1204 22:30:38.516911 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" event={"ID":"339808d7-8bed-4362-a4d0-c367d186332d","Type":"ContainerStarted","Data":"02cedc61f25fdd85975243db4069e1367aebc90fcb956ed391db924320aabc05"} Dec 04 22:30:38.517785 master-0 kubenswrapper[33572]: I1204 22:30:38.516959 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" Dec 04 22:30:38.563594 master-0 kubenswrapper[33572]: I1204 22:30:38.563462 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" podStartSLOduration=2.563433825 podStartE2EDuration="2.563433825s" podCreationTimestamp="2025-12-04 22:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:30:38.559879887 +0000 UTC m=+702.287405556" watchObservedRunningTime="2025-12-04 22:30:38.563433825 +0000 UTC m=+702.290959484" Dec 04 22:30:47.192838 master-0 kubenswrapper[33572]: I1204 22:30:47.192764 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-6vpst" Dec 04 22:30:47.328595 master-0 kubenswrapper[33572]: I1204 22:30:47.328517 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj"] Dec 04 22:30:47.329004 master-0 kubenswrapper[33572]: I1204 22:30:47.328812 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" podUID="a70e6ff3-ffc3-4937-98e8-3ce53f402969" containerName="operator" containerID="cri-o://beebbb3424a2ae30001a0a74e7f2ec1a09277ca357bae7b4ffaa5a2da478197f" gracePeriod=10 Dec 04 22:30:47.628113 master-0 kubenswrapper[33572]: I1204 22:30:47.628046 33572 generic.go:334] "Generic (PLEG): container finished" podID="a70e6ff3-ffc3-4937-98e8-3ce53f402969" 
containerID="beebbb3424a2ae30001a0a74e7f2ec1a09277ca357bae7b4ffaa5a2da478197f" exitCode=0 Dec 04 22:30:47.628113 master-0 kubenswrapper[33572]: I1204 22:30:47.628114 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" event={"ID":"a70e6ff3-ffc3-4937-98e8-3ce53f402969","Type":"ContainerDied","Data":"beebbb3424a2ae30001a0a74e7f2ec1a09277ca357bae7b4ffaa5a2da478197f"} Dec 04 22:30:47.797038 master-0 kubenswrapper[33572]: I1204 22:30:47.796971 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:47.893220 master-0 kubenswrapper[33572]: I1204 22:30:47.893132 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpncj\" (UniqueName: \"kubernetes.io/projected/a70e6ff3-ffc3-4937-98e8-3ce53f402969-kube-api-access-qpncj\") pod \"a70e6ff3-ffc3-4937-98e8-3ce53f402969\" (UID: \"a70e6ff3-ffc3-4937-98e8-3ce53f402969\") " Dec 04 22:30:47.897857 master-0 kubenswrapper[33572]: I1204 22:30:47.897795 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70e6ff3-ffc3-4937-98e8-3ce53f402969-kube-api-access-qpncj" (OuterVolumeSpecName: "kube-api-access-qpncj") pod "a70e6ff3-ffc3-4937-98e8-3ce53f402969" (UID: "a70e6ff3-ffc3-4937-98e8-3ce53f402969"). InnerVolumeSpecName "kube-api-access-qpncj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:30:47.996379 master-0 kubenswrapper[33572]: I1204 22:30:47.996196 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpncj\" (UniqueName: \"kubernetes.io/projected/a70e6ff3-ffc3-4937-98e8-3ce53f402969-kube-api-access-qpncj\") on node \"master-0\" DevicePath \"\"" Dec 04 22:30:48.644558 master-0 kubenswrapper[33572]: I1204 22:30:48.644368 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" event={"ID":"a70e6ff3-ffc3-4937-98e8-3ce53f402969","Type":"ContainerDied","Data":"d7eaa1e7388d0a902f5d534a1cdbc216ceab0d229fcf02e57f5f77fd4d3305c0"} Dec 04 22:30:48.645716 master-0 kubenswrapper[33572]: I1204 22:30:48.644431 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj" Dec 04 22:30:48.645716 master-0 kubenswrapper[33572]: I1204 22:30:48.644610 33572 scope.go:117] "RemoveContainer" containerID="beebbb3424a2ae30001a0a74e7f2ec1a09277ca357bae7b4ffaa5a2da478197f" Dec 04 22:30:48.686813 master-0 kubenswrapper[33572]: I1204 22:30:48.686726 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj"] Dec 04 22:30:48.703838 master-0 kubenswrapper[33572]: I1204 22:30:48.703761 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-qsvnj"] Dec 04 22:30:50.541598 master-0 kubenswrapper[33572]: I1204 22:30:50.541438 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70e6ff3-ffc3-4937-98e8-3ce53f402969" path="/var/lib/kubelet/pods/a70e6ff3-ffc3-4937-98e8-3ce53f402969/volumes" Dec 04 22:31:53.349230 master-0 kubenswrapper[33572]: I1204 22:31:53.348909 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k"] Dec 04 22:31:53.350059 master-0 kubenswrapper[33572]: E1204 22:31:53.349607 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70e6ff3-ffc3-4937-98e8-3ce53f402969" containerName="operator" Dec 04 22:31:53.350059 master-0 kubenswrapper[33572]: I1204 22:31:53.349631 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70e6ff3-ffc3-4937-98e8-3ce53f402969" containerName="operator" Dec 04 22:31:53.350166 master-0 kubenswrapper[33572]: I1204 22:31:53.350148 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70e6ff3-ffc3-4937-98e8-3ce53f402969" containerName="operator" Dec 04 22:31:53.351865 master-0 kubenswrapper[33572]: I1204 22:31:53.351750 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" Dec 04 22:31:53.356449 master-0 kubenswrapper[33572]: I1204 22:31:53.356023 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v"] Dec 04 22:31:53.359047 master-0 kubenswrapper[33572]: I1204 22:31:53.358956 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" Dec 04 22:31:53.369069 master-0 kubenswrapper[33572]: I1204 22:31:53.369014 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k"] Dec 04 22:31:53.384451 master-0 kubenswrapper[33572]: I1204 22:31:53.384386 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v"] Dec 04 22:31:53.402210 master-0 kubenswrapper[33572]: I1204 22:31:53.402149 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r"] Dec 04 22:31:53.405793 master-0 kubenswrapper[33572]: I1204 22:31:53.405757 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" Dec 04 22:31:53.412300 master-0 kubenswrapper[33572]: I1204 22:31:53.412258 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p"] Dec 04 22:31:53.429175 master-0 kubenswrapper[33572]: I1204 22:31:53.429110 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" Dec 04 22:31:53.447271 master-0 kubenswrapper[33572]: I1204 22:31:53.438541 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r"] Dec 04 22:31:53.478761 master-0 kubenswrapper[33572]: I1204 22:31:53.478701 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl"] Dec 04 22:31:53.480839 master-0 kubenswrapper[33572]: I1204 22:31:53.480810 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" Dec 04 22:31:53.494532 master-0 kubenswrapper[33572]: I1204 22:31:53.491096 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p"] Dec 04 22:31:53.521591 master-0 kubenswrapper[33572]: I1204 22:31:53.521485 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fckpn\" (UniqueName: \"kubernetes.io/projected/c9cefa2f-e4d2-4c8a-b27c-c71af710a6df-kube-api-access-fckpn\") pod \"barbican-operator-controller-manager-5cd89994b5-74h4k\" (UID: \"c9cefa2f-e4d2-4c8a-b27c-c71af710a6df\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" Dec 04 22:31:53.521591 master-0 kubenswrapper[33572]: I1204 22:31:53.521552 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n79m7\" (UniqueName: \"kubernetes.io/projected/e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca-kube-api-access-n79m7\") pod \"designate-operator-controller-manager-84bc9f68f5-7rc6r\" (UID: \"e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" Dec 04 22:31:53.521591 master-0 kubenswrapper[33572]: I1204 22:31:53.521605 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rklpp\" (UniqueName: \"kubernetes.io/projected/68c529a2-072e-4777-9fb9-ec34aa5396ae-kube-api-access-rklpp\") pod \"cinder-operator-controller-manager-f8856dd79-ds48v\" (UID: \"68c529a2-072e-4777-9fb9-ec34aa5396ae\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" Dec 04 22:31:53.521944 master-0 kubenswrapper[33572]: I1204 22:31:53.521632 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxzb\" (UniqueName: \"kubernetes.io/projected/f9ec5f3f-d171-4fc3-abb1-489d49fe30d9-kube-api-access-dgxzb\") pod \"glance-operator-controller-manager-78cd4f7769-wcm5p\" (UID: \"f9ec5f3f-d171-4fc3-abb1-489d49fe30d9\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" Dec 04 22:31:53.521944 master-0 kubenswrapper[33572]: I1204 22:31:53.521671 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nz54c\" (UniqueName: \"kubernetes.io/projected/1d25b1d0-f7a6-451b-897f-474f32b23ef1-kube-api-access-nz54c\") pod \"heat-operator-controller-manager-7fd96594c7-5sgkl\" (UID: \"1d25b1d0-f7a6-451b-897f-474f32b23ef1\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" Dec 04 22:31:53.521944 master-0 kubenswrapper[33572]: I1204 22:31:53.521848 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl"] Dec 04 22:31:53.554434 master-0 kubenswrapper[33572]: I1204 22:31:53.554371 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz"] Dec 04 22:31:53.556354 master-0 kubenswrapper[33572]: I1204 22:31:53.556321 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" Dec 04 22:31:53.587231 master-0 kubenswrapper[33572]: I1204 22:31:53.587191 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz"] Dec 04 22:31:53.597665 master-0 kubenswrapper[33572]: I1204 22:31:53.596760 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956"] Dec 04 22:31:53.598296 master-0 kubenswrapper[33572]: I1204 22:31:53.598267 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:53.605378 master-0 kubenswrapper[33572]: I1204 22:31:53.605258 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 04 22:31:53.616461 master-0 kubenswrapper[33572]: I1204 22:31:53.616406 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956"] Dec 04 22:31:53.627322 master-0 kubenswrapper[33572]: I1204 22:31:53.627273 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzv7d\" (UniqueName: \"kubernetes.io/projected/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-kube-api-access-xzv7d\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:53.627516 master-0 kubenswrapper[33572]: I1204 22:31:53.627398 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:53.627516 master-0 kubenswrapper[33572]: I1204 22:31:53.627463 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fckpn\" (UniqueName: \"kubernetes.io/projected/c9cefa2f-e4d2-4c8a-b27c-c71af710a6df-kube-api-access-fckpn\") pod \"barbican-operator-controller-manager-5cd89994b5-74h4k\" (UID: \"c9cefa2f-e4d2-4c8a-b27c-c71af710a6df\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" Dec 04 22:31:53.627516 master-0 kubenswrapper[33572]: I1204 22:31:53.627486 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n79m7\" (UniqueName: \"kubernetes.io/projected/e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca-kube-api-access-n79m7\") pod \"designate-operator-controller-manager-84bc9f68f5-7rc6r\" (UID: \"e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" Dec 04 22:31:53.627667 master-0 kubenswrapper[33572]: I1204 22:31:53.627526 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rklpp\" (UniqueName: \"kubernetes.io/projected/68c529a2-072e-4777-9fb9-ec34aa5396ae-kube-api-access-rklpp\") pod \"cinder-operator-controller-manager-f8856dd79-ds48v\" (UID: \"68c529a2-072e-4777-9fb9-ec34aa5396ae\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" Dec 04 22:31:53.627667 master-0 kubenswrapper[33572]: I1204 22:31:53.627553 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxzb\" (UniqueName: \"kubernetes.io/projected/f9ec5f3f-d171-4fc3-abb1-489d49fe30d9-kube-api-access-dgxzb\") pod \"glance-operator-controller-manager-78cd4f7769-wcm5p\" (UID: \"f9ec5f3f-d171-4fc3-abb1-489d49fe30d9\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" Dec 04 22:31:53.627667 master-0 kubenswrapper[33572]: I1204 22:31:53.627579 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz54c\" (UniqueName: \"kubernetes.io/projected/1d25b1d0-f7a6-451b-897f-474f32b23ef1-kube-api-access-nz54c\") pod \"heat-operator-controller-manager-7fd96594c7-5sgkl\" (UID: \"1d25b1d0-f7a6-451b-897f-474f32b23ef1\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" Dec 04 22:31:53.627795 master-0 kubenswrapper[33572]: I1204 22:31:53.627725 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nkbl\" (UniqueName: \"kubernetes.io/projected/075b6463-8b4e-47e9-9662-c6fa561e9079-kube-api-access-8nkbl\") pod \"horizon-operator-controller-manager-f6cc97788-khfnz\" (UID: \"075b6463-8b4e-47e9-9662-c6fa561e9079\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" Dec 04 22:31:53.635162 master-0 kubenswrapper[33572]: I1204 22:31:53.628183 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v"] Dec 04 22:31:53.635162 master-0 kubenswrapper[33572]: I1204 22:31:53.633167 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" Dec 04 22:31:53.663336 master-0 kubenswrapper[33572]: I1204 22:31:53.663268 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz54c\" (UniqueName: \"kubernetes.io/projected/1d25b1d0-f7a6-451b-897f-474f32b23ef1-kube-api-access-nz54c\") pod \"heat-operator-controller-manager-7fd96594c7-5sgkl\" (UID: \"1d25b1d0-f7a6-451b-897f-474f32b23ef1\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" Dec 04 22:31:53.666142 master-0 kubenswrapper[33572]: I1204 22:31:53.664237 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rklpp\" (UniqueName: \"kubernetes.io/projected/68c529a2-072e-4777-9fb9-ec34aa5396ae-kube-api-access-rklpp\") pod \"cinder-operator-controller-manager-f8856dd79-ds48v\" (UID: \"68c529a2-072e-4777-9fb9-ec34aa5396ae\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" Dec 04 22:31:53.670525 master-0 kubenswrapper[33572]: I1204 22:31:53.668954 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fckpn\" (UniqueName: \"kubernetes.io/projected/c9cefa2f-e4d2-4c8a-b27c-c71af710a6df-kube-api-access-fckpn\") pod \"barbican-operator-controller-manager-5cd89994b5-74h4k\" (UID: \"c9cefa2f-e4d2-4c8a-b27c-c71af710a6df\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" Dec 04 22:31:53.675408 master-0 kubenswrapper[33572]: I1204 22:31:53.674587 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxzb\" (UniqueName: \"kubernetes.io/projected/f9ec5f3f-d171-4fc3-abb1-489d49fe30d9-kube-api-access-dgxzb\") pod \"glance-operator-controller-manager-78cd4f7769-wcm5p\" (UID: \"f9ec5f3f-d171-4fc3-abb1-489d49fe30d9\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" Dec 04 22:31:53.679527 master-0 kubenswrapper[33572]: I1204 22:31:53.677130 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v"] Dec 04 22:31:53.692553 master-0 kubenswrapper[33572]: I1204 22:31:53.690365 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" Dec 04 22:31:53.699416 master-0 kubenswrapper[33572]: I1204 22:31:53.694962 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n79m7\" (UniqueName: \"kubernetes.io/projected/e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca-kube-api-access-n79m7\") pod \"designate-operator-controller-manager-84bc9f68f5-7rc6r\" (UID: \"e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" Dec 04 22:31:53.703459 master-0 kubenswrapper[33572]: I1204 22:31:53.703368 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq"] Dec 04 22:31:53.706387 master-0 kubenswrapper[33572]: I1204 22:31:53.706332 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" Dec 04 22:31:53.710020 master-0 kubenswrapper[33572]: I1204 22:31:53.709968 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" Dec 04 22:31:53.729555 master-0 kubenswrapper[33572]: I1204 22:31:53.728922 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:53.730008 master-0 kubenswrapper[33572]: I1204 22:31:53.729987 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g464z\" (UniqueName: \"kubernetes.io/projected/23f59072-43f4-4dc3-a484-48ae31ed7eee-kube-api-access-g464z\") pod \"keystone-operator-controller-manager-58b8dcc5fb-pnhmq\" (UID: \"23f59072-43f4-4dc3-a484-48ae31ed7eee\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" Dec 04 22:31:53.730188 master-0 kubenswrapper[33572]: I1204 22:31:53.730167 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2dv5\" (UniqueName: \"kubernetes.io/projected/5c9990c8-62a6-42fd-9965-50331f773940-kube-api-access-z2dv5\") pod \"ironic-operator-controller-manager-7c9bfd6967-5pn2v\" (UID: \"5c9990c8-62a6-42fd-9965-50331f773940\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" Dec 04 22:31:53.730293 master-0 kubenswrapper[33572]: I1204 22:31:53.730276 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nkbl\" (UniqueName: \"kubernetes.io/projected/075b6463-8b4e-47e9-9662-c6fa561e9079-kube-api-access-8nkbl\") pod \"horizon-operator-controller-manager-f6cc97788-khfnz\" (UID: \"075b6463-8b4e-47e9-9662-c6fa561e9079\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" Dec 04 22:31:53.736493 master-0 kubenswrapper[33572]: E1204 22:31:53.732232 33572 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 22:31:53.736493 master-0 kubenswrapper[33572]: E1204 22:31:53.732328 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert podName:f9e3f583-1af1-44f2-a6e3-271336e2cf1e nodeName:}" failed. No retries permitted until 2025-12-04 22:31:54.23230619 +0000 UTC m=+777.959831839 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-qr956" (UID: "f9e3f583-1af1-44f2-a6e3-271336e2cf1e") : secret "infra-operator-webhook-server-cert" not found Dec 04 22:31:53.737169 master-0 kubenswrapper[33572]: I1204 22:31:53.730770 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzv7d\" (UniqueName: \"kubernetes.io/projected/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-kube-api-access-xzv7d\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:53.749135 master-0 kubenswrapper[33572]: I1204 22:31:53.749054 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq"] Dec 04 22:31:53.749633 master-0 kubenswrapper[33572]: I1204 22:31:53.749463 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" Dec 04 22:31:53.761927 master-0 kubenswrapper[33572]: I1204 22:31:53.759399 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzv7d\" (UniqueName: \"kubernetes.io/projected/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-kube-api-access-xzv7d\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:53.783569 master-0 kubenswrapper[33572]: I1204 22:31:53.782571 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" Dec 04 22:31:53.787548 master-0 kubenswrapper[33572]: I1204 22:31:53.785776 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr"] Dec 04 22:31:53.787548 master-0 kubenswrapper[33572]: I1204 22:31:53.787226 33572 util.go:30] "No sandbox for pod can be found. 
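The MountVolume.SetUp failure above is the kubelet waiting on a Secret that does not exist yet: the pod references the infra-operator webhook certificate before whatever controller issues it has created it, so the mount is retried after the logged 500ms backoff until the Secret appears. A small client-go sketch for checking whether that Secret exists from outside the node follows; it assumes a reachable kubeconfig at the default location and is only an illustration, not part of the kubelet's own retry logic.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at ~/.kube/config; adjust the path as needed.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Look up the Secret the kubelet could not mount; a NotFound error here means the
	// webhook certificate has not been created yet and the mount will keep retrying.
	_, err = cs.CoreV1().Secrets("openstack-operators").
		Get(context.TODO(), "infra-operator-webhook-server-cert", metav1.GetOptions{})
	fmt.Println("secret lookup result (nil error means it exists):", err)
}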
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" Dec 04 22:31:53.799268 master-0 kubenswrapper[33572]: I1204 22:31:53.799227 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nkbl\" (UniqueName: \"kubernetes.io/projected/075b6463-8b4e-47e9-9662-c6fa561e9079-kube-api-access-8nkbl\") pod \"horizon-operator-controller-manager-f6cc97788-khfnz\" (UID: \"075b6463-8b4e-47e9-9662-c6fa561e9079\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" Dec 04 22:31:53.846014 master-0 kubenswrapper[33572]: I1204 22:31:53.841590 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9px5j\" (UniqueName: \"kubernetes.io/projected/c8e6de34-2747-42ce-b9c1-dfdcd71d3707-kube-api-access-9px5j\") pod \"manila-operator-controller-manager-56f9fbf74b-xsxzr\" (UID: \"c8e6de34-2747-42ce-b9c1-dfdcd71d3707\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" Dec 04 22:31:53.846014 master-0 kubenswrapper[33572]: I1204 22:31:53.841697 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g464z\" (UniqueName: \"kubernetes.io/projected/23f59072-43f4-4dc3-a484-48ae31ed7eee-kube-api-access-g464z\") pod \"keystone-operator-controller-manager-58b8dcc5fb-pnhmq\" (UID: \"23f59072-43f4-4dc3-a484-48ae31ed7eee\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" Dec 04 22:31:53.846014 master-0 kubenswrapper[33572]: I1204 22:31:53.841977 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dv5\" (UniqueName: \"kubernetes.io/projected/5c9990c8-62a6-42fd-9965-50331f773940-kube-api-access-z2dv5\") pod \"ironic-operator-controller-manager-7c9bfd6967-5pn2v\" (UID: \"5c9990c8-62a6-42fd-9965-50331f773940\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" Dec 04 22:31:53.846014 master-0 kubenswrapper[33572]: I1204 22:31:53.844434 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" Dec 04 22:31:53.873750 master-0 kubenswrapper[33572]: I1204 22:31:53.873709 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g464z\" (UniqueName: \"kubernetes.io/projected/23f59072-43f4-4dc3-a484-48ae31ed7eee-kube-api-access-g464z\") pod \"keystone-operator-controller-manager-58b8dcc5fb-pnhmq\" (UID: \"23f59072-43f4-4dc3-a484-48ae31ed7eee\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" Dec 04 22:31:53.880933 master-0 kubenswrapper[33572]: I1204 22:31:53.880391 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz"] Dec 04 22:31:53.882746 master-0 kubenswrapper[33572]: I1204 22:31:53.882025 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" Dec 04 22:31:53.891292 master-0 kubenswrapper[33572]: I1204 22:31:53.887961 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" Dec 04 22:31:53.896212 master-0 kubenswrapper[33572]: I1204 22:31:53.896169 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2dv5\" (UniqueName: \"kubernetes.io/projected/5c9990c8-62a6-42fd-9965-50331f773940-kube-api-access-z2dv5\") pod \"ironic-operator-controller-manager-7c9bfd6967-5pn2v\" (UID: \"5c9990c8-62a6-42fd-9965-50331f773940\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" Dec 04 22:31:53.896308 master-0 kubenswrapper[33572]: I1204 22:31:53.896252 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr"] Dec 04 22:31:53.915422 master-0 kubenswrapper[33572]: I1204 22:31:53.913635 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz"] Dec 04 22:31:53.932237 master-0 kubenswrapper[33572]: I1204 22:31:53.930157 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8"] Dec 04 22:31:53.932237 master-0 kubenswrapper[33572]: I1204 22:31:53.932139 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" Dec 04 22:31:53.944666 master-0 kubenswrapper[33572]: I1204 22:31:53.944418 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lzdr\" (UniqueName: \"kubernetes.io/projected/fb197cd4-44a7-414d-aef1-a6faf66f22d6-kube-api-access-6lzdr\") pod \"mariadb-operator-controller-manager-647d75769b-v8srz\" (UID: \"fb197cd4-44a7-414d-aef1-a6faf66f22d6\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" Dec 04 22:31:53.944666 master-0 kubenswrapper[33572]: I1204 22:31:53.944565 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9px5j\" (UniqueName: \"kubernetes.io/projected/c8e6de34-2747-42ce-b9c1-dfdcd71d3707-kube-api-access-9px5j\") pod \"manila-operator-controller-manager-56f9fbf74b-xsxzr\" (UID: \"c8e6de34-2747-42ce-b9c1-dfdcd71d3707\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" Dec 04 22:31:53.948687 master-0 kubenswrapper[33572]: I1204 22:31:53.947447 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8"] Dec 04 22:31:53.966073 master-0 kubenswrapper[33572]: I1204 22:31:53.964854 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" Dec 04 22:31:54.005092 master-0 kubenswrapper[33572]: I1204 22:31:53.994886 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9px5j\" (UniqueName: \"kubernetes.io/projected/c8e6de34-2747-42ce-b9c1-dfdcd71d3707-kube-api-access-9px5j\") pod \"manila-operator-controller-manager-56f9fbf74b-xsxzr\" (UID: \"c8e6de34-2747-42ce-b9c1-dfdcd71d3707\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" Dec 04 22:31:54.005092 master-0 kubenswrapper[33572]: I1204 22:31:53.994980 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd"] Dec 04 22:31:54.005092 master-0 kubenswrapper[33572]: I1204 22:31:54.001709 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" Dec 04 22:31:54.028532 master-0 kubenswrapper[33572]: I1204 22:31:54.011392 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd"] Dec 04 22:31:54.035793 master-0 kubenswrapper[33572]: I1204 22:31:54.035712 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8"] Dec 04 22:31:54.039036 master-0 kubenswrapper[33572]: I1204 22:31:54.039003 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" Dec 04 22:31:54.049693 master-0 kubenswrapper[33572]: I1204 22:31:54.046647 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6lj\" (UniqueName: \"kubernetes.io/projected/2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3-kube-api-access-4z6lj\") pod \"nova-operator-controller-manager-865fc86d5b-pzbmd\" (UID: \"2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" Dec 04 22:31:54.049693 master-0 kubenswrapper[33572]: I1204 22:31:54.046958 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lzdr\" (UniqueName: \"kubernetes.io/projected/fb197cd4-44a7-414d-aef1-a6faf66f22d6-kube-api-access-6lzdr\") pod \"mariadb-operator-controller-manager-647d75769b-v8srz\" (UID: \"fb197cd4-44a7-414d-aef1-a6faf66f22d6\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" Dec 04 22:31:54.049693 master-0 kubenswrapper[33572]: I1204 22:31:54.047032 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8j5n\" (UniqueName: \"kubernetes.io/projected/fd79fe85-44e9-47e6-857a-672f42581cb8-kube-api-access-v8j5n\") pod \"neutron-operator-controller-manager-7cdd6b54fb-jjxh8\" (UID: \"fd79fe85-44e9-47e6-857a-672f42581cb8\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" Dec 04 22:31:54.053250 master-0 kubenswrapper[33572]: I1204 22:31:54.050310 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" Dec 04 22:31:54.075725 master-0 kubenswrapper[33572]: I1204 22:31:54.075670 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8"] Dec 04 22:31:54.088096 master-0 kubenswrapper[33572]: I1204 22:31:54.088054 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lzdr\" (UniqueName: \"kubernetes.io/projected/fb197cd4-44a7-414d-aef1-a6faf66f22d6-kube-api-access-6lzdr\") pod \"mariadb-operator-controller-manager-647d75769b-v8srz\" (UID: \"fb197cd4-44a7-414d-aef1-a6faf66f22d6\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" Dec 04 22:31:54.098840 master-0 kubenswrapper[33572]: I1204 22:31:54.098780 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-748fk"] Dec 04 22:31:54.100615 master-0 kubenswrapper[33572]: I1204 22:31:54.100584 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" Dec 04 22:31:54.107276 master-0 kubenswrapper[33572]: I1204 22:31:54.107255 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf"] Dec 04 22:31:54.109253 master-0 kubenswrapper[33572]: I1204 22:31:54.109230 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:54.113208 master-0 kubenswrapper[33572]: I1204 22:31:54.111892 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 04 22:31:54.118952 master-0 kubenswrapper[33572]: I1204 22:31:54.118717 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-llths"] Dec 04 22:31:54.129208 master-0 kubenswrapper[33572]: I1204 22:31:54.129155 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" Dec 04 22:31:54.160469 master-0 kubenswrapper[33572]: I1204 22:31:54.150137 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-748fk"] Dec 04 22:31:54.160469 master-0 kubenswrapper[33572]: I1204 22:31:54.157698 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6lj\" (UniqueName: \"kubernetes.io/projected/2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3-kube-api-access-4z6lj\") pod \"nova-operator-controller-manager-865fc86d5b-pzbmd\" (UID: \"2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" Dec 04 22:31:54.160469 master-0 kubenswrapper[33572]: I1204 22:31:54.157886 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v252n\" (UniqueName: \"kubernetes.io/projected/a08eac0d-e5f3-41bd-b255-bb4e6546b7f9-kube-api-access-v252n\") pod \"ovn-operator-controller-manager-647f96877-748fk\" (UID: \"a08eac0d-e5f3-41bd-b255-bb4e6546b7f9\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" Dec 04 22:31:54.160469 master-0 kubenswrapper[33572]: I1204 22:31:54.157981 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwf7\" (UniqueName: \"kubernetes.io/projected/61f59de0-4c3d-40f3-88bc-05e8174f41de-kube-api-access-zkwf7\") pod \"placement-operator-controller-manager-6b64f6f645-llths\" (UID: \"61f59de0-4c3d-40f3-88bc-05e8174f41de\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" Dec 04 22:31:54.160469 master-0 kubenswrapper[33572]: I1204 22:31:54.160447 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:54.160876 master-0 kubenswrapper[33572]: I1204 22:31:54.160624 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9hjs\" (UniqueName: \"kubernetes.io/projected/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-kube-api-access-v9hjs\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:54.160876 master-0 kubenswrapper[33572]: I1204 22:31:54.160707 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8j5n\" (UniqueName: \"kubernetes.io/projected/fd79fe85-44e9-47e6-857a-672f42581cb8-kube-api-access-v8j5n\") pod \"neutron-operator-controller-manager-7cdd6b54fb-jjxh8\" (UID: \"fd79fe85-44e9-47e6-857a-672f42581cb8\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" Dec 04 22:31:54.160876 master-0 kubenswrapper[33572]: I1204 22:31:54.160739 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65zr9\" (UniqueName: \"kubernetes.io/projected/8d0cfe95-c307-45c5-aba1-45c7d5217b2b-kube-api-access-65zr9\") pod 
\"octavia-operator-controller-manager-845b79dc4f-7v5g8\" (UID: \"8d0cfe95-c307-45c5-aba1-45c7d5217b2b\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" Dec 04 22:31:54.163092 master-0 kubenswrapper[33572]: I1204 22:31:54.161700 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" Dec 04 22:31:54.177955 master-0 kubenswrapper[33572]: I1204 22:31:54.172664 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf"] Dec 04 22:31:54.180684 master-0 kubenswrapper[33572]: I1204 22:31:54.179261 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-llths"] Dec 04 22:31:54.195308 master-0 kubenswrapper[33572]: I1204 22:31:54.195249 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8j5n\" (UniqueName: \"kubernetes.io/projected/fd79fe85-44e9-47e6-857a-672f42581cb8-kube-api-access-v8j5n\") pod \"neutron-operator-controller-manager-7cdd6b54fb-jjxh8\" (UID: \"fd79fe85-44e9-47e6-857a-672f42581cb8\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" Dec 04 22:31:54.196388 master-0 kubenswrapper[33572]: I1204 22:31:54.196342 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6lj\" (UniqueName: \"kubernetes.io/projected/2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3-kube-api-access-4z6lj\") pod \"nova-operator-controller-manager-865fc86d5b-pzbmd\" (UID: \"2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" Dec 04 22:31:54.203180 master-0 kubenswrapper[33572]: I1204 22:31:54.203113 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-jbqjt"] Dec 04 22:31:54.204868 master-0 kubenswrapper[33572]: I1204 22:31:54.204839 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" Dec 04 22:31:54.208938 master-0 kubenswrapper[33572]: I1204 22:31:54.208892 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" Dec 04 22:31:54.210471 master-0 kubenswrapper[33572]: I1204 22:31:54.210436 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" Dec 04 22:31:54.213356 master-0 kubenswrapper[33572]: I1204 22:31:54.213025 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-jbqjt"] Dec 04 22:31:54.242616 master-0 kubenswrapper[33572]: I1204 22:31:54.242465 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm"] Dec 04 22:31:54.252095 master-0 kubenswrapper[33572]: I1204 22:31:54.251975 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" Dec 04 22:31:54.255439 master-0 kubenswrapper[33572]: I1204 22:31:54.255374 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm"] Dec 04 22:31:54.262318 master-0 kubenswrapper[33572]: I1204 22:31:54.261128 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65"] Dec 04 22:31:54.264405 master-0 kubenswrapper[33572]: I1204 22:31:54.264218 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" Dec 04 22:31:54.266200 master-0 kubenswrapper[33572]: I1204 22:31:54.266115 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65zr9\" (UniqueName: \"kubernetes.io/projected/8d0cfe95-c307-45c5-aba1-45c7d5217b2b-kube-api-access-65zr9\") pod \"octavia-operator-controller-manager-845b79dc4f-7v5g8\" (UID: \"8d0cfe95-c307-45c5-aba1-45c7d5217b2b\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" Dec 04 22:31:54.266315 master-0 kubenswrapper[33572]: I1204 22:31:54.266231 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7j5\" (UniqueName: \"kubernetes.io/projected/8ea53b4d-8452-4f0d-9508-28101ae503a5-kube-api-access-xc7j5\") pod \"swift-operator-controller-manager-696b999796-jbqjt\" (UID: \"8ea53b4d-8452-4f0d-9508-28101ae503a5\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" Dec 04 22:31:54.266315 master-0 kubenswrapper[33572]: I1204 22:31:54.266301 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v252n\" (UniqueName: \"kubernetes.io/projected/a08eac0d-e5f3-41bd-b255-bb4e6546b7f9-kube-api-access-v252n\") pod \"ovn-operator-controller-manager-647f96877-748fk\" (UID: \"a08eac0d-e5f3-41bd-b255-bb4e6546b7f9\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" Dec 04 22:31:54.267783 master-0 kubenswrapper[33572]: I1204 22:31:54.266340 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:54.267783 master-0 kubenswrapper[33572]: I1204 22:31:54.266381 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwf7\" (UniqueName: \"kubernetes.io/projected/61f59de0-4c3d-40f3-88bc-05e8174f41de-kube-api-access-zkwf7\") pod \"placement-operator-controller-manager-6b64f6f645-llths\" (UID: \"61f59de0-4c3d-40f3-88bc-05e8174f41de\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" Dec 04 22:31:54.267783 master-0 kubenswrapper[33572]: I1204 22:31:54.266422 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:54.267783 master-0 kubenswrapper[33572]: I1204 22:31:54.266516 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9hjs\" (UniqueName: \"kubernetes.io/projected/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-kube-api-access-v9hjs\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:54.267783 master-0 kubenswrapper[33572]: E1204 22:31:54.267641 33572 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 22:31:54.267783 master-0 kubenswrapper[33572]: E1204 22:31:54.267682 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert podName:f9e3f583-1af1-44f2-a6e3-271336e2cf1e nodeName:}" failed. No retries permitted until 2025-12-04 22:31:55.267667701 +0000 UTC m=+778.995193350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-qr956" (UID: "f9e3f583-1af1-44f2-a6e3-271336e2cf1e") : secret "infra-operator-webhook-server-cert" not found Dec 04 22:31:54.292061 master-0 kubenswrapper[33572]: E1204 22:31:54.268043 33572 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:31:54.292061 master-0 kubenswrapper[33572]: E1204 22:31:54.268070 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert podName:590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf nodeName:}" failed. No retries permitted until 2025-12-04 22:31:54.768061862 +0000 UTC m=+778.495587511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" (UID: "590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:31:54.292061 master-0 kubenswrapper[33572]: I1204 22:31:54.282555 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" Dec 04 22:31:54.300535 master-0 kubenswrapper[33572]: I1204 22:31:54.300272 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwf7\" (UniqueName: \"kubernetes.io/projected/61f59de0-4c3d-40f3-88bc-05e8174f41de-kube-api-access-zkwf7\") pod \"placement-operator-controller-manager-6b64f6f645-llths\" (UID: \"61f59de0-4c3d-40f3-88bc-05e8174f41de\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" Dec 04 22:31:54.302185 master-0 kubenswrapper[33572]: I1204 22:31:54.301755 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65zr9\" (UniqueName: \"kubernetes.io/projected/8d0cfe95-c307-45c5-aba1-45c7d5217b2b-kube-api-access-65zr9\") pod \"octavia-operator-controller-manager-845b79dc4f-7v5g8\" (UID: \"8d0cfe95-c307-45c5-aba1-45c7d5217b2b\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" Dec 04 22:31:54.303980 master-0 kubenswrapper[33572]: I1204 22:31:54.303912 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9hjs\" (UniqueName: \"kubernetes.io/projected/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-kube-api-access-v9hjs\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:54.313423 master-0 kubenswrapper[33572]: I1204 22:31:54.309853 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v252n\" (UniqueName: \"kubernetes.io/projected/a08eac0d-e5f3-41bd-b255-bb4e6546b7f9-kube-api-access-v252n\") pod \"ovn-operator-controller-manager-647f96877-748fk\" (UID: \"a08eac0d-e5f3-41bd-b255-bb4e6546b7f9\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" Dec 04 22:31:54.332987 master-0 kubenswrapper[33572]: I1204 22:31:54.332853 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65"] Dec 04 22:31:54.335622 master-0 kubenswrapper[33572]: I1204 22:31:54.334742 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" Dec 04 22:31:54.345776 master-0 kubenswrapper[33572]: I1204 22:31:54.342134 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9"] Dec 04 22:31:54.345776 master-0 kubenswrapper[33572]: I1204 22:31:54.344857 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" Dec 04 22:31:54.373305 master-0 kubenswrapper[33572]: I1204 22:31:54.369879 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9"] Dec 04 22:31:54.376699 master-0 kubenswrapper[33572]: I1204 22:31:54.376355 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7j5\" (UniqueName: \"kubernetes.io/projected/8ea53b4d-8452-4f0d-9508-28101ae503a5-kube-api-access-xc7j5\") pod \"swift-operator-controller-manager-696b999796-jbqjt\" (UID: \"8ea53b4d-8452-4f0d-9508-28101ae503a5\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" Dec 04 22:31:54.376699 master-0 kubenswrapper[33572]: I1204 22:31:54.376547 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flq4h\" (UniqueName: \"kubernetes.io/projected/de6b792f-5002-4bc4-8586-b8f76e57bdf1-kube-api-access-flq4h\") pod \"watcher-operator-controller-manager-6b9b669fdb-r87g9\" (UID: \"de6b792f-5002-4bc4-8586-b8f76e57bdf1\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" Dec 04 22:31:54.402473 master-0 kubenswrapper[33572]: I1204 22:31:54.402406 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74"] Dec 04 22:31:54.403932 master-0 kubenswrapper[33572]: I1204 22:31:54.403902 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:54.408286 master-0 kubenswrapper[33572]: I1204 22:31:54.408224 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 04 22:31:54.408286 master-0 kubenswrapper[33572]: I1204 22:31:54.408254 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 04 22:31:54.409308 master-0 kubenswrapper[33572]: I1204 22:31:54.409263 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7j5\" (UniqueName: \"kubernetes.io/projected/8ea53b4d-8452-4f0d-9508-28101ae503a5-kube-api-access-xc7j5\") pod \"swift-operator-controller-manager-696b999796-jbqjt\" (UID: \"8ea53b4d-8452-4f0d-9508-28101ae503a5\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" Dec 04 22:31:54.435753 master-0 kubenswrapper[33572]: I1204 22:31:54.435682 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74"] Dec 04 22:31:54.459002 master-0 kubenswrapper[33572]: I1204 22:31:54.458952 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg"] Dec 04 22:31:54.460347 master-0 kubenswrapper[33572]: I1204 22:31:54.460318 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" Dec 04 22:31:54.462666 master-0 kubenswrapper[33572]: I1204 22:31:54.462624 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" Dec 04 22:31:54.479968 master-0 kubenswrapper[33572]: I1204 22:31:54.475831 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg"] Dec 04 22:31:54.496821 master-0 kubenswrapper[33572]: I1204 22:31:54.496742 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2t2x\" (UniqueName: \"kubernetes.io/projected/502a1fc5-ff64-42b5-be7c-9cba5c1044c5-kube-api-access-l2t2x\") pod \"test-operator-controller-manager-57dfcdd5b8-qqh65\" (UID: \"502a1fc5-ff64-42b5-be7c-9cba5c1044c5\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" Dec 04 22:31:54.496893 master-0 kubenswrapper[33572]: I1204 22:31:54.496822 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:54.496893 master-0 kubenswrapper[33572]: I1204 22:31:54.496845 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vzpc\" (UniqueName: \"kubernetes.io/projected/e76f37b2-5ae2-4931-ac95-8d6161415d17-kube-api-access-6vzpc\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:54.496960 master-0 kubenswrapper[33572]: I1204 22:31:54.496913 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flq4h\" (UniqueName: \"kubernetes.io/projected/de6b792f-5002-4bc4-8586-b8f76e57bdf1-kube-api-access-flq4h\") pod \"watcher-operator-controller-manager-6b9b669fdb-r87g9\" (UID: \"de6b792f-5002-4bc4-8586-b8f76e57bdf1\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" Dec 04 22:31:54.496960 master-0 kubenswrapper[33572]: I1204 22:31:54.496942 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mc9x\" (UniqueName: \"kubernetes.io/projected/b64f7476-1cd1-4d66-ad63-0a78ff022873-kube-api-access-9mc9x\") pod \"telemetry-operator-controller-manager-7b5867bfc7-4nnvm\" (UID: \"b64f7476-1cd1-4d66-ad63-0a78ff022873\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" Dec 04 22:31:54.497555 master-0 kubenswrapper[33572]: I1204 22:31:54.497019 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qjfs\" (UniqueName: \"kubernetes.io/projected/7eebd686-6e62-4fe5-9d52-f09c57e50b2e-kube-api-access-5qjfs\") pod \"rabbitmq-cluster-operator-manager-78955d896f-qffjg\" (UID: \"7eebd686-6e62-4fe5-9d52-f09c57e50b2e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" Dec 04 22:31:54.497555 master-0 kubenswrapper[33572]: I1204 22:31:54.497175 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod 
\"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:54.543524 master-0 kubenswrapper[33572]: I1204 22:31:54.543183 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" Dec 04 22:31:54.543736 master-0 kubenswrapper[33572]: I1204 22:31:54.543598 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" Dec 04 22:31:54.570163 master-0 kubenswrapper[33572]: I1204 22:31:54.570085 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flq4h\" (UniqueName: \"kubernetes.io/projected/de6b792f-5002-4bc4-8586-b8f76e57bdf1-kube-api-access-flq4h\") pod \"watcher-operator-controller-manager-6b9b669fdb-r87g9\" (UID: \"de6b792f-5002-4bc4-8586-b8f76e57bdf1\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" Dec 04 22:31:54.600187 master-0 kubenswrapper[33572]: I1204 22:31:54.599891 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mc9x\" (UniqueName: \"kubernetes.io/projected/b64f7476-1cd1-4d66-ad63-0a78ff022873-kube-api-access-9mc9x\") pod \"telemetry-operator-controller-manager-7b5867bfc7-4nnvm\" (UID: \"b64f7476-1cd1-4d66-ad63-0a78ff022873\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" Dec 04 22:31:54.600820 master-0 kubenswrapper[33572]: I1204 22:31:54.600610 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qjfs\" (UniqueName: \"kubernetes.io/projected/7eebd686-6e62-4fe5-9d52-f09c57e50b2e-kube-api-access-5qjfs\") pod \"rabbitmq-cluster-operator-manager-78955d896f-qffjg\" (UID: \"7eebd686-6e62-4fe5-9d52-f09c57e50b2e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" Dec 04 22:31:54.602230 master-0 kubenswrapper[33572]: I1204 22:31:54.602193 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:54.602458 master-0 kubenswrapper[33572]: I1204 22:31:54.602428 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2t2x\" (UniqueName: \"kubernetes.io/projected/502a1fc5-ff64-42b5-be7c-9cba5c1044c5-kube-api-access-l2t2x\") pod \"test-operator-controller-manager-57dfcdd5b8-qqh65\" (UID: \"502a1fc5-ff64-42b5-be7c-9cba5c1044c5\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" Dec 04 22:31:54.603172 master-0 kubenswrapper[33572]: I1204 22:31:54.602481 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:54.603825 master-0 kubenswrapper[33572]: I1204 22:31:54.603788 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vzpc\" (UniqueName: \"kubernetes.io/projected/e76f37b2-5ae2-4931-ac95-8d6161415d17-kube-api-access-6vzpc\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:54.604682 master-0 kubenswrapper[33572]: E1204 22:31:54.604649 33572 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 22:31:54.604751 master-0 kubenswrapper[33572]: E1204 22:31:54.604697 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:31:55.1046832 +0000 UTC m=+778.832208849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "metrics-server-cert" not found Dec 04 22:31:54.605964 master-0 kubenswrapper[33572]: E1204 22:31:54.605875 33572 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 22:31:54.605964 master-0 kubenswrapper[33572]: E1204 22:31:54.605940 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:31:55.105909515 +0000 UTC m=+778.833435164 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "webhook-server-cert" not found Dec 04 22:31:54.617930 master-0 kubenswrapper[33572]: I1204 22:31:54.617874 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mc9x\" (UniqueName: \"kubernetes.io/projected/b64f7476-1cd1-4d66-ad63-0a78ff022873-kube-api-access-9mc9x\") pod \"telemetry-operator-controller-manager-7b5867bfc7-4nnvm\" (UID: \"b64f7476-1cd1-4d66-ad63-0a78ff022873\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" Dec 04 22:31:54.625932 master-0 kubenswrapper[33572]: I1204 22:31:54.625772 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qjfs\" (UniqueName: \"kubernetes.io/projected/7eebd686-6e62-4fe5-9d52-f09c57e50b2e-kube-api-access-5qjfs\") pod \"rabbitmq-cluster-operator-manager-78955d896f-qffjg\" (UID: \"7eebd686-6e62-4fe5-9d52-f09c57e50b2e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" Dec 04 22:31:54.637649 master-0 kubenswrapper[33572]: I1204 22:31:54.637566 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2t2x\" (UniqueName: \"kubernetes.io/projected/502a1fc5-ff64-42b5-be7c-9cba5c1044c5-kube-api-access-l2t2x\") pod \"test-operator-controller-manager-57dfcdd5b8-qqh65\" (UID: \"502a1fc5-ff64-42b5-be7c-9cba5c1044c5\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" Dec 04 22:31:54.640176 master-0 kubenswrapper[33572]: I1204 22:31:54.640138 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vzpc\" (UniqueName: \"kubernetes.io/projected/e76f37b2-5ae2-4931-ac95-8d6161415d17-kube-api-access-6vzpc\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:54.661299 master-0 kubenswrapper[33572]: I1204 22:31:54.661253 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" Dec 04 22:31:54.809552 master-0 kubenswrapper[33572]: I1204 22:31:54.808698 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:54.809552 master-0 kubenswrapper[33572]: E1204 22:31:54.808972 33572 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:31:54.809552 master-0 kubenswrapper[33572]: E1204 22:31:54.809021 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert podName:590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf nodeName:}" failed. No retries permitted until 2025-12-04 22:31:55.809004818 +0000 UTC m=+779.536530467 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" (UID: "590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:31:54.816011 master-0 kubenswrapper[33572]: I1204 22:31:54.811777 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" Dec 04 22:31:54.919989 master-0 kubenswrapper[33572]: I1204 22:31:54.919952 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" Dec 04 22:31:54.932429 master-0 kubenswrapper[33572]: I1204 22:31:54.931324 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" Dec 04 22:31:55.046847 master-0 kubenswrapper[33572]: I1204 22:31:55.045613 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p"] Dec 04 22:31:55.055888 master-0 kubenswrapper[33572]: I1204 22:31:55.055828 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r"] Dec 04 22:31:55.072177 master-0 kubenswrapper[33572]: I1204 22:31:55.071992 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v"] Dec 04 22:31:55.088151 master-0 kubenswrapper[33572]: W1204 22:31:55.085756 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c529a2_072e_4777_9fb9_ec34aa5396ae.slice/crio-86fde66adefb3ba5cd1355a5dbd3538aeb07724f99581271b785978072c67710 WatchSource:0}: Error finding container 86fde66adefb3ba5cd1355a5dbd3538aeb07724f99581271b785978072c67710: Status 404 returned error can't find the container with id 86fde66adefb3ba5cd1355a5dbd3538aeb07724f99581271b785978072c67710 Dec 04 22:31:55.095395 master-0 kubenswrapper[33572]: I1204 22:31:55.095333 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k"] Dec 04 22:31:55.104360 master-0 kubenswrapper[33572]: W1204 22:31:55.104144 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9cefa2f_e4d2_4c8a_b27c_c71af710a6df.slice/crio-27acd50625e2dd3a79472f06fc8291c62178ee6fdc003508b8a982f85f4e0235 WatchSource:0}: Error finding container 27acd50625e2dd3a79472f06fc8291c62178ee6fdc003508b8a982f85f4e0235: Status 404 returned error can't find the container with id 27acd50625e2dd3a79472f06fc8291c62178ee6fdc003508b8a982f85f4e0235 Dec 04 22:31:55.118703 master-0 kubenswrapper[33572]: I1204 22:31:55.116693 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:55.118703 master-0 kubenswrapper[33572]: E1204 22:31:55.116878 33572 secret.go:189] Couldn't get secret 
openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 22:31:55.118703 master-0 kubenswrapper[33572]: E1204 22:31:55.116983 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:31:56.116959001 +0000 UTC m=+779.844484740 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "metrics-server-cert" not found Dec 04 22:31:55.118703 master-0 kubenswrapper[33572]: I1204 22:31:55.117022 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:55.118703 master-0 kubenswrapper[33572]: E1204 22:31:55.117211 33572 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 22:31:55.118703 master-0 kubenswrapper[33572]: E1204 22:31:55.117327 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:31:56.11730857 +0000 UTC m=+779.844834219 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "webhook-server-cert" not found Dec 04 22:31:55.136108 master-0 kubenswrapper[33572]: I1204 22:31:55.136051 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl"] Dec 04 22:31:55.319859 master-0 kubenswrapper[33572]: I1204 22:31:55.319701 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:55.320082 master-0 kubenswrapper[33572]: E1204 22:31:55.319895 33572 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 22:31:55.320082 master-0 kubenswrapper[33572]: E1204 22:31:55.319970 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert podName:f9e3f583-1af1-44f2-a6e3-271336e2cf1e nodeName:}" failed. No retries permitted until 2025-12-04 22:31:57.319951122 +0000 UTC m=+781.047476781 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-qr956" (UID: "f9e3f583-1af1-44f2-a6e3-271336e2cf1e") : secret "infra-operator-webhook-server-cert" not found Dec 04 22:31:55.486422 master-0 kubenswrapper[33572]: I1204 22:31:55.486256 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" event={"ID":"f9ec5f3f-d171-4fc3-abb1-489d49fe30d9","Type":"ContainerStarted","Data":"876289a9fabff501cbf3adc476ef33c56225d52e9257a8f0e50d3c9c948a7d11"} Dec 04 22:31:55.490040 master-0 kubenswrapper[33572]: I1204 22:31:55.488090 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" event={"ID":"c9cefa2f-e4d2-4c8a-b27c-c71af710a6df","Type":"ContainerStarted","Data":"27acd50625e2dd3a79472f06fc8291c62178ee6fdc003508b8a982f85f4e0235"} Dec 04 22:31:55.490040 master-0 kubenswrapper[33572]: I1204 22:31:55.489779 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" event={"ID":"68c529a2-072e-4777-9fb9-ec34aa5396ae","Type":"ContainerStarted","Data":"86fde66adefb3ba5cd1355a5dbd3538aeb07724f99581271b785978072c67710"} Dec 04 22:31:55.491971 master-0 kubenswrapper[33572]: I1204 22:31:55.491937 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" event={"ID":"e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca","Type":"ContainerStarted","Data":"bbcf9a19b750967626eb39f6da24529159a1c604642f4a19adda1b9df6227fec"} Dec 04 22:31:55.492942 master-0 kubenswrapper[33572]: I1204 22:31:55.492905 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" event={"ID":"1d25b1d0-f7a6-451b-897f-474f32b23ef1","Type":"ContainerStarted","Data":"dda9cc5d3921d09c722d33e9d344c4e1ce20e76ea5c4eec556381184804a0c27"} Dec 04 22:31:55.836090 master-0 kubenswrapper[33572]: I1204 22:31:55.829101 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:55.836090 master-0 kubenswrapper[33572]: E1204 22:31:55.829635 33572 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:31:55.836090 master-0 kubenswrapper[33572]: E1204 22:31:55.833064 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert podName:590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf nodeName:}" failed. No retries permitted until 2025-12-04 22:31:57.832976514 +0000 UTC m=+781.560502153 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" (UID: "590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:31:56.139598 master-0 kubenswrapper[33572]: I1204 22:31:56.139435 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq"] Dec 04 22:31:56.144010 master-0 kubenswrapper[33572]: I1204 22:31:56.142624 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:56.144010 master-0 kubenswrapper[33572]: E1204 22:31:56.142784 33572 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 22:31:56.144010 master-0 kubenswrapper[33572]: E1204 22:31:56.142861 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:31:58.142842449 +0000 UTC m=+781.870368098 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "webhook-server-cert" not found Dec 04 22:31:56.144010 master-0 kubenswrapper[33572]: I1204 22:31:56.143903 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:56.145744 master-0 kubenswrapper[33572]: E1204 22:31:56.143981 33572 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 22:31:56.145912 master-0 kubenswrapper[33572]: E1204 22:31:56.145880 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:31:58.145839692 +0000 UTC m=+781.873365391 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "metrics-server-cert" not found Dec 04 22:31:56.166333 master-0 kubenswrapper[33572]: W1204 22:31:56.165270 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb197cd4_44a7_414d_aef1_a6faf66f22d6.slice/crio-ee0932108b575f9e2a9aa09934616c1c79d2f714519b0805077a4c780ab73d83 WatchSource:0}: Error finding container ee0932108b575f9e2a9aa09934616c1c79d2f714519b0805077a4c780ab73d83: Status 404 returned error can't find the container with id ee0932108b575f9e2a9aa09934616c1c79d2f714519b0805077a4c780ab73d83 Dec 04 22:31:56.167851 master-0 kubenswrapper[33572]: W1204 22:31:56.167778 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea53b4d_8452_4f0d_9508_28101ae503a5.slice/crio-e56833f2ecd26b14c5e01396e4b186868fab187b61bdbe7e1f53bfdd1213e049 WatchSource:0}: Error finding container e56833f2ecd26b14c5e01396e4b186868fab187b61bdbe7e1f53bfdd1213e049: Status 404 returned error can't find the container with id e56833f2ecd26b14c5e01396e4b186868fab187b61bdbe7e1f53bfdd1213e049 Dec 04 22:31:56.192341 master-0 kubenswrapper[33572]: W1204 22:31:56.191774 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61f59de0_4c3d_40f3_88bc_05e8174f41de.slice/crio-5a9dfc46b6f26544e358272d5e8c8fabad75ca856be7754bdb2cb75d5968fcff WatchSource:0}: Error finding container 5a9dfc46b6f26544e358272d5e8c8fabad75ca856be7754bdb2cb75d5968fcff: Status 404 returned error can't find the container with id 5a9dfc46b6f26544e358272d5e8c8fabad75ca856be7754bdb2cb75d5968fcff Dec 04 22:31:56.231752 master-0 kubenswrapper[33572]: I1204 22:31:56.231686 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz"] Dec 04 22:31:56.271211 master-0 kubenswrapper[33572]: I1204 22:31:56.271171 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-jbqjt"] Dec 04 22:31:56.281408 master-0 kubenswrapper[33572]: E1204 22:31:56.280902 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:50,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:30,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:10,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v252n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-647f96877-748fk_openstack-operators(a08eac0d-e5f3-41bd-b255-bb4e6546b7f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:31:56.282690 master-0 kubenswrapper[33572]: W1204 22:31:56.281801 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8e6de34_2747_42ce_b9c1_dfdcd71d3707.slice/crio-3595f4a948945eacb62057eb00e4ad6033dd1685fb3b1c37d0dff2845c2edb3f WatchSource:0}: Error finding container 3595f4a948945eacb62057eb00e4ad6033dd1685fb3b1c37d0dff2845c2edb3f: Status 404 returned error can't find the container with id 3595f4a948945eacb62057eb00e4ad6033dd1685fb3b1c37d0dff2845c2edb3f Dec 04 22:31:56.284832 master-0 kubenswrapper[33572]: E1204 22:31:56.284780 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v252n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-647f96877-748fk_openstack-operators(a08eac0d-e5f3-41bd-b255-bb4e6546b7f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:31:56.286012 master-0 kubenswrapper[33572]: E1204 22:31:56.285951 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" podUID="a08eac0d-e5f3-41bd-b255-bb4e6546b7f9" Dec 04 22:31:56.293725 master-0 kubenswrapper[33572]: E1204 22:31:56.293672 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:50,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:30,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:10,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9px5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-56f9fbf74b-xsxzr_openstack-operators(c8e6de34-2747-42ce-b9c1-dfdcd71d3707): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:31:56.297196 master-0 kubenswrapper[33572]: E1204 22:31:56.296988 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9px5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-56f9fbf74b-xsxzr_openstack-operators(c8e6de34-2747-42ce-b9c1-dfdcd71d3707): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:31:56.298330 master-0 kubenswrapper[33572]: E1204 22:31:56.298257 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" podUID="c8e6de34-2747-42ce-b9c1-dfdcd71d3707" Dec 04 22:31:56.304603 master-0 kubenswrapper[33572]: I1204 22:31:56.304547 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8"] Dec 04 22:31:56.320557 master-0 kubenswrapper[33572]: I1204 22:31:56.320430 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v"] Dec 04 22:31:56.331283 master-0 kubenswrapper[33572]: I1204 22:31:56.331227 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd"] Dec 04 22:31:56.370111 master-0 kubenswrapper[33572]: I1204 22:31:56.370047 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-llths"] Dec 04 22:31:56.389492 master-0 kubenswrapper[33572]: I1204 22:31:56.384191 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9"] Dec 04 22:31:56.395634 master-0 kubenswrapper[33572]: I1204 22:31:56.395487 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8"] Dec 04 22:31:56.409925 master-0 kubenswrapper[33572]: I1204 22:31:56.409824 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz"] Dec 04 22:31:56.419403 master-0 kubenswrapper[33572]: I1204 22:31:56.419352 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-748fk"] Dec 04 22:31:56.451565 master-0 kubenswrapper[33572]: I1204 22:31:56.451469 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr"] Dec 04 22:31:56.505304 master-0 kubenswrapper[33572]: I1204 22:31:56.505230 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" event={"ID":"a08eac0d-e5f3-41bd-b255-bb4e6546b7f9","Type":"ContainerStarted","Data":"748be8d8926c8c50af4c3e7cd9b0ef15d3def378f499f534d1deaa5aa9fa78b1"} Dec 04 22:31:56.507932 master-0 kubenswrapper[33572]: I1204 22:31:56.507874 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" event={"ID":"61f59de0-4c3d-40f3-88bc-05e8174f41de","Type":"ContainerStarted","Data":"5a9dfc46b6f26544e358272d5e8c8fabad75ca856be7754bdb2cb75d5968fcff"} Dec 04 22:31:56.510287 master-0 kubenswrapper[33572]: I1204 22:31:56.510231 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" event={"ID":"c8e6de34-2747-42ce-b9c1-dfdcd71d3707","Type":"ContainerStarted","Data":"3595f4a948945eacb62057eb00e4ad6033dd1685fb3b1c37d0dff2845c2edb3f"} Dec 04 22:31:56.512926 master-0 kubenswrapper[33572]: E1204 22:31:56.512524 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" podUID="a08eac0d-e5f3-41bd-b255-bb4e6546b7f9" Dec 04 22:31:56.514793 master-0 kubenswrapper[33572]: I1204 22:31:56.514755 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" 
event={"ID":"23f59072-43f4-4dc3-a484-48ae31ed7eee","Type":"ContainerStarted","Data":"0524ed850756903cf51acfc283b64baad8c45cbaebc21a628a4b433a400da0ad"} Dec 04 22:31:56.515752 master-0 kubenswrapper[33572]: E1204 22:31:56.515715 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" podUID="c8e6de34-2747-42ce-b9c1-dfdcd71d3707" Dec 04 22:31:56.518209 master-0 kubenswrapper[33572]: I1204 22:31:56.518170 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" event={"ID":"075b6463-8b4e-47e9-9662-c6fa561e9079","Type":"ContainerStarted","Data":"b7a34424402491e149e50b98fca3f396dbb32a894ac6a67c7bcf3d90a963a359"} Dec 04 22:31:56.522576 master-0 kubenswrapper[33572]: I1204 22:31:56.522473 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" event={"ID":"fd79fe85-44e9-47e6-857a-672f42581cb8","Type":"ContainerStarted","Data":"0d88b4c26b385c0ce5c15f45177c9d1974b781bc175e5f021bc123ea148c7d17"} Dec 04 22:31:56.646630 master-0 kubenswrapper[33572]: I1204 22:31:56.646419 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" event={"ID":"5c9990c8-62a6-42fd-9965-50331f773940","Type":"ContainerStarted","Data":"ee2651c45adb5cda65e1a18102b6750118bb6e566f1ad97a1fba57ceea1b473c"} Dec 04 22:31:56.646630 master-0 kubenswrapper[33572]: I1204 22:31:56.646483 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" event={"ID":"de6b792f-5002-4bc4-8586-b8f76e57bdf1","Type":"ContainerStarted","Data":"5f9c9425f3708e62d3c6e9780e2ae6ef8995d62da1dac1f578ec66cfb48118a7"} Dec 04 22:31:56.646630 master-0 kubenswrapper[33572]: I1204 22:31:56.646523 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" event={"ID":"2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3","Type":"ContainerStarted","Data":"9d950ecd40b31324e33ec13bc34b53cb2255749d698e43ffc8fa12bf639bac66"} Dec 04 22:31:56.646630 master-0 kubenswrapper[33572]: I1204 22:31:56.646629 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" event={"ID":"fb197cd4-44a7-414d-aef1-a6faf66f22d6","Type":"ContainerStarted","Data":"ee0932108b575f9e2a9aa09934616c1c79d2f714519b0805077a4c780ab73d83"} Dec 04 22:31:56.646931 master-0 kubenswrapper[33572]: I1204 22:31:56.646644 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" event={"ID":"8d0cfe95-c307-45c5-aba1-45c7d5217b2b","Type":"ContainerStarted","Data":"3d5e44806714e555179a41c0b29393dee1e76c6528691c6ea739935fe8f3eb3c"} Dec 04 22:31:56.650198 master-0 kubenswrapper[33572]: I1204 22:31:56.650053 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" 
event={"ID":"8ea53b4d-8452-4f0d-9508-28101ae503a5","Type":"ContainerStarted","Data":"e56833f2ecd26b14c5e01396e4b186868fab187b61bdbe7e1f53bfdd1213e049"} Dec 04 22:31:56.666196 master-0 kubenswrapper[33572]: I1204 22:31:56.665051 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg"] Dec 04 22:31:56.712838 master-0 kubenswrapper[33572]: I1204 22:31:56.710570 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm"] Dec 04 22:31:56.735453 master-0 kubenswrapper[33572]: I1204 22:31:56.735245 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65"] Dec 04 22:31:56.749520 master-0 kubenswrapper[33572]: E1204 22:31:56.749437 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:50,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:30,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:10,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5qjfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000810000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-78955d896f-qffjg_openstack-operators(7eebd686-6e62-4fe5-9d52-f09c57e50b2e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:31:56.752643 master-0 kubenswrapper[33572]: E1204 22:31:56.752585 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" podUID="7eebd686-6e62-4fe5-9d52-f09c57e50b2e" Dec 04 22:31:57.406447 master-0 kubenswrapper[33572]: I1204 22:31:57.406352 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:31:57.406778 master-0 kubenswrapper[33572]: E1204 22:31:57.406647 33572 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 22:31:57.406840 master-0 kubenswrapper[33572]: E1204 22:31:57.406775 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert podName:f9e3f583-1af1-44f2-a6e3-271336e2cf1e nodeName:}" failed. No retries permitted until 2025-12-04 22:32:01.4067474 +0000 UTC m=+785.134273039 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-qr956" (UID: "f9e3f583-1af1-44f2-a6e3-271336e2cf1e") : secret "infra-operator-webhook-server-cert" not found Dec 04 22:31:57.664717 master-0 kubenswrapper[33572]: I1204 22:31:57.664598 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" event={"ID":"502a1fc5-ff64-42b5-be7c-9cba5c1044c5","Type":"ContainerStarted","Data":"dd7f69481fa33aee92bf7106efae59dea9e39b5f0dc21f4babbf107bfb379a98"} Dec 04 22:31:57.670314 master-0 kubenswrapper[33572]: I1204 22:31:57.670247 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" event={"ID":"7eebd686-6e62-4fe5-9d52-f09c57e50b2e","Type":"ContainerStarted","Data":"84cbb4a349d4f709f55009ae3556b3332e146988939f847960f3a4eada7aa7bd"} Dec 04 22:31:57.671881 master-0 kubenswrapper[33572]: E1204 22:31:57.671673 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" podUID="7eebd686-6e62-4fe5-9d52-f09c57e50b2e" Dec 04 22:31:57.673578 master-0 kubenswrapper[33572]: I1204 22:31:57.673533 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" event={"ID":"b64f7476-1cd1-4d66-ad63-0a78ff022873","Type":"ContainerStarted","Data":"bd162b70179e72e234f240ca8691e0ae394be63c9d84026ca3cc17e3cbeced1c"} Dec 04 22:31:57.676953 master-0 kubenswrapper[33572]: E1204 22:31:57.676902 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" podUID="c8e6de34-2747-42ce-b9c1-dfdcd71d3707" Dec 04 22:31:57.682781 master-0 kubenswrapper[33572]: E1204 22:31:57.682710 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" podUID="a08eac0d-e5f3-41bd-b255-bb4e6546b7f9" Dec 04 22:31:57.921102 master-0 kubenswrapper[33572]: I1204 22:31:57.918586 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:31:57.921102 master-0 kubenswrapper[33572]: E1204 22:31:57.919110 33572 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:31:57.921102 master-0 kubenswrapper[33572]: E1204 22:31:57.919170 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert podName:590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf nodeName:}" failed. No retries permitted until 2025-12-04 22:32:01.919152134 +0000 UTC m=+785.646677783 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" (UID: "590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:31:58.224582 master-0 kubenswrapper[33572]: I1204 22:31:58.224332 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:58.224582 master-0 kubenswrapper[33572]: I1204 22:31:58.224446 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:31:58.224856 master-0 kubenswrapper[33572]: E1204 22:31:58.224630 33572 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 22:31:58.224856 master-0 kubenswrapper[33572]: E1204 22:31:58.224684 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:32:02.224665199 +0000 UTC m=+785.952190848 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "metrics-server-cert" not found Dec 04 22:31:58.226545 master-0 kubenswrapper[33572]: E1204 22:31:58.225008 33572 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 22:31:58.226545 master-0 kubenswrapper[33572]: E1204 22:31:58.225040 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:32:02.22503184 +0000 UTC m=+785.952557489 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "webhook-server-cert" not found Dec 04 22:31:58.685548 master-0 kubenswrapper[33572]: E1204 22:31:58.685165 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" podUID="7eebd686-6e62-4fe5-9d52-f09c57e50b2e" Dec 04 22:32:01.506844 master-0 kubenswrapper[33572]: I1204 22:32:01.506739 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:32:01.507807 master-0 kubenswrapper[33572]: E1204 22:32:01.507037 33572 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 22:32:01.507807 master-0 kubenswrapper[33572]: E1204 22:32:01.507082 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert podName:f9e3f583-1af1-44f2-a6e3-271336e2cf1e nodeName:}" failed. No retries permitted until 2025-12-04 22:32:09.507069644 +0000 UTC m=+793.234595293 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-qr956" (UID: "f9e3f583-1af1-44f2-a6e3-271336e2cf1e") : secret "infra-operator-webhook-server-cert" not found Dec 04 22:32:02.016821 master-0 kubenswrapper[33572]: I1204 22:32:02.016722 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:32:02.017203 master-0 kubenswrapper[33572]: E1204 22:32:02.016914 33572 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:32:02.017203 master-0 kubenswrapper[33572]: E1204 22:32:02.016982 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert podName:590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf nodeName:}" failed. No retries permitted until 2025-12-04 22:32:10.016962558 +0000 UTC m=+793.744488207 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" (UID: "590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:32:02.325222 master-0 kubenswrapper[33572]: I1204 22:32:02.324951 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:02.325222 master-0 kubenswrapper[33572]: E1204 22:32:02.325175 33572 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 22:32:02.325728 master-0 kubenswrapper[33572]: I1204 22:32:02.325252 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:02.325728 master-0 kubenswrapper[33572]: E1204 22:32:02.325278 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:32:10.32525572 +0000 UTC m=+794.052781369 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "metrics-server-cert" not found Dec 04 22:32:02.325728 master-0 kubenswrapper[33572]: E1204 22:32:02.325473 33572 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 22:32:02.325728 master-0 kubenswrapper[33572]: E1204 22:32:02.325570 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:32:10.325547628 +0000 UTC m=+794.053073277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "webhook-server-cert" not found Dec 04 22:32:02.389712 master-0 kubenswrapper[33572]: I1204 22:32:02.389607 33572 scope.go:117] "RemoveContainer" containerID="4c5f8fe348cfd361d4cff444c1a27d696284b954e7c06243153cba9f36ad5161" Dec 04 22:32:09.587195 master-0 kubenswrapper[33572]: I1204 22:32:09.587114 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:32:09.587980 master-0 kubenswrapper[33572]: E1204 22:32:09.587333 33572 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 22:32:09.587980 master-0 kubenswrapper[33572]: E1204 22:32:09.587461 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert podName:f9e3f583-1af1-44f2-a6e3-271336e2cf1e nodeName:}" failed. No retries permitted until 2025-12-04 22:32:25.587429526 +0000 UTC m=+809.314955205 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-qr956" (UID: "f9e3f583-1af1-44f2-a6e3-271336e2cf1e") : secret "infra-operator-webhook-server-cert" not found Dec 04 22:32:10.097005 master-0 kubenswrapper[33572]: I1204 22:32:10.096927 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:32:10.097268 master-0 kubenswrapper[33572]: E1204 22:32:10.097183 33572 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:32:10.097311 master-0 kubenswrapper[33572]: E1204 22:32:10.097292 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert podName:590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf nodeName:}" failed. No retries permitted until 2025-12-04 22:32:26.097269479 +0000 UTC m=+809.824795138 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" (UID: "590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 22:32:10.401629 master-0 kubenswrapper[33572]: I1204 22:32:10.401415 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:10.401629 master-0 kubenswrapper[33572]: I1204 22:32:10.401627 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:10.401979 master-0 kubenswrapper[33572]: E1204 22:32:10.401635 33572 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 22:32:10.401979 master-0 kubenswrapper[33572]: E1204 22:32:10.401717 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:32:26.401699733 +0000 UTC m=+810.129225382 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "metrics-server-cert" not found Dec 04 22:32:10.401979 master-0 kubenswrapper[33572]: E1204 22:32:10.401847 33572 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 22:32:10.401979 master-0 kubenswrapper[33572]: E1204 22:32:10.401928 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs podName:e76f37b2-5ae2-4931-ac95-8d6161415d17 nodeName:}" failed. No retries permitted until 2025-12-04 22:32:26.401909279 +0000 UTC m=+810.129434938 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-gvd74" (UID: "e76f37b2-5ae2-4931-ac95-8d6161415d17") : secret "webhook-server-cert" not found Dec 04 22:32:13.763571 master-0 kubenswrapper[33572]: E1204 22:32:13.763521 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8nkbl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-f6cc97788-khfnz_openstack-operators(075b6463-8b4e-47e9-9662-c6fa561e9079): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:32:13.765698 master-0 kubenswrapper[33572]: E1204 22:32:13.765642 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" podUID="075b6463-8b4e-47e9-9662-c6fa561e9079" Dec 04 22:32:13.766376 master-0 kubenswrapper[33572]: E1204 22:32:13.766323 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nz54c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-7fd96594c7-5sgkl_openstack-operators(1d25b1d0-f7a6-451b-897f-474f32b23ef1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:32:13.770809 master-0 kubenswrapper[33572]: E1204 22:32:13.770772 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" podUID="1d25b1d0-f7a6-451b-897f-474f32b23ef1" Dec 04 22:32:13.818096 master-0 kubenswrapper[33572]: E1204 22:32:13.818038 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v252n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovn-operator-controller-manager-647f96877-748fk_openstack-operators(a08eac0d-e5f3-41bd-b255-bb4e6546b7f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:32:13.820275 master-0 kubenswrapper[33572]: E1204 22:32:13.820214 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" podUID="a08eac0d-e5f3-41bd-b255-bb4e6546b7f9" Dec 04 22:32:13.910409 master-0 kubenswrapper[33572]: E1204 22:32:13.910170 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n79m7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-84bc9f68f5-7rc6r_openstack-operators(e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:32:13.911006 master-0 kubenswrapper[33572]: E1204 22:32:13.910945 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xc7j5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000810000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-696b999796-jbqjt_openstack-operators(8ea53b4d-8452-4f0d-9508-28101ae503a5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:32:13.911370 master-0 kubenswrapper[33572]: E1204 22:32:13.911327 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" podUID="e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca" Dec 04 22:32:13.916405 master-0 kubenswrapper[33572]: E1204 22:32:13.915461 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" podUID="8ea53b4d-8452-4f0d-9508-28101ae503a5" Dec 04 22:32:13.928974 master-0 kubenswrapper[33572]: E1204 22:32:13.928897 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9mc9x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7b5867bfc7-4nnvm_openstack-operators(b64f7476-1cd1-4d66-ad63-0a78ff022873): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:32:13.929084 master-0 
kubenswrapper[33572]: E1204 22:32:13.929044 33572 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9px5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-56f9fbf74b-xsxzr_openstack-operators(c8e6de34-2747-42ce-b9c1-dfdcd71d3707): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 22:32:13.929327 master-0 kubenswrapper[33572]: I1204 22:32:13.929296 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" event={"ID":"b64f7476-1cd1-4d66-ad63-0a78ff022873","Type":"ContainerStarted","Data":"bfc11cb88617d5a096fe77be3ec179aa8d6f0f27a89b22a3faa9bded27e17b8c"} Dec 04 22:32:13.934516 master-0 kubenswrapper[33572]: E1204 22:32:13.930682 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" podUID="c8e6de34-2747-42ce-b9c1-dfdcd71d3707" Dec 04 22:32:13.934516 master-0 kubenswrapper[33572]: E1204 22:32:13.930776 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" podUID="b64f7476-1cd1-4d66-ad63-0a78ff022873" Dec 04 22:32:13.970650 master-0 kubenswrapper[33572]: I1204 22:32:13.969853 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" event={"ID":"a08eac0d-e5f3-41bd-b255-bb4e6546b7f9","Type":"ContainerStarted","Data":"abc7b7f94e3e5f097959d10f957785eeadeb0af723508c04725d3933da8495b9"} Dec 04 22:32:13.970650 master-0 kubenswrapper[33572]: I1204 22:32:13.970613 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" Dec 04 22:32:13.988865 master-0 kubenswrapper[33572]: E1204 22:32:13.988792 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" podUID="a08eac0d-e5f3-41bd-b255-bb4e6546b7f9" Dec 04 22:32:14.002927 master-0 kubenswrapper[33572]: I1204 22:32:14.001953 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" event={"ID":"61f59de0-4c3d-40f3-88bc-05e8174f41de","Type":"ContainerStarted","Data":"cdbb39a0117477960a016c6fcc28aed4a99c9349efcc3d0ea5dca1fb5c41518e"} Dec 04 22:32:14.037154 master-0 kubenswrapper[33572]: I1204 22:32:14.037089 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" event={"ID":"075b6463-8b4e-47e9-9662-c6fa561e9079","Type":"ContainerStarted","Data":"c030e9841850b05eba1d7e71a2ca714b0ca3382f73daaefe070b07b11fc7deba"} Dec 04 22:32:14.037517 master-0 kubenswrapper[33572]: I1204 22:32:14.037474 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" Dec 04 22:32:14.057822 master-0 kubenswrapper[33572]: E1204 22:32:14.057749 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" podUID="075b6463-8b4e-47e9-9662-c6fa561e9079" Dec 04 22:32:14.066457 master-0 kubenswrapper[33572]: I1204 22:32:14.066395 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" event={"ID":"8d0cfe95-c307-45c5-aba1-45c7d5217b2b","Type":"ContainerStarted","Data":"86d5a60534d9a2dec4cbbeaf4bc9d606548a1308077bb0293ad436a8303afd26"} Dec 04 22:32:14.118523 master-0 kubenswrapper[33572]: I1204 22:32:14.110849 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" event={"ID":"5c9990c8-62a6-42fd-9965-50331f773940","Type":"ContainerStarted","Data":"d4f4383d7e1fab896c6ee2c9ef9b5056f79bb64240091719358811ce4eb50048"} Dec 04 22:32:14.155105 master-0 kubenswrapper[33572]: I1204 22:32:14.155037 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" event={"ID":"23f59072-43f4-4dc3-a484-48ae31ed7eee","Type":"ContainerStarted","Data":"ceaa5b17e91a9baa60921dd4c39932344416eeae4eb19eb3940f59abdb1d68cf"} Dec 04 22:32:14.173531 master-0 kubenswrapper[33572]: I1204 22:32:14.167652 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" Dec 04 22:32:14.181894 master-0 kubenswrapper[33572]: E1204 22:32:14.179939 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" podUID="e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca" Dec 04 22:32:14.181894 master-0 kubenswrapper[33572]: I1204 22:32:14.180459 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" event={"ID":"f9ec5f3f-d171-4fc3-abb1-489d49fe30d9","Type":"ContainerStarted","Data":"5d72eedd94b4131b5b27a7eb33ce7a6f4383850e719558f7d311eb76c68bf8dc"} Dec 04 22:32:14.225540 master-0 kubenswrapper[33572]: I1204 22:32:14.224745 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" Dec 04 22:32:14.238544 master-0 kubenswrapper[33572]: E1204 22:32:14.233440 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" podUID="8ea53b4d-8452-4f0d-9508-28101ae503a5" Dec 04 22:32:14.252541 master-0 kubenswrapper[33572]: I1204 22:32:14.248170 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" event={"ID":"2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3","Type":"ContainerStarted","Data":"b5c6062ca359f3b898acf8eb403d457c7c35b6a1247c5d7afda1d2de772a0259"} Dec 04 22:32:14.257517 master-0 kubenswrapper[33572]: I1204 22:32:14.255874 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" event={"ID":"fb197cd4-44a7-414d-aef1-a6faf66f22d6","Type":"ContainerStarted","Data":"05f00859fc73ffb98d5f64d5c5789805e2c78fc11e77391845efeb29603ae7b5"} Dec 04 22:32:14.276538 master-0 kubenswrapper[33572]: I1204 22:32:14.274075 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" event={"ID":"1d25b1d0-f7a6-451b-897f-474f32b23ef1","Type":"ContainerStarted","Data":"2042c6beed2983df639599e9c9079d2bf625437041d57c4033158100bbf74906"} Dec 04 22:32:14.276538 master-0 kubenswrapper[33572]: I1204 22:32:14.275062 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" Dec 04 22:32:14.283651 master-0 kubenswrapper[33572]: E1204 22:32:14.283270 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" podUID="1d25b1d0-f7a6-451b-897f-474f32b23ef1" Dec 04 22:32:14.325037 master-0 kubenswrapper[33572]: I1204 22:32:14.324942 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" event={"ID":"fd79fe85-44e9-47e6-857a-672f42581cb8","Type":"ContainerStarted","Data":"44cd4d3ccb7dba0e087512831e87ef92c0a08d753f9424cadb28263bc90ca4df"} Dec 04 22:32:14.338523 master-0 kubenswrapper[33572]: I1204 22:32:14.337278 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" event={"ID":"502a1fc5-ff64-42b5-be7c-9cba5c1044c5","Type":"ContainerStarted","Data":"d04724e477c83bf4e4b6bf4392900e12b08dc1307612bbb7eda2cf8070f498f0"} Dec 04 22:32:14.339078 master-0 kubenswrapper[33572]: I1204 22:32:14.338856 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" 
event={"ID":"c9cefa2f-e4d2-4c8a-b27c-c71af710a6df","Type":"ContainerStarted","Data":"f4f1adb13f0d6c30348108269b9aff141861b5fc023ed0767afc978b22d69af2"} Dec 04 22:32:14.342535 master-0 kubenswrapper[33572]: I1204 22:32:14.340237 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" event={"ID":"de6b792f-5002-4bc4-8586-b8f76e57bdf1","Type":"ContainerStarted","Data":"2b73b894e7519d17855f178abe749fbd562c98a798a311cfa27e8a715ae3c9d1"} Dec 04 22:32:14.362717 master-0 kubenswrapper[33572]: I1204 22:32:14.362645 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" event={"ID":"68c529a2-072e-4777-9fb9-ec34aa5396ae","Type":"ContainerStarted","Data":"beae99b10bdb4aaa66030cc3c6b3084a40db679e4d7dd21140b0f0f3ec776ff7"} Dec 04 22:32:15.375279 master-0 kubenswrapper[33572]: I1204 22:32:15.375217 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" event={"ID":"e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca","Type":"ContainerStarted","Data":"5f439be96e60f611cfb2e6515616ed4ac5de393bff1f53fd1fe021e67f2d74dd"} Dec 04 22:32:15.377121 master-0 kubenswrapper[33572]: E1204 22:32:15.377046 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" podUID="e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca" Dec 04 22:32:15.379044 master-0 kubenswrapper[33572]: I1204 22:32:15.378994 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" event={"ID":"8ea53b4d-8452-4f0d-9508-28101ae503a5","Type":"ContainerStarted","Data":"1591a98f2f15888357feb7e7cb9cd9b54599f57744ae32b76e47f1b822d922a0"} Dec 04 22:32:15.400398 master-0 kubenswrapper[33572]: E1204 22:32:15.398795 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" podUID="8ea53b4d-8452-4f0d-9508-28101ae503a5" Dec 04 22:32:15.400398 master-0 kubenswrapper[33572]: I1204 22:32:15.400392 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" event={"ID":"c8e6de34-2747-42ce-b9c1-dfdcd71d3707","Type":"ContainerStarted","Data":"8737fb1d38018d21d54085f1b32bbb46824c1c2e5a0cdcf21545daad50172fe7"} Dec 04 22:32:15.401526 master-0 kubenswrapper[33572]: I1204 22:32:15.401350 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" Dec 04 22:32:15.405557 master-0 kubenswrapper[33572]: I1204 22:32:15.401658 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" Dec 04 22:32:15.405557 master-0 kubenswrapper[33572]: E1204 22:32:15.402377 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" podUID="075b6463-8b4e-47e9-9662-c6fa561e9079" Dec 04 22:32:15.405557 master-0 kubenswrapper[33572]: E1204 22:32:15.402449 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" podUID="c8e6de34-2747-42ce-b9c1-dfdcd71d3707" Dec 04 22:32:15.433836 master-0 kubenswrapper[33572]: E1204 22:32:15.433792 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" podUID="1d25b1d0-f7a6-451b-897f-474f32b23ef1" Dec 04 22:32:15.434098 master-0 kubenswrapper[33572]: E1204 22:32:15.434082 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" podUID="b64f7476-1cd1-4d66-ad63-0a78ff022873" Dec 04 22:32:15.434221 master-0 kubenswrapper[33572]: E1204 22:32:15.434202 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" podUID="a08eac0d-e5f3-41bd-b255-bb4e6546b7f9" Dec 04 22:32:16.412725 master-0 kubenswrapper[33572]: E1204 22:32:16.412013 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" podUID="8ea53b4d-8452-4f0d-9508-28101ae503a5" Dec 04 22:32:16.412725 master-0 kubenswrapper[33572]: E1204 22:32:16.412402 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" podUID="b64f7476-1cd1-4d66-ad63-0a78ff022873" Dec 04 22:32:16.412725 master-0 kubenswrapper[33572]: E1204 22:32:16.412465 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" podUID="c8e6de34-2747-42ce-b9c1-dfdcd71d3707" Dec 04 22:32:16.412725 master-0 kubenswrapper[33572]: E1204 22:32:16.412533 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" 
pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" podUID="e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca" Dec 04 22:32:19.443643 master-0 kubenswrapper[33572]: I1204 22:32:19.443571 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" event={"ID":"c9cefa2f-e4d2-4c8a-b27c-c71af710a6df","Type":"ContainerStarted","Data":"357be65fc2c3edc2f1b1e68988c6039873779b401254e3582945d5db4cee5bd9"} Dec 04 22:32:19.444313 master-0 kubenswrapper[33572]: I1204 22:32:19.443853 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" Dec 04 22:32:19.446395 master-0 kubenswrapper[33572]: I1204 22:32:19.446355 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" Dec 04 22:32:19.448635 master-0 kubenswrapper[33572]: I1204 22:32:19.447479 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" event={"ID":"2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3","Type":"ContainerStarted","Data":"f37e7727d5e23bd783ffbf2b21c251dbd2eedd4abfc4d8d11c3bcf0b83f56026"} Dec 04 22:32:19.448635 master-0 kubenswrapper[33572]: I1204 22:32:19.447989 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" Dec 04 22:32:19.450005 master-0 kubenswrapper[33572]: I1204 22:32:19.449970 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" Dec 04 22:32:19.454896 master-0 kubenswrapper[33572]: I1204 22:32:19.454863 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" event={"ID":"de6b792f-5002-4bc4-8586-b8f76e57bdf1","Type":"ContainerStarted","Data":"0e02188da3543728032165e6f95ac10ebe224fec4e8633a2f3088a01b10ea9d9"} Dec 04 22:32:19.455784 master-0 kubenswrapper[33572]: I1204 22:32:19.455762 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" Dec 04 22:32:19.457977 master-0 kubenswrapper[33572]: I1204 22:32:19.457949 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" Dec 04 22:32:19.462424 master-0 kubenswrapper[33572]: I1204 22:32:19.462387 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" event={"ID":"68c529a2-072e-4777-9fb9-ec34aa5396ae","Type":"ContainerStarted","Data":"335856dc82f0f7d53b0a61bbc5126d580179ceadf0b933e463af5e842dddf1b5"} Dec 04 22:32:19.463346 master-0 kubenswrapper[33572]: I1204 22:32:19.463309 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" Dec 04 22:32:19.466819 master-0 kubenswrapper[33572]: I1204 22:32:19.466793 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" Dec 04 22:32:19.474219 master-0 kubenswrapper[33572]: I1204 22:32:19.474170 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" event={"ID":"23f59072-43f4-4dc3-a484-48ae31ed7eee","Type":"ContainerStarted","Data":"7c8a16788e70feb0b2fb25a3d5345361c9f5e1e7c8b6da818fcae2591f883777"} Dec 04 22:32:19.475182 master-0 kubenswrapper[33572]: I1204 22:32:19.475139 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" Dec 04 22:32:19.478185 master-0 kubenswrapper[33572]: I1204 22:32:19.478157 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" event={"ID":"7eebd686-6e62-4fe5-9d52-f09c57e50b2e","Type":"ContainerStarted","Data":"c791a860be43209a67defaaf612c284ddadb86e193e3556551bbf40178941da7"} Dec 04 22:32:19.479024 master-0 kubenswrapper[33572]: I1204 22:32:19.479005 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" Dec 04 22:32:19.480924 master-0 kubenswrapper[33572]: I1204 22:32:19.480834 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-74h4k" podStartSLOduration=2.890835479 podStartE2EDuration="26.480802559s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:55.104409333 +0000 UTC m=+778.831934982" lastFinishedPulling="2025-12-04 22:32:18.694376413 +0000 UTC m=+802.421902062" observedRunningTime="2025-12-04 22:32:19.465082684 +0000 UTC m=+803.192608333" watchObservedRunningTime="2025-12-04 22:32:19.480802559 +0000 UTC m=+803.208328248" Dec 04 22:32:19.481257 master-0 kubenswrapper[33572]: I1204 22:32:19.481236 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" event={"ID":"fb197cd4-44a7-414d-aef1-a6faf66f22d6","Type":"ContainerStarted","Data":"9a792ce33ec557af522d32e664aff5f342f746392e59b454307fee1f56d0d901"} Dec 04 22:32:19.488274 master-0 kubenswrapper[33572]: I1204 22:32:19.488235 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" Dec 04 22:32:19.488548 master-0 kubenswrapper[33572]: I1204 22:32:19.488525 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" event={"ID":"f9ec5f3f-d171-4fc3-abb1-489d49fe30d9","Type":"ContainerStarted","Data":"f4a28fea2a48bf624e73f0a7c7c2e6b03f7445709cde850b468ba1d7ccc09657"} Dec 04 22:32:19.488632 master-0 kubenswrapper[33572]: I1204 22:32:19.488618 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" Dec 04 22:32:19.488788 master-0 kubenswrapper[33572]: I1204 22:32:19.488771 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" Dec 04 22:32:19.492444 master-0 kubenswrapper[33572]: I1204 22:32:19.491051 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" Dec 04 22:32:19.492444 master-0 kubenswrapper[33572]: I1204 22:32:19.492412 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" event={"ID":"5c9990c8-62a6-42fd-9965-50331f773940","Type":"ContainerStarted","Data":"c875da793826e64e20589828e399aec866915c6abdf852f894cdb8a2b3f9e9af"} Dec 04 22:32:19.493053 master-0 kubenswrapper[33572]: I1204 22:32:19.493007 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" Dec 04 22:32:19.526675 master-0 kubenswrapper[33572]: I1204 22:32:19.515167 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" Dec 04 22:32:19.526675 master-0 kubenswrapper[33572]: I1204 22:32:19.515397 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" event={"ID":"61f59de0-4c3d-40f3-88bc-05e8174f41de","Type":"ContainerStarted","Data":"09715080532efce3a52731d6d07ce1fd17719dc28f9a35e6607fdb020417eafa"} Dec 04 22:32:19.542605 master-0 kubenswrapper[33572]: I1204 22:32:19.540067 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-r87g9" podStartSLOduration=4.117063814 podStartE2EDuration="26.540044463s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.2474124 +0000 UTC m=+779.974938039" lastFinishedPulling="2025-12-04 22:32:18.670393029 +0000 UTC m=+802.397918688" observedRunningTime="2025-12-04 22:32:19.514880905 +0000 UTC m=+803.242406574" watchObservedRunningTime="2025-12-04 22:32:19.540044463 +0000 UTC m=+803.267570112" Dec 04 22:32:19.555980 master-0 kubenswrapper[33572]: I1204 22:32:19.552623 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" Dec 04 22:32:19.563411 master-0 kubenswrapper[33572]: I1204 22:32:19.562437 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" Dec 04 22:32:19.563411 master-0 kubenswrapper[33572]: I1204 22:32:19.563138 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" event={"ID":"fd79fe85-44e9-47e6-857a-672f42581cb8","Type":"ContainerStarted","Data":"e47377a65ffc5bfc5813e85e328d22d0d682ce60879baeb19dfb679a50b74672"} Dec 04 22:32:19.568684 master-0 kubenswrapper[33572]: I1204 22:32:19.564212 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" Dec 04 22:32:19.568684 master-0 kubenswrapper[33572]: I1204 22:32:19.568269 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-ds48v" podStartSLOduration=3.163556194 podStartE2EDuration="26.568244855s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:55.096036331 +0000 UTC m=+778.823561990" lastFinishedPulling="2025-12-04 22:32:18.500724982 +0000 UTC m=+802.228250651" observedRunningTime="2025-12-04 22:32:19.561005545 +0000 UTC m=+803.288531194" watchObservedRunningTime="2025-12-04 22:32:19.568244855 +0000 UTC m=+803.295770514" Dec 04 22:32:19.569806 master-0 kubenswrapper[33572]: I1204 22:32:19.569768 33572 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" Dec 04 22:32:19.590154 master-0 kubenswrapper[33572]: I1204 22:32:19.570512 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" event={"ID":"502a1fc5-ff64-42b5-be7c-9cba5c1044c5","Type":"ContainerStarted","Data":"363f199c101bacbe64c2235d4eed6832e5df5498a7d91d78adb98ef28ca73f0c"} Dec 04 22:32:19.590154 master-0 kubenswrapper[33572]: I1204 22:32:19.577026 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" Dec 04 22:32:19.590154 master-0 kubenswrapper[33572]: I1204 22:32:19.580232 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" Dec 04 22:32:19.590154 master-0 kubenswrapper[33572]: I1204 22:32:19.580951 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" event={"ID":"8d0cfe95-c307-45c5-aba1-45c7d5217b2b","Type":"ContainerStarted","Data":"5043b4657725014c7968d7cdb2fd918a4f02cb3268f3c12ade15c049e4ea8740"} Dec 04 22:32:19.590154 master-0 kubenswrapper[33572]: I1204 22:32:19.589844 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" Dec 04 22:32:19.613792 master-0 kubenswrapper[33572]: I1204 22:32:19.605017 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" Dec 04 22:32:19.613792 master-0 kubenswrapper[33572]: I1204 22:32:19.613539 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-pzbmd" podStartSLOduration=4.375270948 podStartE2EDuration="26.613480471s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.25428607 +0000 UTC m=+779.981811719" lastFinishedPulling="2025-12-04 22:32:18.492495593 +0000 UTC m=+802.220021242" observedRunningTime="2025-12-04 22:32:19.605250012 +0000 UTC m=+803.332775671" watchObservedRunningTime="2025-12-04 22:32:19.613480471 +0000 UTC m=+803.341006120" Dec 04 22:32:19.700670 master-0 kubenswrapper[33572]: I1204 22:32:19.697949 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-jjxh8" podStartSLOduration=4.332771238 podStartE2EDuration="26.697923883s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.269033389 +0000 UTC m=+779.996559038" lastFinishedPulling="2025-12-04 22:32:18.634186024 +0000 UTC m=+802.361711683" observedRunningTime="2025-12-04 22:32:19.641375893 +0000 UTC m=+803.368901542" watchObservedRunningTime="2025-12-04 22:32:19.697923883 +0000 UTC m=+803.425449542" Dec 04 22:32:19.710089 master-0 kubenswrapper[33572]: I1204 22:32:19.709993 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-5pn2v" podStartSLOduration=4.157785254 podStartE2EDuration="26.709969717s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.253434407 +0000 UTC m=+779.980960066" lastFinishedPulling="2025-12-04 22:32:18.80561886 
+0000 UTC m=+802.533144529" observedRunningTime="2025-12-04 22:32:19.675329796 +0000 UTC m=+803.402855445" watchObservedRunningTime="2025-12-04 22:32:19.709969717 +0000 UTC m=+803.437495376" Dec 04 22:32:19.731546 master-0 kubenswrapper[33572]: I1204 22:32:19.731450 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-7v5g8" podStartSLOduration=3.814059279 podStartE2EDuration="26.731430492s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.257649924 +0000 UTC m=+779.985175573" lastFinishedPulling="2025-12-04 22:32:19.175021137 +0000 UTC m=+802.902546786" observedRunningTime="2025-12-04 22:32:19.726843925 +0000 UTC m=+803.454369574" watchObservedRunningTime="2025-12-04 22:32:19.731430492 +0000 UTC m=+803.458956141" Dec 04 22:32:19.769115 master-0 kubenswrapper[33572]: I1204 22:32:19.768974 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-pnhmq" podStartSLOduration=4.322078132 podStartE2EDuration="26.768948473s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.181182793 +0000 UTC m=+779.908708432" lastFinishedPulling="2025-12-04 22:32:18.628053114 +0000 UTC m=+802.355578773" observedRunningTime="2025-12-04 22:32:19.751149229 +0000 UTC m=+803.478674878" watchObservedRunningTime="2025-12-04 22:32:19.768948473 +0000 UTC m=+803.496474122" Dec 04 22:32:19.842103 master-0 kubenswrapper[33572]: I1204 22:32:19.841702 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-v8srz" podStartSLOduration=4.453052385 podStartE2EDuration="26.84166577s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.247209135 +0000 UTC m=+779.974734784" lastFinishedPulling="2025-12-04 22:32:18.63582248 +0000 UTC m=+802.363348169" observedRunningTime="2025-12-04 22:32:19.793998738 +0000 UTC m=+803.521524387" watchObservedRunningTime="2025-12-04 22:32:19.84166577 +0000 UTC m=+803.569191419" Dec 04 22:32:19.881251 master-0 kubenswrapper[33572]: I1204 22:32:19.876012 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-qffjg" podStartSLOduration=3.9959138039999997 podStartE2EDuration="25.875977042s" podCreationTimestamp="2025-12-04 22:31:54 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.749296882 +0000 UTC m=+780.476822531" lastFinishedPulling="2025-12-04 22:32:18.62936012 +0000 UTC m=+802.356885769" observedRunningTime="2025-12-04 22:32:19.819087114 +0000 UTC m=+803.546612763" watchObservedRunningTime="2025-12-04 22:32:19.875977042 +0000 UTC m=+803.603502691" Dec 04 22:32:19.881593 master-0 kubenswrapper[33572]: I1204 22:32:19.881515 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wcm5p" podStartSLOduration=3.076541729 podStartE2EDuration="26.881481054s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:55.066272745 +0000 UTC m=+778.793798394" lastFinishedPulling="2025-12-04 22:32:18.87121206 +0000 UTC m=+802.598737719" observedRunningTime="2025-12-04 22:32:19.842658887 +0000 UTC m=+803.570184556" watchObservedRunningTime="2025-12-04 22:32:19.881481054 +0000 UTC m=+803.609006703" Dec 04 
22:32:19.895171 master-0 kubenswrapper[33572]: I1204 22:32:19.894369 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-llths" podStartSLOduration=4.194729279 podStartE2EDuration="26.894341391s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.25316399 +0000 UTC m=+779.980689649" lastFinishedPulling="2025-12-04 22:32:18.952776122 +0000 UTC m=+802.680301761" observedRunningTime="2025-12-04 22:32:19.869265406 +0000 UTC m=+803.596791075" watchObservedRunningTime="2025-12-04 22:32:19.894341391 +0000 UTC m=+803.621867030" Dec 04 22:32:19.923838 master-0 kubenswrapper[33572]: I1204 22:32:19.923252 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-qqh65" podStartSLOduration=4.614134964 podStartE2EDuration="26.923231313s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.743320887 +0000 UTC m=+780.470846536" lastFinishedPulling="2025-12-04 22:32:19.052417226 +0000 UTC m=+802.779942885" observedRunningTime="2025-12-04 22:32:19.892281204 +0000 UTC m=+803.619806843" watchObservedRunningTime="2025-12-04 22:32:19.923231313 +0000 UTC m=+803.650756962" Dec 04 22:32:23.754168 master-0 kubenswrapper[33572]: I1204 22:32:23.754087 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" Dec 04 22:32:23.859399 master-0 kubenswrapper[33572]: I1204 22:32:23.859315 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" Dec 04 22:32:23.893432 master-0 kubenswrapper[33572]: I1204 22:32:23.893335 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" Dec 04 22:32:24.167074 master-0 kubenswrapper[33572]: I1204 22:32:24.166753 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" Dec 04 22:32:24.466893 master-0 kubenswrapper[33572]: I1204 22:32:24.466838 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" Dec 04 22:32:24.552234 master-0 kubenswrapper[33572]: I1204 22:32:24.552068 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" Dec 04 22:32:24.645022 master-0 kubenswrapper[33572]: I1204 22:32:24.644951 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" event={"ID":"c8e6de34-2747-42ce-b9c1-dfdcd71d3707","Type":"ContainerStarted","Data":"82710dec52adc78950f1fa52acf9219a4f161fb13cda8d1ddfbd2a9b27f87734"} Dec 04 22:32:24.648016 master-0 kubenswrapper[33572]: I1204 22:32:24.647969 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" event={"ID":"075b6463-8b4e-47e9-9662-c6fa561e9079","Type":"ContainerStarted","Data":"8b5c150243bd71ef9d07ab9eca6d9edeb6b187f311b99af457a52906d7665a13"} Dec 04 22:32:24.652164 master-0 kubenswrapper[33572]: I1204 22:32:24.650476 33572 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" event={"ID":"e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca","Type":"ContainerStarted","Data":"326d64690311309d9e52ba454e92f258bdfad4003260cfb4d2a1a6decd9f6432"} Dec 04 22:32:24.653679 master-0 kubenswrapper[33572]: I1204 22:32:24.653636 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" event={"ID":"1d25b1d0-f7a6-451b-897f-474f32b23ef1","Type":"ContainerStarted","Data":"10fbdc05452fca987522f9f3cae2e28ce968c1d520d3915dafd0619d89527009"} Dec 04 22:32:24.753812 master-0 kubenswrapper[33572]: I1204 22:32:24.725768 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-xsxzr" podStartSLOduration=14.999186064 podStartE2EDuration="31.725730653s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.293281702 +0000 UTC m=+780.020807351" lastFinishedPulling="2025-12-04 22:32:13.019826301 +0000 UTC m=+796.747351940" observedRunningTime="2025-12-04 22:32:24.676833538 +0000 UTC m=+808.404359187" watchObservedRunningTime="2025-12-04 22:32:24.725730653 +0000 UTC m=+808.453256322" Dec 04 22:32:24.777570 master-0 kubenswrapper[33572]: I1204 22:32:24.771474 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-7rc6r" podStartSLOduration=14.417997464 podStartE2EDuration="31.771439832s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:55.078949917 +0000 UTC m=+778.806475566" lastFinishedPulling="2025-12-04 22:32:12.432392285 +0000 UTC m=+796.159917934" observedRunningTime="2025-12-04 22:32:24.707950251 +0000 UTC m=+808.435475890" watchObservedRunningTime="2025-12-04 22:32:24.771439832 +0000 UTC m=+808.498965481" Dec 04 22:32:24.792534 master-0 kubenswrapper[33572]: I1204 22:32:24.782429 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-5sgkl" podStartSLOduration=14.03722604 podStartE2EDuration="31.782400806s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:55.156382654 +0000 UTC m=+778.883908303" lastFinishedPulling="2025-12-04 22:32:12.9015574 +0000 UTC m=+796.629083069" observedRunningTime="2025-12-04 22:32:24.760596241 +0000 UTC m=+808.488121910" watchObservedRunningTime="2025-12-04 22:32:24.782400806 +0000 UTC m=+808.509926455" Dec 04 22:32:24.811600 master-0 kubenswrapper[33572]: I1204 22:32:24.805990 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-khfnz" podStartSLOduration=15.648880298 podStartE2EDuration="31.80596927s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.274937413 +0000 UTC m=+780.002463072" lastFinishedPulling="2025-12-04 22:32:12.432026385 +0000 UTC m=+796.159552044" observedRunningTime="2025-12-04 22:32:24.805682822 +0000 UTC m=+808.533208481" watchObservedRunningTime="2025-12-04 22:32:24.80596927 +0000 UTC m=+808.533494919" Dec 04 22:32:24.832532 master-0 kubenswrapper[33572]: I1204 22:32:24.821097 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" Dec 04 22:32:25.595958 master-0 
kubenswrapper[33572]: I1204 22:32:25.595114 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:32:25.600876 master-0 kubenswrapper[33572]: I1204 22:32:25.600812 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9e3f583-1af1-44f2-a6e3-271336e2cf1e-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-qr956\" (UID: \"f9e3f583-1af1-44f2-a6e3-271336e2cf1e\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:32:25.663657 master-0 kubenswrapper[33572]: I1204 22:32:25.663561 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" event={"ID":"a08eac0d-e5f3-41bd-b255-bb4e6546b7f9","Type":"ContainerStarted","Data":"a46fefa28f86cc7414a5968b4fc0dc0f7f3883d73042667ff213d1f832fdbaf9"} Dec 04 22:32:25.665655 master-0 kubenswrapper[33572]: I1204 22:32:25.665599 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" event={"ID":"b64f7476-1cd1-4d66-ad63-0a78ff022873","Type":"ContainerStarted","Data":"abde65e183a6e7df4550f66035d0542bffcc5eca2b31402fa5df338fda731bb0"} Dec 04 22:32:25.667220 master-0 kubenswrapper[33572]: I1204 22:32:25.667152 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" event={"ID":"8ea53b4d-8452-4f0d-9508-28101ae503a5","Type":"ContainerStarted","Data":"ac44b3905cbbc89818ef18a3c5b5de1ee66e7037c32ef3c4641c8a7d3b9df446"} Dec 04 22:32:25.685631 master-0 kubenswrapper[33572]: I1204 22:32:25.685559 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-647f96877-748fk" podStartSLOduration=16.011174527 podStartE2EDuration="32.685537739s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.280671492 +0000 UTC m=+780.008197141" lastFinishedPulling="2025-12-04 22:32:12.955034694 +0000 UTC m=+796.682560353" observedRunningTime="2025-12-04 22:32:25.683869352 +0000 UTC m=+809.411395001" watchObservedRunningTime="2025-12-04 22:32:25.685537739 +0000 UTC m=+809.413063378" Dec 04 22:32:25.724462 master-0 kubenswrapper[33572]: I1204 22:32:25.723236 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-4nnvm" podStartSLOduration=16.542424195 podStartE2EDuration="32.723211404s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.740074556 +0000 UTC m=+780.467600205" lastFinishedPulling="2025-12-04 22:32:12.920861765 +0000 UTC m=+796.648387414" observedRunningTime="2025-12-04 22:32:25.714080931 +0000 UTC m=+809.441606580" watchObservedRunningTime="2025-12-04 22:32:25.723211404 +0000 UTC m=+809.450737053" Dec 04 22:32:25.732504 master-0 kubenswrapper[33572]: I1204 22:32:25.732447 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:32:25.776347 master-0 kubenswrapper[33572]: I1204 22:32:25.775595 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-696b999796-jbqjt" podStartSLOduration=16.531528463 podStartE2EDuration="32.775576537s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:31:56.18865521 +0000 UTC m=+779.916180849" lastFinishedPulling="2025-12-04 22:32:12.432703274 +0000 UTC m=+796.160228923" observedRunningTime="2025-12-04 22:32:25.771585736 +0000 UTC m=+809.499111395" watchObservedRunningTime="2025-12-04 22:32:25.775576537 +0000 UTC m=+809.503102186" Dec 04 22:32:26.103470 master-0 kubenswrapper[33572]: I1204 22:32:26.103381 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:32:26.110489 master-0 kubenswrapper[33572]: I1204 22:32:26.110435 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746vn4vf\" (UID: \"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:32:26.118366 master-0 kubenswrapper[33572]: I1204 22:32:26.118267 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:32:26.244669 master-0 kubenswrapper[33572]: I1204 22:32:26.240873 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956"] Dec 04 22:32:26.410106 master-0 kubenswrapper[33572]: I1204 22:32:26.410053 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:26.410297 master-0 kubenswrapper[33572]: I1204 22:32:26.410162 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:26.416541 master-0 kubenswrapper[33572]: I1204 22:32:26.414351 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:26.416541 master-0 kubenswrapper[33572]: I1204 22:32:26.414632 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e76f37b2-5ae2-4931-ac95-8d6161415d17-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-gvd74\" (UID: \"e76f37b2-5ae2-4931-ac95-8d6161415d17\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:26.664623 master-0 kubenswrapper[33572]: I1204 22:32:26.664556 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:26.664813 master-0 kubenswrapper[33572]: W1204 22:32:26.664545 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod590f3f7c_1d4d_45d7_a42b_2a78fe5d3dbf.slice/crio-50dedb1bda323f247b5847288904b63659db938d0b63dfdc9ea5623324c2e349 WatchSource:0}: Error finding container 50dedb1bda323f247b5847288904b63659db938d0b63dfdc9ea5623324c2e349: Status 404 returned error can't find the container with id 50dedb1bda323f247b5847288904b63659db938d0b63dfdc9ea5623324c2e349 Dec 04 22:32:26.665789 master-0 kubenswrapper[33572]: I1204 22:32:26.665731 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf"] Dec 04 22:32:26.694341 master-0 kubenswrapper[33572]: I1204 22:32:26.694265 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" event={"ID":"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf","Type":"ContainerStarted","Data":"50dedb1bda323f247b5847288904b63659db938d0b63dfdc9ea5623324c2e349"} Dec 04 22:32:26.696837 master-0 kubenswrapper[33572]: I1204 22:32:26.696766 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" event={"ID":"f9e3f583-1af1-44f2-a6e3-271336e2cf1e","Type":"ContainerStarted","Data":"5f327cd64f390c8ac085e6b04e3d6ec884159fe10f19d107ac1a810bd71d9f36"} Dec 04 22:32:27.154352 master-0 kubenswrapper[33572]: I1204 22:32:27.154306 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74"] Dec 04 22:32:27.159385 master-0 kubenswrapper[33572]: W1204 22:32:27.159328 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76f37b2_5ae2_4931_ac95_8d6161415d17.slice/crio-aef4707e2cc3b44ad19b0fb6bb123f7002d1dc6b17767571cc9e91d0dd186981 WatchSource:0}: Error finding container aef4707e2cc3b44ad19b0fb6bb123f7002d1dc6b17767571cc9e91d0dd186981: Status 404 returned error can't find the container with id aef4707e2cc3b44ad19b0fb6bb123f7002d1dc6b17767571cc9e91d0dd186981 Dec 04 22:32:27.716339 master-0 kubenswrapper[33572]: I1204 22:32:27.716269 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" event={"ID":"e76f37b2-5ae2-4931-ac95-8d6161415d17","Type":"ContainerStarted","Data":"15c59187fbc88e125ec0cda017136c788d8a305cb85fbc337fc632019009753f"} Dec 04 22:32:27.716339 master-0 kubenswrapper[33572]: I1204 22:32:27.716335 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" event={"ID":"e76f37b2-5ae2-4931-ac95-8d6161415d17","Type":"ContainerStarted","Data":"aef4707e2cc3b44ad19b0fb6bb123f7002d1dc6b17767571cc9e91d0dd186981"} Dec 04 22:32:28.730394 master-0 kubenswrapper[33572]: I1204 22:32:28.730339 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:32:29.744960 master-0 kubenswrapper[33572]: I1204 22:32:29.744834 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" event={"ID":"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf","Type":"ContainerStarted","Data":"93035ea587383c6bab6036add54721b0786949bbc5393939dc451c52eb4cb208"} Dec 04 22:32:29.744960 master-0 kubenswrapper[33572]: I1204 22:32:29.744918 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" event={"ID":"590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf","Type":"ContainerStarted","Data":"66127258882ab57fead3289e0f459f0318c94b4bdcba31c7d36501c83bacac62"} Dec 04 22:32:29.746378 master-0 kubenswrapper[33572]: I1204 22:32:29.745219 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:32:29.748417 master-0 kubenswrapper[33572]: I1204 22:32:29.748364 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" event={"ID":"f9e3f583-1af1-44f2-a6e3-271336e2cf1e","Type":"ContainerStarted","Data":"95c10efe8e38c8bf6997d58342726c3d38ff9cafdc2b75fbdb3286a50d61cbe5"} Dec 04 22:32:29.748687 master-0 kubenswrapper[33572]: I1204 22:32:29.748649 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" event={"ID":"f9e3f583-1af1-44f2-a6e3-271336e2cf1e","Type":"ContainerStarted","Data":"249b15403b0463a6fcc4714a5a3a7652e52b9dd1559c24fe67ac26a4b0229978"} Dec 04 22:32:29.748869 master-0 kubenswrapper[33572]: I1204 22:32:29.748841 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:32:29.787596 master-0 kubenswrapper[33572]: I1204 22:32:29.787483 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" podStartSLOduration=34.656477444 podStartE2EDuration="36.787459847s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:32:26.675884752 +0000 UTC m=+810.403410441" lastFinishedPulling="2025-12-04 22:32:28.806867175 +0000 UTC m=+812.534392844" observedRunningTime="2025-12-04 22:32:29.772160343 +0000 UTC m=+813.499686072" watchObservedRunningTime="2025-12-04 22:32:29.787459847 +0000 UTC m=+813.514985496" Dec 04 22:32:29.788073 master-0 kubenswrapper[33572]: I1204 22:32:29.787668 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" podStartSLOduration=35.787661232 podStartE2EDuration="35.787661232s" podCreationTimestamp="2025-12-04 22:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:32:27.758629017 +0000 UTC m=+811.486154666" watchObservedRunningTime="2025-12-04 22:32:29.787661232 +0000 UTC m=+813.515186881" Dec 04 22:32:29.814573 master-0 kubenswrapper[33572]: I1204 22:32:29.814054 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" podStartSLOduration=34.288946147 podStartE2EDuration="36.814029614s" podCreationTimestamp="2025-12-04 22:31:53 +0000 UTC" firstStartedPulling="2025-12-04 22:32:26.263860861 +0000 UTC m=+809.991386550" 
lastFinishedPulling="2025-12-04 22:32:28.788944368 +0000 UTC m=+812.516470017" observedRunningTime="2025-12-04 22:32:29.804428858 +0000 UTC m=+813.531954517" watchObservedRunningTime="2025-12-04 22:32:29.814029614 +0000 UTC m=+813.541555273" Dec 04 22:32:35.738687 master-0 kubenswrapper[33572]: I1204 22:32:35.738583 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-qr956" Dec 04 22:32:36.125188 master-0 kubenswrapper[33572]: I1204 22:32:36.125123 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746vn4vf" Dec 04 22:32:36.677851 master-0 kubenswrapper[33572]: I1204 22:32:36.677762 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-gvd74" Dec 04 22:33:19.769594 master-0 kubenswrapper[33572]: I1204 22:33:19.769122 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs"] Dec 04 22:33:19.772562 master-0 kubenswrapper[33572]: I1204 22:33:19.771107 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:19.776017 master-0 kubenswrapper[33572]: I1204 22:33:19.775985 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 04 22:33:19.776456 master-0 kubenswrapper[33572]: I1204 22:33:19.776441 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 04 22:33:19.777541 master-0 kubenswrapper[33572]: I1204 22:33:19.776693 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 04 22:33:19.822063 master-0 kubenswrapper[33572]: I1204 22:33:19.822004 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs"] Dec 04 22:33:19.865651 master-0 kubenswrapper[33572]: I1204 22:33:19.865251 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-vn5dx"] Dec 04 22:33:19.867695 master-0 kubenswrapper[33572]: I1204 22:33:19.867442 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:19.871875 master-0 kubenswrapper[33572]: I1204 22:33:19.871828 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 04 22:33:19.872657 master-0 kubenswrapper[33572]: I1204 22:33:19.872610 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdg2\" (UniqueName: \"kubernetes.io/projected/2fa144a4-7310-414a-a941-f9e08aa63084-kube-api-access-pvdg2\") pod \"dnsmasq-dns-5dbfd7c4bf-dwxvs\" (UID: \"2fa144a4-7310-414a-a941-f9e08aa63084\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:19.872774 master-0 kubenswrapper[33572]: I1204 22:33:19.872744 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa144a4-7310-414a-a941-f9e08aa63084-config\") pod \"dnsmasq-dns-5dbfd7c4bf-dwxvs\" (UID: \"2fa144a4-7310-414a-a941-f9e08aa63084\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:19.902596 master-0 kubenswrapper[33572]: I1204 22:33:19.902550 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-vn5dx"] Dec 04 22:33:19.974217 master-0 kubenswrapper[33572]: I1204 22:33:19.974136 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmssp\" (UniqueName: \"kubernetes.io/projected/6051b2e6-2761-4a90-a2eb-8b437af545b5-kube-api-access-fmssp\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:19.974436 master-0 kubenswrapper[33572]: I1204 22:33:19.974231 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdg2\" (UniqueName: \"kubernetes.io/projected/2fa144a4-7310-414a-a941-f9e08aa63084-kube-api-access-pvdg2\") pod \"dnsmasq-dns-5dbfd7c4bf-dwxvs\" (UID: \"2fa144a4-7310-414a-a941-f9e08aa63084\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:19.974436 master-0 kubenswrapper[33572]: I1204 22:33:19.974317 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:19.974436 master-0 kubenswrapper[33572]: I1204 22:33:19.974369 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-config\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:19.974436 master-0 kubenswrapper[33572]: I1204 22:33:19.974421 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa144a4-7310-414a-a941-f9e08aa63084-config\") pod \"dnsmasq-dns-5dbfd7c4bf-dwxvs\" (UID: \"2fa144a4-7310-414a-a941-f9e08aa63084\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:19.977372 master-0 kubenswrapper[33572]: I1204 22:33:19.976133 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2fa144a4-7310-414a-a941-f9e08aa63084-config\") pod \"dnsmasq-dns-5dbfd7c4bf-dwxvs\" (UID: \"2fa144a4-7310-414a-a941-f9e08aa63084\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:19.994150 master-0 kubenswrapper[33572]: I1204 22:33:19.994109 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdg2\" (UniqueName: \"kubernetes.io/projected/2fa144a4-7310-414a-a941-f9e08aa63084-kube-api-access-pvdg2\") pod \"dnsmasq-dns-5dbfd7c4bf-dwxvs\" (UID: \"2fa144a4-7310-414a-a941-f9e08aa63084\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:20.076065 master-0 kubenswrapper[33572]: I1204 22:33:20.075882 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:20.076065 master-0 kubenswrapper[33572]: I1204 22:33:20.075974 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-config\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:20.076421 master-0 kubenswrapper[33572]: I1204 22:33:20.076119 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmssp\" (UniqueName: \"kubernetes.io/projected/6051b2e6-2761-4a90-a2eb-8b437af545b5-kube-api-access-fmssp\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:20.078041 master-0 kubenswrapper[33572]: I1204 22:33:20.077097 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:20.078041 master-0 kubenswrapper[33572]: I1204 22:33:20.077153 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-config\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:20.095772 master-0 kubenswrapper[33572]: I1204 22:33:20.095701 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmssp\" (UniqueName: \"kubernetes.io/projected/6051b2e6-2761-4a90-a2eb-8b437af545b5-kube-api-access-fmssp\") pod \"dnsmasq-dns-75d7c5dbd7-vn5dx\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:20.100111 master-0 kubenswrapper[33572]: I1204 22:33:20.100066 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:20.198185 master-0 kubenswrapper[33572]: I1204 22:33:20.198110 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:20.572826 master-0 kubenswrapper[33572]: W1204 22:33:20.571923 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fa144a4_7310_414a_a941_f9e08aa63084.slice/crio-dcb4f3020eac496e1c001a00f4f666b9c8433a51e8e7000cd9a6f35a86fe6328 WatchSource:0}: Error finding container dcb4f3020eac496e1c001a00f4f666b9c8433a51e8e7000cd9a6f35a86fe6328: Status 404 returned error can't find the container with id dcb4f3020eac496e1c001a00f4f666b9c8433a51e8e7000cd9a6f35a86fe6328 Dec 04 22:33:20.575318 master-0 kubenswrapper[33572]: I1204 22:33:20.575264 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs"] Dec 04 22:33:21.401674 master-0 kubenswrapper[33572]: I1204 22:33:21.399799 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-vn5dx"] Dec 04 22:33:21.445197 master-0 kubenswrapper[33572]: I1204 22:33:21.444597 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" event={"ID":"6051b2e6-2761-4a90-a2eb-8b437af545b5","Type":"ContainerStarted","Data":"4844456a0268c256b0ca246c23f64864da9b2a61c284b5021b8bc967fb431f09"} Dec 04 22:33:21.446810 master-0 kubenswrapper[33572]: I1204 22:33:21.446776 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" event={"ID":"2fa144a4-7310-414a-a941-f9e08aa63084","Type":"ContainerStarted","Data":"dcb4f3020eac496e1c001a00f4f666b9c8433a51e8e7000cd9a6f35a86fe6328"} Dec 04 22:33:22.058609 master-0 kubenswrapper[33572]: I1204 22:33:22.058360 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs"] Dec 04 22:33:22.100090 master-0 kubenswrapper[33572]: I1204 22:33:22.099693 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9bff68687-2kwsl"] Dec 04 22:33:22.104029 master-0 kubenswrapper[33572]: I1204 22:33:22.103959 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.176686 master-0 kubenswrapper[33572]: I1204 22:33:22.176627 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-2kwsl"] Dec 04 22:33:22.234478 master-0 kubenswrapper[33572]: I1204 22:33:22.232834 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhc5w\" (UniqueName: \"kubernetes.io/projected/9210fc14-13be-430c-a269-be48b495a428-kube-api-access-nhc5w\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.234761 master-0 kubenswrapper[33572]: I1204 22:33:22.234641 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-dns-svc\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.234761 master-0 kubenswrapper[33572]: I1204 22:33:22.234693 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-config\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.354539 master-0 kubenswrapper[33572]: I1204 22:33:22.353779 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-dns-svc\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.354539 master-0 kubenswrapper[33572]: I1204 22:33:22.353833 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-config\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.354539 master-0 kubenswrapper[33572]: I1204 22:33:22.353948 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhc5w\" (UniqueName: \"kubernetes.io/projected/9210fc14-13be-430c-a269-be48b495a428-kube-api-access-nhc5w\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.355299 master-0 kubenswrapper[33572]: I1204 22:33:22.355234 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-dns-svc\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.356389 master-0 kubenswrapper[33572]: I1204 22:33:22.355876 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-config\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.374565 master-0 kubenswrapper[33572]: I1204 22:33:22.374494 33572 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nhc5w\" (UniqueName: \"kubernetes.io/projected/9210fc14-13be-430c-a269-be48b495a428-kube-api-access-nhc5w\") pod \"dnsmasq-dns-9bff68687-2kwsl\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.432020 master-0 kubenswrapper[33572]: I1204 22:33:22.431934 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:22.446281 master-0 kubenswrapper[33572]: I1204 22:33:22.445915 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-vn5dx"] Dec 04 22:33:22.490982 master-0 kubenswrapper[33572]: I1204 22:33:22.470769 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-xxnr5"] Dec 04 22:33:22.490982 master-0 kubenswrapper[33572]: I1204 22:33:22.474915 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.493399 master-0 kubenswrapper[33572]: I1204 22:33:22.493345 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-xxnr5"] Dec 04 22:33:22.666918 master-0 kubenswrapper[33572]: I1204 22:33:22.664325 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-config\") pod \"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.666918 master-0 kubenswrapper[33572]: I1204 22:33:22.664416 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-dns-svc\") pod \"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.683339 master-0 kubenswrapper[33572]: I1204 22:33:22.671577 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbsjj\" (UniqueName: \"kubernetes.io/projected/13ad3678-93e9-4633-a3d8-0ca651d28bf2-kube-api-access-lbsjj\") pod \"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.779335 master-0 kubenswrapper[33572]: I1204 22:33:22.778661 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbsjj\" (UniqueName: \"kubernetes.io/projected/13ad3678-93e9-4633-a3d8-0ca651d28bf2-kube-api-access-lbsjj\") pod \"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.780903 master-0 kubenswrapper[33572]: I1204 22:33:22.780314 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-config\") pod \"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.780903 master-0 kubenswrapper[33572]: I1204 22:33:22.780338 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-config\") pod 
\"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.780903 master-0 kubenswrapper[33572]: I1204 22:33:22.780377 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-dns-svc\") pod \"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.781097 master-0 kubenswrapper[33572]: I1204 22:33:22.781070 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-dns-svc\") pod \"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.800792 master-0 kubenswrapper[33572]: I1204 22:33:22.800430 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbsjj\" (UniqueName: \"kubernetes.io/projected/13ad3678-93e9-4633-a3d8-0ca651d28bf2-kube-api-access-lbsjj\") pod \"dnsmasq-dns-658bb5765c-xxnr5\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:22.802272 master-0 kubenswrapper[33572]: I1204 22:33:22.801320 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:23.045761 master-0 kubenswrapper[33572]: I1204 22:33:23.042707 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-2kwsl"] Dec 04 22:33:23.350048 master-0 kubenswrapper[33572]: I1204 22:33:23.348442 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-xxnr5"] Dec 04 22:33:23.350293 master-0 kubenswrapper[33572]: W1204 22:33:23.349174 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13ad3678_93e9_4633_a3d8_0ca651d28bf2.slice/crio-4ddf7e50a9f995f043ee25147d8524cdf1775071aef35fc7831f35cce747be5e WatchSource:0}: Error finding container 4ddf7e50a9f995f043ee25147d8524cdf1775071aef35fc7831f35cce747be5e: Status 404 returned error can't find the container with id 4ddf7e50a9f995f043ee25147d8524cdf1775071aef35fc7831f35cce747be5e Dec 04 22:33:23.490686 master-0 kubenswrapper[33572]: I1204 22:33:23.490521 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" event={"ID":"13ad3678-93e9-4633-a3d8-0ca651d28bf2","Type":"ContainerStarted","Data":"4ddf7e50a9f995f043ee25147d8524cdf1775071aef35fc7831f35cce747be5e"} Dec 04 22:33:23.493266 master-0 kubenswrapper[33572]: I1204 22:33:23.493202 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" event={"ID":"9210fc14-13be-430c-a269-be48b495a428","Type":"ContainerStarted","Data":"d4051d793b183763bff507fccbc9b1f09deaf49d1d873c128fd1d96a94e961cc"} Dec 04 22:33:26.274401 master-0 kubenswrapper[33572]: I1204 22:33:26.274073 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.303976 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.304119 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.310335 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.311012 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.311278 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.311398 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.311566 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.311567 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.313286 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 22:33:26.313853 master-0 kubenswrapper[33572]: I1204 22:33:26.313395 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 22:33:26.318552 master-0 kubenswrapper[33572]: I1204 22:33:26.315687 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 04 22:33:26.318552 master-0 kubenswrapper[33572]: I1204 22:33:26.315875 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 04 22:33:26.326337 master-0 kubenswrapper[33572]: I1204 22:33:26.323926 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 22:33:26.359108 master-0 kubenswrapper[33572]: I1204 22:33:26.358891 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422396 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422525 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422564 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d1338e8-405c-4439-ae7d-02034960a5c5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422602 33572 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422676 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422722 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w52pf\" (UniqueName: \"kubernetes.io/projected/feb294eb-fff6-4a11-8e4c-b80bc0a64045-kube-api-access-w52pf\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422770 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422872 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.422960 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb294eb-fff6-4a11-8e4c-b80bc0a64045-memcached-tls-certs\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.423060 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-59e6feec-7c51-451f-9e81-ae15149b2ff2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^76620731-7fec-460b-a665-c7776f227d11\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.423182 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/feb294eb-fff6-4a11-8e4c-b80bc0a64045-kolla-config\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.428040 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj6wt\" (UniqueName: \"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-kube-api-access-pj6wt\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.428124 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/feb294eb-fff6-4a11-8e4c-b80bc0a64045-config-data\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.428159 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb294eb-fff6-4a11-8e4c-b80bc0a64045-combined-ca-bundle\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.428230 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.430642 master-0 kubenswrapper[33572]: I1204 22:33:26.428339 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d1338e8-405c-4439-ae7d-02034960a5c5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.530695 master-0 kubenswrapper[33572]: I1204 22:33:26.530546 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.530695 master-0 kubenswrapper[33572]: I1204 22:33:26.530591 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w52pf\" (UniqueName: \"kubernetes.io/projected/feb294eb-fff6-4a11-8e4c-b80bc0a64045-kube-api-access-w52pf\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.531101 master-0 kubenswrapper[33572]: I1204 22:33:26.530812 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531101 master-0 kubenswrapper[33572]: I1204 22:33:26.530896 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531101 master-0 kubenswrapper[33572]: I1204 22:33:26.531022 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb294eb-fff6-4a11-8e4c-b80bc0a64045-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.531101 master-0 kubenswrapper[33572]: I1204 22:33:26.531069 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-59e6feec-7c51-451f-9e81-ae15149b2ff2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^76620731-7fec-460b-a665-c7776f227d11\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531353 master-0 kubenswrapper[33572]: I1204 22:33:26.531129 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/feb294eb-fff6-4a11-8e4c-b80bc0a64045-kolla-config\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.531353 master-0 kubenswrapper[33572]: I1204 22:33:26.531164 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj6wt\" (UniqueName: \"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-kube-api-access-pj6wt\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531353 master-0 kubenswrapper[33572]: I1204 22:33:26.531207 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/feb294eb-fff6-4a11-8e4c-b80bc0a64045-config-data\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.531353 master-0 kubenswrapper[33572]: I1204 22:33:26.531241 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb294eb-fff6-4a11-8e4c-b80bc0a64045-combined-ca-bundle\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.531353 master-0 kubenswrapper[33572]: I1204 22:33:26.531292 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531353 master-0 kubenswrapper[33572]: I1204 22:33:26.531341 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d1338e8-405c-4439-ae7d-02034960a5c5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531786 master-0 kubenswrapper[33572]: I1204 22:33:26.531367 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531786 master-0 kubenswrapper[33572]: I1204 22:33:26.531426 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531786 
master-0 kubenswrapper[33572]: I1204 22:33:26.531444 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531786 master-0 kubenswrapper[33572]: I1204 22:33:26.531485 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531786 master-0 kubenswrapper[33572]: I1204 22:33:26.531535 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d1338e8-405c-4439-ae7d-02034960a5c5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.531786 master-0 kubenswrapper[33572]: I1204 22:33:26.531567 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.536008 master-0 kubenswrapper[33572]: I1204 22:33:26.532059 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.536008 master-0 kubenswrapper[33572]: I1204 22:33:26.532645 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/feb294eb-fff6-4a11-8e4c-b80bc0a64045-kolla-config\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.536008 master-0 kubenswrapper[33572]: I1204 22:33:26.533188 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.536008 master-0 kubenswrapper[33572]: I1204 22:33:26.533236 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d1338e8-405c-4439-ae7d-02034960a5c5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.536008 master-0 kubenswrapper[33572]: I1204 22:33:26.535448 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d1338e8-405c-4439-ae7d-02034960a5c5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.536008 master-0 kubenswrapper[33572]: I1204 22:33:26.535488 33572 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 22:33:26.536008 master-0 kubenswrapper[33572]: I1204 22:33:26.535543 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-59e6feec-7c51-451f-9e81-ae15149b2ff2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^76620731-7fec-460b-a665-c7776f227d11\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/78cd96140d4dd75ec338499c3e5cf13fa42e5bfbbd571cb2f688a8d4affaf8e6/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.536008 master-0 kubenswrapper[33572]: I1204 22:33:26.535488 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/feb294eb-fff6-4a11-8e4c-b80bc0a64045-config-data\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.536274 master-0 kubenswrapper[33572]: I1204 22:33:26.536218 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.538043 master-0 kubenswrapper[33572]: I1204 22:33:26.537159 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb294eb-fff6-4a11-8e4c-b80bc0a64045-memcached-tls-certs\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.538980 master-0 kubenswrapper[33572]: I1204 22:33:26.538536 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.548791 master-0 kubenswrapper[33572]: I1204 22:33:26.544061 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb294eb-fff6-4a11-8e4c-b80bc0a64045-combined-ca-bundle\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.548791 master-0 kubenswrapper[33572]: I1204 22:33:26.545014 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d1338e8-405c-4439-ae7d-02034960a5c5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.570220 master-0 kubenswrapper[33572]: I1204 22:33:26.570173 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w52pf\" (UniqueName: \"kubernetes.io/projected/feb294eb-fff6-4a11-8e4c-b80bc0a64045-kube-api-access-w52pf\") pod \"memcached-0\" (UID: \"feb294eb-fff6-4a11-8e4c-b80bc0a64045\") " pod="openstack/memcached-0" Dec 04 22:33:26.570884 master-0 kubenswrapper[33572]: I1204 22:33:26.570332 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj6wt\" (UniqueName: 
\"kubernetes.io/projected/1d1338e8-405c-4439-ae7d-02034960a5c5-kube-api-access-pj6wt\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:26.679521 master-0 kubenswrapper[33572]: I1204 22:33:26.677347 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 22:33:27.623231 master-0 kubenswrapper[33572]: I1204 22:33:27.623141 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 22:33:27.630073 master-0 kubenswrapper[33572]: I1204 22:33:27.630034 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.648583 master-0 kubenswrapper[33572]: I1204 22:33:27.637661 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 22:33:27.648583 master-0 kubenswrapper[33572]: I1204 22:33:27.637884 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 22:33:27.648583 master-0 kubenswrapper[33572]: I1204 22:33:27.638391 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 22:33:27.648583 master-0 kubenswrapper[33572]: I1204 22:33:27.638396 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 22:33:27.648583 master-0 kubenswrapper[33572]: I1204 22:33:27.638623 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 22:33:27.664273 master-0 kubenswrapper[33572]: I1204 22:33:27.663727 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 22:33:27.679892 master-0 kubenswrapper[33572]: I1204 22:33:27.679841 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.734681 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.734768 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0fdf3b5f-ebaf-4a7d-9ef1-ddeadd5d0d2a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e57f3d7e-658c-444e-92ff-2da6b71f67b0\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.734792 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/077ecfc2-ed81-4de5-993c-0f1084df9734-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.734840 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-confd\") pod \"rabbitmq-server-0\" 
(UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.734867 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.734891 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-config-data\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.734915 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t99j\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-kube-api-access-6t99j\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.734974 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/077ecfc2-ed81-4de5-993c-0f1084df9734-pod-info\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.735414 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-server-conf\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.735483 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.736828 master-0 kubenswrapper[33572]: I1204 22:33:27.735532 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.837560 master-0 kubenswrapper[33572]: I1204 22:33:27.837021 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838167 master-0 kubenswrapper[33572]: I1204 22:33:27.838133 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838233 master-0 kubenswrapper[33572]: I1204 22:33:27.838197 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-config-data\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838271 master-0 kubenswrapper[33572]: I1204 22:33:27.838258 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t99j\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-kube-api-access-6t99j\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838354 master-0 kubenswrapper[33572]: I1204 22:33:27.838336 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/077ecfc2-ed81-4de5-993c-0f1084df9734-pod-info\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838536 master-0 kubenswrapper[33572]: I1204 22:33:27.838519 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-server-conf\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838579 master-0 kubenswrapper[33572]: I1204 22:33:27.838547 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838579 master-0 kubenswrapper[33572]: I1204 22:33:27.838567 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838654 master-0 kubenswrapper[33572]: I1204 22:33:27.838625 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838712 master-0 kubenswrapper[33572]: I1204 22:33:27.838686 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0fdf3b5f-ebaf-4a7d-9ef1-ddeadd5d0d2a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e57f3d7e-658c-444e-92ff-2da6b71f67b0\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.838762 master-0 kubenswrapper[33572]: I1204 22:33:27.838718 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/077ecfc2-ed81-4de5-993c-0f1084df9734-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 
22:33:27.838900 master-0 kubenswrapper[33572]: I1204 22:33:27.838822 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.844306 master-0 kubenswrapper[33572]: I1204 22:33:27.844274 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.844415 master-0 kubenswrapper[33572]: I1204 22:33:27.844384 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.846239 master-0 kubenswrapper[33572]: I1204 22:33:27.845770 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-config-data\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.846759 master-0 kubenswrapper[33572]: I1204 22:33:27.846360 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/077ecfc2-ed81-4de5-993c-0f1084df9734-server-conf\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.846759 master-0 kubenswrapper[33572]: I1204 22:33:27.846727 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 22:33:27.846759 master-0 kubenswrapper[33572]: I1204 22:33:27.846750 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0fdf3b5f-ebaf-4a7d-9ef1-ddeadd5d0d2a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e57f3d7e-658c-444e-92ff-2da6b71f67b0\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/6522a300b255c1a4f02a905e36959cac96f0679fe44baa273b331168969414a8/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.847630 master-0 kubenswrapper[33572]: I1204 22:33:27.847567 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.850133 master-0 kubenswrapper[33572]: I1204 22:33:27.850093 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/077ecfc2-ed81-4de5-993c-0f1084df9734-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.851677 master-0 kubenswrapper[33572]: I1204 22:33:27.851650 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/077ecfc2-ed81-4de5-993c-0f1084df9734-pod-info\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.852069 master-0 kubenswrapper[33572]: I1204 22:33:27.852045 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/077ecfc2-ed81-4de5-993c-0f1084df9734-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:27.865777 master-0 kubenswrapper[33572]: I1204 22:33:27.865710 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t99j\" (UniqueName: \"kubernetes.io/projected/077ecfc2-ed81-4de5-993c-0f1084df9734-kube-api-access-6t99j\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:28.148494 master-0 kubenswrapper[33572]: I1204 22:33:28.139454 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-59e6feec-7c51-451f-9e81-ae15149b2ff2\" (UniqueName: \"kubernetes.io/csi/topolvm.io^76620731-7fec-460b-a665-c7776f227d11\") pod \"rabbitmq-cell1-server-0\" (UID: \"1d1338e8-405c-4439-ae7d-02034960a5c5\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:28.151888 master-0 kubenswrapper[33572]: I1204 22:33:28.149759 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:33:28.674073 master-0 kubenswrapper[33572]: I1204 22:33:28.672561 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 04 22:33:28.674966 master-0 kubenswrapper[33572]: I1204 22:33:28.674441 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 22:33:28.678756 master-0 kubenswrapper[33572]: I1204 22:33:28.677408 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 04 22:33:28.678756 master-0 kubenswrapper[33572]: I1204 22:33:28.677737 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 04 22:33:28.678756 master-0 kubenswrapper[33572]: I1204 22:33:28.678540 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 04 22:33:28.691888 master-0 kubenswrapper[33572]: I1204 22:33:28.691831 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 22:33:28.776428 master-0 kubenswrapper[33572]: I1204 22:33:28.776350 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.776860 master-0 kubenswrapper[33572]: I1204 22:33:28.776455 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-config-data-default\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.776860 master-0 kubenswrapper[33572]: I1204 22:33:28.776795 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f0454e-05e4-4510-bbf1-b273079b0f1d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.777065 master-0 kubenswrapper[33572]: I1204 22:33:28.777029 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f0454e-05e4-4510-bbf1-b273079b0f1d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.777125 master-0 kubenswrapper[33572]: I1204 22:33:28.777100 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b79705d0-0756-449c-bcee-da458d47afae\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8daa9079-c618-4671-9d20-57cce73087e0\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.777208 master-0 kubenswrapper[33572]: I1204 22:33:28.777177 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzwsk\" (UniqueName: \"kubernetes.io/projected/f5f0454e-05e4-4510-bbf1-b273079b0f1d-kube-api-access-mzwsk\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.777341 master-0 kubenswrapper[33572]: I1204 22:33:28.777311 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5f0454e-05e4-4510-bbf1-b273079b0f1d-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.777383 master-0 kubenswrapper[33572]: I1204 22:33:28.777350 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-kolla-config\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.879274 master-0 kubenswrapper[33572]: I1204 22:33:28.879218 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.879274 master-0 kubenswrapper[33572]: I1204 22:33:28.879270 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-config-data-default\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.879615 master-0 kubenswrapper[33572]: I1204 22:33:28.879344 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f0454e-05e4-4510-bbf1-b273079b0f1d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.879615 master-0 kubenswrapper[33572]: I1204 22:33:28.879403 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f0454e-05e4-4510-bbf1-b273079b0f1d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.879615 master-0 kubenswrapper[33572]: I1204 22:33:28.879425 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b79705d0-0756-449c-bcee-da458d47afae\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8daa9079-c618-4671-9d20-57cce73087e0\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.879615 master-0 kubenswrapper[33572]: I1204 22:33:28.879449 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzwsk\" (UniqueName: \"kubernetes.io/projected/f5f0454e-05e4-4510-bbf1-b273079b0f1d-kube-api-access-mzwsk\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.879615 master-0 kubenswrapper[33572]: I1204 22:33:28.879482 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5f0454e-05e4-4510-bbf1-b273079b0f1d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.879615 master-0 kubenswrapper[33572]: I1204 22:33:28.879512 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.880172 master-0 kubenswrapper[33572]: I1204 22:33:28.880139 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-kolla-config\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.881289 master-0 kubenswrapper[33572]: I1204 22:33:28.881254 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.881918 master-0 kubenswrapper[33572]: I1204 22:33:28.881886 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5f0454e-05e4-4510-bbf1-b273079b0f1d-config-data-default\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.883858 master-0 kubenswrapper[33572]: I1204 22:33:28.883604 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5f0454e-05e4-4510-bbf1-b273079b0f1d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.885202 master-0 kubenswrapper[33572]: I1204 22:33:28.885149 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 22:33:28.885265 master-0 kubenswrapper[33572]: I1204 22:33:28.885196 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b79705d0-0756-449c-bcee-da458d47afae\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8daa9079-c618-4671-9d20-57cce73087e0\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2eb883597410f134fa00f17d41b5646f13a87c1fff414ea9883ec6ea6034a5be/globalmount\"" pod="openstack/openstack-galera-0" Dec 04 22:33:28.885381 master-0 kubenswrapper[33572]: I1204 22:33:28.885315 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f0454e-05e4-4510-bbf1-b273079b0f1d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.890248 master-0 kubenswrapper[33572]: I1204 22:33:28.888866 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f0454e-05e4-4510-bbf1-b273079b0f1d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:28.899194 master-0 kubenswrapper[33572]: I1204 22:33:28.899127 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzwsk\" (UniqueName: \"kubernetes.io/projected/f5f0454e-05e4-4510-bbf1-b273079b0f1d-kube-api-access-mzwsk\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:29.508976 master-0 kubenswrapper[33572]: I1204 22:33:29.508884 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0fdf3b5f-ebaf-4a7d-9ef1-ddeadd5d0d2a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e57f3d7e-658c-444e-92ff-2da6b71f67b0\") pod \"rabbitmq-server-0\" (UID: \"077ecfc2-ed81-4de5-993c-0f1084df9734\") " pod="openstack/rabbitmq-server-0" Dec 04 22:33:29.751039 master-0 kubenswrapper[33572]: I1204 22:33:29.750849 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 22:33:30.020787 master-0 kubenswrapper[33572]: I1204 22:33:30.020732 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 22:33:30.022905 master-0 kubenswrapper[33572]: I1204 22:33:30.022866 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.033291 master-0 kubenswrapper[33572]: I1204 22:33:30.033209 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 04 22:33:30.034530 master-0 kubenswrapper[33572]: I1204 22:33:30.034476 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 04 22:33:30.034901 master-0 kubenswrapper[33572]: I1204 22:33:30.034854 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 04 22:33:30.048659 master-0 kubenswrapper[33572]: I1204 22:33:30.048600 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 22:33:30.123162 master-0 kubenswrapper[33572]: I1204 22:33:30.123090 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd8k4\" (UniqueName: \"kubernetes.io/projected/a95bf966-679b-4bad-8693-f1becf39685d-kube-api-access-xd8k4\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.123162 master-0 kubenswrapper[33572]: I1204 22:33:30.123150 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2d361ab2-b540-40cd-967a-33c1a821edfa\" (UniqueName: \"kubernetes.io/csi/topolvm.io^66ca3ba2-6478-489e-9cf4-94d1129d8448\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.123162 master-0 kubenswrapper[33572]: I1204 22:33:30.123173 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95bf966-679b-4bad-8693-f1becf39685d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.123634 master-0 kubenswrapper[33572]: I1204 22:33:30.123474 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.123672 master-0 kubenswrapper[33572]: I1204 22:33:30.123654 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a95bf966-679b-4bad-8693-f1becf39685d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.123724 master-0 kubenswrapper[33572]: I1204 22:33:30.123697 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.123827 master-0 kubenswrapper[33572]: I1204 22:33:30.123777 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/a95bf966-679b-4bad-8693-f1becf39685d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.123873 master-0 kubenswrapper[33572]: I1204 22:33:30.123850 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.227427 master-0 kubenswrapper[33572]: I1204 22:33:30.227315 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a95bf966-679b-4bad-8693-f1becf39685d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.227427 master-0 kubenswrapper[33572]: I1204 22:33:30.227444 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.228130 master-0 kubenswrapper[33572]: I1204 22:33:30.227542 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a95bf966-679b-4bad-8693-f1becf39685d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.228130 master-0 kubenswrapper[33572]: I1204 22:33:30.227581 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.228130 master-0 kubenswrapper[33572]: I1204 22:33:30.227810 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd8k4\" (UniqueName: \"kubernetes.io/projected/a95bf966-679b-4bad-8693-f1becf39685d-kube-api-access-xd8k4\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.228130 master-0 kubenswrapper[33572]: I1204 22:33:30.227846 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2d361ab2-b540-40cd-967a-33c1a821edfa\" (UniqueName: \"kubernetes.io/csi/topolvm.io^66ca3ba2-6478-489e-9cf4-94d1129d8448\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.228130 master-0 kubenswrapper[33572]: I1204 22:33:30.227865 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95bf966-679b-4bad-8693-f1becf39685d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.228130 master-0 kubenswrapper[33572]: I1204 22:33:30.227917 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.228595 master-0 kubenswrapper[33572]: I1204 22:33:30.228569 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a95bf966-679b-4bad-8693-f1becf39685d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.229226 master-0 kubenswrapper[33572]: I1204 22:33:30.229164 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.229672 master-0 kubenswrapper[33572]: I1204 22:33:30.229643 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.230050 master-0 kubenswrapper[33572]: I1204 22:33:30.230023 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95bf966-679b-4bad-8693-f1becf39685d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.231125 master-0 kubenswrapper[33572]: I1204 22:33:30.231035 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 22:33:30.231125 master-0 kubenswrapper[33572]: I1204 22:33:30.231071 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2d361ab2-b540-40cd-967a-33c1a821edfa\" (UniqueName: \"kubernetes.io/csi/topolvm.io^66ca3ba2-6478-489e-9cf4-94d1129d8448\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c88cc077a99b2d8ed973c7f133fa2f50a58e71eb53e47167909cf063df496941/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.234364 master-0 kubenswrapper[33572]: I1204 22:33:30.234289 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95bf966-679b-4bad-8693-f1becf39685d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.239810 master-0 kubenswrapper[33572]: I1204 22:33:30.239750 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a95bf966-679b-4bad-8693-f1becf39685d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.250976 master-0 kubenswrapper[33572]: I1204 22:33:30.250922 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd8k4\" (UniqueName: \"kubernetes.io/projected/a95bf966-679b-4bad-8693-f1becf39685d-kube-api-access-xd8k4\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:30.560589 master-0 kubenswrapper[33572]: I1204 22:33:30.560489 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b79705d0-0756-449c-bcee-da458d47afae\" (UniqueName: \"kubernetes.io/csi/topolvm.io^8daa9079-c618-4671-9d20-57cce73087e0\") pod \"openstack-galera-0\" (UID: \"f5f0454e-05e4-4510-bbf1-b273079b0f1d\") " pod="openstack/openstack-galera-0" Dec 04 22:33:30.792530 master-0 kubenswrapper[33572]: I1204 22:33:30.792434 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 22:33:31.566985 master-0 kubenswrapper[33572]: I1204 22:33:31.566886 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2d361ab2-b540-40cd-967a-33c1a821edfa\" (UniqueName: \"kubernetes.io/csi/topolvm.io^66ca3ba2-6478-489e-9cf4-94d1129d8448\") pod \"openstack-cell1-galera-0\" (UID: \"a95bf966-679b-4bad-8693-f1becf39685d\") " pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:33.676988 master-0 kubenswrapper[33572]: I1204 22:33:33.676904 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 22:33:34.571321 master-0 kubenswrapper[33572]: I1204 22:33:34.571237 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rk7z7"] Dec 04 22:33:34.574947 master-0 kubenswrapper[33572]: I1204 22:33:34.573961 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.582180 master-0 kubenswrapper[33572]: I1204 22:33:34.582102 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 04 22:33:34.582351 master-0 kubenswrapper[33572]: I1204 22:33:34.582298 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 04 22:33:34.599534 master-0 kubenswrapper[33572]: I1204 22:33:34.588996 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rk7z7"] Dec 04 22:33:34.634996 master-0 kubenswrapper[33572]: I1204 22:33:34.634589 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7512b9d-7adc-4af3-a29f-b154799338cb-combined-ca-bundle\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.634996 master-0 kubenswrapper[33572]: I1204 22:33:34.634822 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-run-ovn\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.634996 master-0 kubenswrapper[33572]: I1204 22:33:34.634851 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7512b9d-7adc-4af3-a29f-b154799338cb-scripts\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.634996 master-0 kubenswrapper[33572]: I1204 22:33:34.634998 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-log-ovn\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.635430 master-0 kubenswrapper[33572]: I1204 22:33:34.635140 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7512b9d-7adc-4af3-a29f-b154799338cb-ovn-controller-tls-certs\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.635430 master-0 kubenswrapper[33572]: I1204 22:33:34.635201 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42jc8\" (UniqueName: \"kubernetes.io/projected/b7512b9d-7adc-4af3-a29f-b154799338cb-kube-api-access-42jc8\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.635430 master-0 kubenswrapper[33572]: I1204 22:33:34.635256 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-run\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.678285 master-0 kubenswrapper[33572]: I1204 22:33:34.678226 33572 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovn-controller-ovs-mk92t"] Dec 04 22:33:34.683622 master-0 kubenswrapper[33572]: I1204 22:33:34.683573 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.714155 master-0 kubenswrapper[33572]: I1204 22:33:34.714111 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mk92t"] Dec 04 22:33:34.737163 master-0 kubenswrapper[33572]: I1204 22:33:34.737082 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-lib\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.737400 master-0 kubenswrapper[33572]: I1204 22:33:34.737217 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7512b9d-7adc-4af3-a29f-b154799338cb-ovn-controller-tls-certs\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.737400 master-0 kubenswrapper[33572]: I1204 22:33:34.737265 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42jc8\" (UniqueName: \"kubernetes.io/projected/b7512b9d-7adc-4af3-a29f-b154799338cb-kube-api-access-42jc8\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.737400 master-0 kubenswrapper[33572]: I1204 22:33:34.737288 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p6qf\" (UniqueName: \"kubernetes.io/projected/477cfefc-eaf1-4469-b639-74b14e997c6b-kube-api-access-5p6qf\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.737400 master-0 kubenswrapper[33572]: I1204 22:33:34.737310 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/477cfefc-eaf1-4469-b639-74b14e997c6b-scripts\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.737400 master-0 kubenswrapper[33572]: I1204 22:33:34.737346 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-run\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.737400 master-0 kubenswrapper[33572]: I1204 22:33:34.737381 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-log\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.737400 master-0 kubenswrapper[33572]: I1204 22:33:34.737402 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7512b9d-7adc-4af3-a29f-b154799338cb-combined-ca-bundle\") pod \"ovn-controller-rk7z7\" (UID: 
\"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.737767 master-0 kubenswrapper[33572]: I1204 22:33:34.737423 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-run-ovn\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.737767 master-0 kubenswrapper[33572]: I1204 22:33:34.737442 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7512b9d-7adc-4af3-a29f-b154799338cb-scripts\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.737767 master-0 kubenswrapper[33572]: I1204 22:33:34.737469 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-run\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.737767 master-0 kubenswrapper[33572]: I1204 22:33:34.737530 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-etc-ovs\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.737767 master-0 kubenswrapper[33572]: I1204 22:33:34.737571 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-log-ovn\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.738124 master-0 kubenswrapper[33572]: I1204 22:33:34.738103 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-log-ovn\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.738331 master-0 kubenswrapper[33572]: I1204 22:33:34.738277 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-run-ovn\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.738482 master-0 kubenswrapper[33572]: I1204 22:33:34.738396 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b7512b9d-7adc-4af3-a29f-b154799338cb-var-run\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.740246 master-0 kubenswrapper[33572]: I1204 22:33:34.740220 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7512b9d-7adc-4af3-a29f-b154799338cb-scripts\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.741834 master-0 kubenswrapper[33572]: I1204 
22:33:34.741781 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7512b9d-7adc-4af3-a29f-b154799338cb-combined-ca-bundle\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.743636 master-0 kubenswrapper[33572]: I1204 22:33:34.743594 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7512b9d-7adc-4af3-a29f-b154799338cb-ovn-controller-tls-certs\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.777612 master-0 kubenswrapper[33572]: I1204 22:33:34.777491 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42jc8\" (UniqueName: \"kubernetes.io/projected/b7512b9d-7adc-4af3-a29f-b154799338cb-kube-api-access-42jc8\") pod \"ovn-controller-rk7z7\" (UID: \"b7512b9d-7adc-4af3-a29f-b154799338cb\") " pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:34.839940 master-0 kubenswrapper[33572]: I1204 22:33:34.839807 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-run\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.839940 master-0 kubenswrapper[33572]: I1204 22:33:34.839885 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-etc-ovs\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.840185 master-0 kubenswrapper[33572]: I1204 22:33:34.839961 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-lib\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.840185 master-0 kubenswrapper[33572]: I1204 22:33:34.840057 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p6qf\" (UniqueName: \"kubernetes.io/projected/477cfefc-eaf1-4469-b639-74b14e997c6b-kube-api-access-5p6qf\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.840185 master-0 kubenswrapper[33572]: I1204 22:33:34.840079 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/477cfefc-eaf1-4469-b639-74b14e997c6b-scripts\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.840185 master-0 kubenswrapper[33572]: I1204 22:33:34.840136 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-log\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.840417 master-0 kubenswrapper[33572]: I1204 22:33:34.840397 33572 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-log\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.840493 master-0 kubenswrapper[33572]: I1204 22:33:34.840478 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-run\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.840651 master-0 kubenswrapper[33572]: I1204 22:33:34.840633 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-etc-ovs\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.840799 master-0 kubenswrapper[33572]: I1204 22:33:34.840781 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/477cfefc-eaf1-4469-b639-74b14e997c6b-var-lib\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.843322 master-0 kubenswrapper[33572]: I1204 22:33:34.843281 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/477cfefc-eaf1-4469-b639-74b14e997c6b-scripts\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.863706 master-0 kubenswrapper[33572]: I1204 22:33:34.863632 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p6qf\" (UniqueName: \"kubernetes.io/projected/477cfefc-eaf1-4469-b639-74b14e997c6b-kube-api-access-5p6qf\") pod \"ovn-controller-ovs-mk92t\" (UID: \"477cfefc-eaf1-4469-b639-74b14e997c6b\") " pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:34.937971 master-0 kubenswrapper[33572]: I1204 22:33:34.937898 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:35.017630 master-0 kubenswrapper[33572]: I1204 22:33:35.017574 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:36.571215 master-0 kubenswrapper[33572]: I1204 22:33:36.571143 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 22:33:36.573670 master-0 kubenswrapper[33572]: I1204 22:33:36.573632 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.579318 master-0 kubenswrapper[33572]: I1204 22:33:36.579238 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 04 22:33:36.579585 master-0 kubenswrapper[33572]: I1204 22:33:36.579544 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 04 22:33:36.579813 master-0 kubenswrapper[33572]: I1204 22:33:36.579687 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 04 22:33:36.589726 master-0 kubenswrapper[33572]: I1204 22:33:36.589681 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 04 22:33:36.607632 master-0 kubenswrapper[33572]: I1204 22:33:36.607580 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 22:33:36.674572 master-0 kubenswrapper[33572]: I1204 22:33:36.674494 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d4383cda-877b-4a22-aae7-6af4d72cd93f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^42d04af6-cdd0-43e5-a017-9a46c1e22de9\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.674779 master-0 kubenswrapper[33572]: I1204 22:33:36.674613 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.674779 master-0 kubenswrapper[33572]: I1204 22:33:36.674676 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvr7p\" (UniqueName: \"kubernetes.io/projected/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-kube-api-access-fvr7p\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.674779 master-0 kubenswrapper[33572]: I1204 22:33:36.674765 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.674883 master-0 kubenswrapper[33572]: I1204 22:33:36.674831 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.674883 master-0 kubenswrapper[33572]: I1204 22:33:36.674868 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.674946 master-0 kubenswrapper[33572]: I1204 22:33:36.674902 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.674946 master-0 kubenswrapper[33572]: I1204 22:33:36.674938 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.777058 master-0 kubenswrapper[33572]: I1204 22:33:36.776999 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.777279 master-0 kubenswrapper[33572]: I1204 22:33:36.777076 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvr7p\" (UniqueName: \"kubernetes.io/projected/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-kube-api-access-fvr7p\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.777279 master-0 kubenswrapper[33572]: I1204 22:33:36.777132 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.777279 master-0 kubenswrapper[33572]: I1204 22:33:36.777181 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.777279 master-0 kubenswrapper[33572]: I1204 22:33:36.777208 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.777279 master-0 kubenswrapper[33572]: I1204 22:33:36.777226 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.777279 master-0 kubenswrapper[33572]: I1204 22:33:36.777250 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.777279 master-0 kubenswrapper[33572]: I1204 22:33:36.777274 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d4383cda-877b-4a22-aae7-6af4d72cd93f\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^42d04af6-cdd0-43e5-a017-9a46c1e22de9\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.781223 master-0 kubenswrapper[33572]: I1204 22:33:36.778554 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.781223 master-0 kubenswrapper[33572]: I1204 22:33:36.779670 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.781223 master-0 kubenswrapper[33572]: I1204 22:33:36.780304 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 22:33:36.781223 master-0 kubenswrapper[33572]: I1204 22:33:36.780325 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d4383cda-877b-4a22-aae7-6af4d72cd93f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^42d04af6-cdd0-43e5-a017-9a46c1e22de9\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/0ce1314ee1cb5ad5da5a272b49aa49ed2a32caeb91681e655c578ddd69db0e85/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.781223 master-0 kubenswrapper[33572]: I1204 22:33:36.781164 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.790454 master-0 kubenswrapper[33572]: I1204 22:33:36.790390 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.790942 master-0 kubenswrapper[33572]: I1204 22:33:36.790908 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.816533 master-0 kubenswrapper[33572]: I1204 22:33:36.816443 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:36.816766 master-0 kubenswrapper[33572]: I1204 22:33:36.816719 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvr7p\" (UniqueName: \"kubernetes.io/projected/bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c-kube-api-access-fvr7p\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 
22:33:36.841331 master-0 kubenswrapper[33572]: I1204 22:33:36.841201 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 22:33:36.846469 master-0 kubenswrapper[33572]: I1204 22:33:36.846418 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.849052 master-0 kubenswrapper[33572]: I1204 22:33:36.849024 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 04 22:33:36.850620 master-0 kubenswrapper[33572]: I1204 22:33:36.849287 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 04 22:33:36.850620 master-0 kubenswrapper[33572]: I1204 22:33:36.849555 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 04 22:33:36.855981 master-0 kubenswrapper[33572]: I1204 22:33:36.855876 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 22:33:36.878962 master-0 kubenswrapper[33572]: I1204 22:33:36.878853 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b00720a-09b8-4a70-a585-65807c6fbd56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.879381 master-0 kubenswrapper[33572]: I1204 22:33:36.879358 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.879701 master-0 kubenswrapper[33572]: I1204 22:33:36.879641 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m995g\" (UniqueName: \"kubernetes.io/projected/3b00720a-09b8-4a70-a585-65807c6fbd56-kube-api-access-m995g\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.880276 master-0 kubenswrapper[33572]: I1204 22:33:36.880243 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b00720a-09b8-4a70-a585-65807c6fbd56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.880490 master-0 kubenswrapper[33572]: I1204 22:33:36.880471 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.880688 master-0 kubenswrapper[33572]: I1204 22:33:36.880657 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9fbbb5d0-f1e9-46aa-aab8-69a648a34a2a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^95685fe3-0d5c-47ba-b6c2-6330a77b8e7b\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.880818 master-0 kubenswrapper[33572]: I1204 
22:33:36.880800 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b00720a-09b8-4a70-a585-65807c6fbd56-config\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.880962 master-0 kubenswrapper[33572]: I1204 22:33:36.880943 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.984304 master-0 kubenswrapper[33572]: I1204 22:33:36.984196 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.984304 master-0 kubenswrapper[33572]: I1204 22:33:36.984306 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m995g\" (UniqueName: \"kubernetes.io/projected/3b00720a-09b8-4a70-a585-65807c6fbd56-kube-api-access-m995g\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.984584 master-0 kubenswrapper[33572]: I1204 22:33:36.984365 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b00720a-09b8-4a70-a585-65807c6fbd56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.984584 master-0 kubenswrapper[33572]: I1204 22:33:36.984404 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.984584 master-0 kubenswrapper[33572]: I1204 22:33:36.984445 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b00720a-09b8-4a70-a585-65807c6fbd56-config\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.984584 master-0 kubenswrapper[33572]: I1204 22:33:36.984471 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9fbbb5d0-f1e9-46aa-aab8-69a648a34a2a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^95685fe3-0d5c-47ba-b6c2-6330a77b8e7b\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.984584 master-0 kubenswrapper[33572]: I1204 22:33:36.984526 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.984743 master-0 kubenswrapper[33572]: I1204 22:33:36.984612 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b00720a-09b8-4a70-a585-65807c6fbd56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.985350 master-0 kubenswrapper[33572]: I1204 22:33:36.985325 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3b00720a-09b8-4a70-a585-65807c6fbd56-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.987932 master-0 kubenswrapper[33572]: I1204 22:33:36.987901 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b00720a-09b8-4a70-a585-65807c6fbd56-config\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.988100 master-0 kubenswrapper[33572]: I1204 22:33:36.987901 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b00720a-09b8-4a70-a585-65807c6fbd56-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.990925 master-0 kubenswrapper[33572]: I1204 22:33:36.990890 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 22:33:36.990990 master-0 kubenswrapper[33572]: I1204 22:33:36.990934 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9fbbb5d0-f1e9-46aa-aab8-69a648a34a2a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^95685fe3-0d5c-47ba-b6c2-6330a77b8e7b\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/cac9700404dd9f029aea68b73aaaad19131da3c70aef92edb28d8a0cc3b458d9/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:36.991461 master-0 kubenswrapper[33572]: I1204 22:33:36.991416 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:37.014402 master-0 kubenswrapper[33572]: I1204 22:33:37.014359 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:37.014669 master-0 kubenswrapper[33572]: I1204 22:33:37.014488 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b00720a-09b8-4a70-a585-65807c6fbd56-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:37.027023 master-0 kubenswrapper[33572]: I1204 22:33:37.026963 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m995g\" (UniqueName: \"kubernetes.io/projected/3b00720a-09b8-4a70-a585-65807c6fbd56-kube-api-access-m995g\") pod \"ovsdbserver-sb-0\" 
(UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:38.273193 master-0 kubenswrapper[33572]: I1204 22:33:38.273117 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d4383cda-877b-4a22-aae7-6af4d72cd93f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^42d04af6-cdd0-43e5-a017-9a46c1e22de9\") pod \"ovsdbserver-nb-0\" (UID: \"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c\") " pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:38.423106 master-0 kubenswrapper[33572]: I1204 22:33:38.423000 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:39.647465 master-0 kubenswrapper[33572]: I1204 22:33:39.647406 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9fbbb5d0-f1e9-46aa-aab8-69a648a34a2a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^95685fe3-0d5c-47ba-b6c2-6330a77b8e7b\") pod \"ovsdbserver-sb-0\" (UID: \"3b00720a-09b8-4a70-a585-65807c6fbd56\") " pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:39.914635 master-0 kubenswrapper[33572]: I1204 22:33:39.913497 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:42.605281 master-0 kubenswrapper[33572]: I1204 22:33:42.600200 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 22:33:42.610521 master-0 kubenswrapper[33572]: I1204 22:33:42.608783 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 22:33:42.628065 master-0 kubenswrapper[33572]: W1204 22:33:42.627987 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb294eb_fff6_4a11_8e4c_b80bc0a64045.slice/crio-0a03552160bc9c963d7cb805188015f3322fdfcec572f2fc5e9cb7ae42555825 WatchSource:0}: Error finding container 0a03552160bc9c963d7cb805188015f3322fdfcec572f2fc5e9cb7ae42555825: Status 404 returned error can't find the container with id 0a03552160bc9c963d7cb805188015f3322fdfcec572f2fc5e9cb7ae42555825 Dec 04 22:33:42.632684 master-0 kubenswrapper[33572]: W1204 22:33:42.631623 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d1338e8_405c_4439_ae7d_02034960a5c5.slice/crio-4559a422d9d5901b3bbdda8eb140512fe37ed362da8d6a07581b93e3f6be5f29 WatchSource:0}: Error finding container 4559a422d9d5901b3bbdda8eb140512fe37ed362da8d6a07581b93e3f6be5f29: Status 404 returned error can't find the container with id 4559a422d9d5901b3bbdda8eb140512fe37ed362da8d6a07581b93e3f6be5f29 Dec 04 22:33:42.797386 master-0 kubenswrapper[33572]: I1204 22:33:42.797318 33572 generic.go:334] "Generic (PLEG): container finished" podID="9210fc14-13be-430c-a269-be48b495a428" containerID="7de6d891a052880605bc1258a0f158129737cf872a2328a68ccfc82bf4d26184" exitCode=0 Dec 04 22:33:42.797666 master-0 kubenswrapper[33572]: I1204 22:33:42.797411 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" event={"ID":"9210fc14-13be-430c-a269-be48b495a428","Type":"ContainerDied","Data":"7de6d891a052880605bc1258a0f158129737cf872a2328a68ccfc82bf4d26184"} Dec 04 22:33:42.800553 master-0 kubenswrapper[33572]: I1204 22:33:42.800438 33572 generic.go:334] "Generic (PLEG): container finished" podID="2fa144a4-7310-414a-a941-f9e08aa63084" containerID="c9a2deac7d2ac4a938d0ffa5325d4b0bf3ed50bda99ae33e992d6c45f8037fb9" exitCode=0 Dec 04 
22:33:42.800655 master-0 kubenswrapper[33572]: I1204 22:33:42.800521 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" event={"ID":"2fa144a4-7310-414a-a941-f9e08aa63084","Type":"ContainerDied","Data":"c9a2deac7d2ac4a938d0ffa5325d4b0bf3ed50bda99ae33e992d6c45f8037fb9"} Dec 04 22:33:42.803400 master-0 kubenswrapper[33572]: I1204 22:33:42.803323 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d1338e8-405c-4439-ae7d-02034960a5c5","Type":"ContainerStarted","Data":"4559a422d9d5901b3bbdda8eb140512fe37ed362da8d6a07581b93e3f6be5f29"} Dec 04 22:33:42.814804 master-0 kubenswrapper[33572]: I1204 22:33:42.814761 33572 generic.go:334] "Generic (PLEG): container finished" podID="6051b2e6-2761-4a90-a2eb-8b437af545b5" containerID="7725dc5283673288c9f42ff30edbc23d91e2e08e3427d849083d752905613e0e" exitCode=0 Dec 04 22:33:42.814989 master-0 kubenswrapper[33572]: I1204 22:33:42.814944 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" event={"ID":"6051b2e6-2761-4a90-a2eb-8b437af545b5","Type":"ContainerDied","Data":"7725dc5283673288c9f42ff30edbc23d91e2e08e3427d849083d752905613e0e"} Dec 04 22:33:42.818048 master-0 kubenswrapper[33572]: I1204 22:33:42.818025 33572 generic.go:334] "Generic (PLEG): container finished" podID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" containerID="779e49b0391065149888853f93e62cfc4f49526580c8ae7920ab50b6735fe22d" exitCode=0 Dec 04 22:33:42.818149 master-0 kubenswrapper[33572]: I1204 22:33:42.818072 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" event={"ID":"13ad3678-93e9-4633-a3d8-0ca651d28bf2","Type":"ContainerDied","Data":"779e49b0391065149888853f93e62cfc4f49526580c8ae7920ab50b6735fe22d"} Dec 04 22:33:42.820619 master-0 kubenswrapper[33572]: I1204 22:33:42.820587 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"feb294eb-fff6-4a11-8e4c-b80bc0a64045","Type":"ContainerStarted","Data":"0a03552160bc9c963d7cb805188015f3322fdfcec572f2fc5e9cb7ae42555825"} Dec 04 22:33:43.534190 master-0 kubenswrapper[33572]: I1204 22:33:43.534108 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:43.547773 master-0 kubenswrapper[33572]: I1204 22:33:43.547721 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:43.661149 master-0 kubenswrapper[33572]: I1204 22:33:43.661083 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rk7z7"] Dec 04 22:33:43.675763 master-0 kubenswrapper[33572]: I1204 22:33:43.675693 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 22:33:43.679349 master-0 kubenswrapper[33572]: I1204 22:33:43.678475 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-dns-svc\") pod \"6051b2e6-2761-4a90-a2eb-8b437af545b5\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " Dec 04 22:33:43.679349 master-0 kubenswrapper[33572]: I1204 22:33:43.678739 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa144a4-7310-414a-a941-f9e08aa63084-config\") pod \"2fa144a4-7310-414a-a941-f9e08aa63084\" (UID: \"2fa144a4-7310-414a-a941-f9e08aa63084\") " Dec 04 22:33:43.680963 master-0 kubenswrapper[33572]: W1204 22:33:43.680155 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5f0454e_05e4_4510_bbf1_b273079b0f1d.slice/crio-82ae89b8b2d81ff4da1f61befcb48fbbf3e0eed46fa3eaa1614637308e45316f WatchSource:0}: Error finding container 82ae89b8b2d81ff4da1f61befcb48fbbf3e0eed46fa3eaa1614637308e45316f: Status 404 returned error can't find the container with id 82ae89b8b2d81ff4da1f61befcb48fbbf3e0eed46fa3eaa1614637308e45316f Dec 04 22:33:43.703269 master-0 kubenswrapper[33572]: I1204 22:33:43.703103 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fa144a4-7310-414a-a941-f9e08aa63084-config" (OuterVolumeSpecName: "config") pod "2fa144a4-7310-414a-a941-f9e08aa63084" (UID: "2fa144a4-7310-414a-a941-f9e08aa63084"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:33:43.710431 master-0 kubenswrapper[33572]: I1204 22:33:43.710389 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 22:33:43.719948 master-0 kubenswrapper[33572]: I1204 22:33:43.719881 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmssp\" (UniqueName: \"kubernetes.io/projected/6051b2e6-2761-4a90-a2eb-8b437af545b5-kube-api-access-fmssp\") pod \"6051b2e6-2761-4a90-a2eb-8b437af545b5\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " Dec 04 22:33:43.720294 master-0 kubenswrapper[33572]: I1204 22:33:43.720196 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvdg2\" (UniqueName: \"kubernetes.io/projected/2fa144a4-7310-414a-a941-f9e08aa63084-kube-api-access-pvdg2\") pod \"2fa144a4-7310-414a-a941-f9e08aa63084\" (UID: \"2fa144a4-7310-414a-a941-f9e08aa63084\") " Dec 04 22:33:43.720294 master-0 kubenswrapper[33572]: I1204 22:33:43.720237 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-config\") pod \"6051b2e6-2761-4a90-a2eb-8b437af545b5\" (UID: \"6051b2e6-2761-4a90-a2eb-8b437af545b5\") " Dec 04 22:33:43.720294 master-0 kubenswrapper[33572]: I1204 22:33:43.719900 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6051b2e6-2761-4a90-a2eb-8b437af545b5" (UID: "6051b2e6-2761-4a90-a2eb-8b437af545b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:33:43.725044 master-0 kubenswrapper[33572]: I1204 22:33:43.724360 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa144a4-7310-414a-a941-f9e08aa63084-kube-api-access-pvdg2" (OuterVolumeSpecName: "kube-api-access-pvdg2") pod "2fa144a4-7310-414a-a941-f9e08aa63084" (UID: "2fa144a4-7310-414a-a941-f9e08aa63084"). InnerVolumeSpecName "kube-api-access-pvdg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:33:43.725044 master-0 kubenswrapper[33572]: I1204 22:33:43.724880 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvdg2\" (UniqueName: \"kubernetes.io/projected/2fa144a4-7310-414a-a941-f9e08aa63084-kube-api-access-pvdg2\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:43.725044 master-0 kubenswrapper[33572]: I1204 22:33:43.724937 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:43.725044 master-0 kubenswrapper[33572]: I1204 22:33:43.724958 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa144a4-7310-414a-a941-f9e08aa63084-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:43.753647 master-0 kubenswrapper[33572]: I1204 22:33:43.753567 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 22:33:43.773779 master-0 kubenswrapper[33572]: I1204 22:33:43.773643 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6051b2e6-2761-4a90-a2eb-8b437af545b5-kube-api-access-fmssp" (OuterVolumeSpecName: "kube-api-access-fmssp") pod "6051b2e6-2761-4a90-a2eb-8b437af545b5" (UID: "6051b2e6-2761-4a90-a2eb-8b437af545b5"). InnerVolumeSpecName "kube-api-access-fmssp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:33:43.793728 master-0 kubenswrapper[33572]: I1204 22:33:43.793665 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-config" (OuterVolumeSpecName: "config") pod "6051b2e6-2761-4a90-a2eb-8b437af545b5" (UID: "6051b2e6-2761-4a90-a2eb-8b437af545b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:33:43.827045 master-0 kubenswrapper[33572]: I1204 22:33:43.826982 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6051b2e6-2761-4a90-a2eb-8b437af545b5-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:43.827045 master-0 kubenswrapper[33572]: I1204 22:33:43.827041 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmssp\" (UniqueName: \"kubernetes.io/projected/6051b2e6-2761-4a90-a2eb-8b437af545b5-kube-api-access-fmssp\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:43.864070 master-0 kubenswrapper[33572]: I1204 22:33:43.859831 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" event={"ID":"6051b2e6-2761-4a90-a2eb-8b437af545b5","Type":"ContainerDied","Data":"4844456a0268c256b0ca246c23f64864da9b2a61c284b5021b8bc967fb431f09"} Dec 04 22:33:43.864070 master-0 kubenswrapper[33572]: I1204 22:33:43.859919 33572 scope.go:117] "RemoveContainer" containerID="7725dc5283673288c9f42ff30edbc23d91e2e08e3427d849083d752905613e0e" Dec 04 22:33:43.864070 master-0 kubenswrapper[33572]: I1204 22:33:43.861707 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-vn5dx" Dec 04 22:33:43.867105 master-0 kubenswrapper[33572]: I1204 22:33:43.867051 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" event={"ID":"13ad3678-93e9-4633-a3d8-0ca651d28bf2","Type":"ContainerStarted","Data":"8cb4753b03872ab047f2684fe1272bd0ea0c2912cf120d2e6afbfe21862c4afe"} Dec 04 22:33:43.869230 master-0 kubenswrapper[33572]: I1204 22:33:43.869203 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:43.870474 master-0 kubenswrapper[33572]: W1204 22:33:43.870443 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod477cfefc_eaf1_4469_b639_74b14e997c6b.slice/crio-0c0ed96f9181880f38309da0b097be8bc396bc997a26ef4a2f3cb5cfc06238e5 WatchSource:0}: Error finding container 0c0ed96f9181880f38309da0b097be8bc396bc997a26ef4a2f3cb5cfc06238e5: Status 404 returned error can't find the container with id 0c0ed96f9181880f38309da0b097be8bc396bc997a26ef4a2f3cb5cfc06238e5 Dec 04 22:33:43.873921 master-0 kubenswrapper[33572]: I1204 22:33:43.873839 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7" event={"ID":"b7512b9d-7adc-4af3-a29f-b154799338cb","Type":"ContainerStarted","Data":"f7cf0ba0bedff4f1bd63fa5bd4a88493a22330e62e188f447fc94c38c862d8b5"} Dec 04 22:33:43.876077 master-0 kubenswrapper[33572]: I1204 22:33:43.876005 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"077ecfc2-ed81-4de5-993c-0f1084df9734","Type":"ContainerStarted","Data":"a2bec35a2687546791de3fad353335f15b7e66d23ea265d2c195f4a88c6d8986"} Dec 04 22:33:43.879412 master-0 kubenswrapper[33572]: I1204 22:33:43.879352 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mk92t"] Dec 04 22:33:43.889037 master-0 kubenswrapper[33572]: I1204 22:33:43.888935 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" event={"ID":"9210fc14-13be-430c-a269-be48b495a428","Type":"ContainerStarted","Data":"d91ca408628685973ee5f939d43797e6800953db5f0ac1bd2220d2cd13c1e724"} Dec 04 22:33:43.889240 master-0 kubenswrapper[33572]: I1204 22:33:43.889174 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:43.890874 master-0 kubenswrapper[33572]: I1204 22:33:43.890838 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" event={"ID":"2fa144a4-7310-414a-a941-f9e08aa63084","Type":"ContainerDied","Data":"dcb4f3020eac496e1c001a00f4f666b9c8433a51e8e7000cd9a6f35a86fe6328"} Dec 04 22:33:43.890996 master-0 kubenswrapper[33572]: I1204 22:33:43.890970 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs" Dec 04 22:33:43.893873 master-0 kubenswrapper[33572]: I1204 22:33:43.893818 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5f0454e-05e4-4510-bbf1-b273079b0f1d","Type":"ContainerStarted","Data":"82ae89b8b2d81ff4da1f61befcb48fbbf3e0eed46fa3eaa1614637308e45316f"} Dec 04 22:33:43.894671 master-0 kubenswrapper[33572]: I1204 22:33:43.894595 33572 scope.go:117] "RemoveContainer" containerID="c9a2deac7d2ac4a938d0ffa5325d4b0bf3ed50bda99ae33e992d6c45f8037fb9" Dec 04 22:33:43.896405 master-0 kubenswrapper[33572]: I1204 22:33:43.896320 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a95bf966-679b-4bad-8693-f1becf39685d","Type":"ContainerStarted","Data":"0f20f2ca930142dd16c6ab40da959f55d09cd537cd8330da5c28363707460c82"} Dec 04 22:33:43.915974 master-0 kubenswrapper[33572]: I1204 22:33:43.915850 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" podStartSLOduration=3.023108885 podStartE2EDuration="21.915814184s" podCreationTimestamp="2025-12-04 22:33:22 +0000 UTC" firstStartedPulling="2025-12-04 22:33:23.353250033 +0000 UTC m=+867.080775682" lastFinishedPulling="2025-12-04 22:33:42.245955322 +0000 UTC m=+885.973480981" observedRunningTime="2025-12-04 22:33:43.905759484 +0000 UTC m=+887.633285153" watchObservedRunningTime="2025-12-04 22:33:43.915814184 +0000 UTC m=+887.643339833" Dec 04 22:33:43.959615 master-0 kubenswrapper[33572]: I1204 22:33:43.959261 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-vn5dx"] Dec 04 22:33:43.972672 master-0 kubenswrapper[33572]: I1204 22:33:43.972562 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-vn5dx"] Dec 04 22:33:43.983267 master-0 kubenswrapper[33572]: I1204 22:33:43.983177 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" podStartSLOduration=2.976568956 podStartE2EDuration="21.983159392s" podCreationTimestamp="2025-12-04 22:33:22 +0000 UTC" firstStartedPulling="2025-12-04 22:33:23.057973913 +0000 UTC m=+866.785499562" lastFinishedPulling="2025-12-04 22:33:42.064564349 +0000 UTC m=+885.792089998" observedRunningTime="2025-12-04 22:33:43.976411954 +0000 UTC m=+887.703937603" watchObservedRunningTime="2025-12-04 22:33:43.983159392 +0000 UTC m=+887.710685041" Dec 04 22:33:44.018684 master-0 kubenswrapper[33572]: I1204 22:33:44.018626 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs"] Dec 04 22:33:44.030881 master-0 kubenswrapper[33572]: I1204 22:33:44.030807 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-dwxvs"] Dec 04 22:33:44.536923 master-0 kubenswrapper[33572]: I1204 22:33:44.536862 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa144a4-7310-414a-a941-f9e08aa63084" path="/var/lib/kubelet/pods/2fa144a4-7310-414a-a941-f9e08aa63084/volumes" Dec 04 22:33:44.537435 master-0 kubenswrapper[33572]: I1204 22:33:44.537405 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6051b2e6-2761-4a90-a2eb-8b437af545b5" path="/var/lib/kubelet/pods/6051b2e6-2761-4a90-a2eb-8b437af545b5/volumes" Dec 04 22:33:44.739537 master-0 kubenswrapper[33572]: I1204 22:33:44.739012 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Dec 04 22:33:44.850067 master-0 kubenswrapper[33572]: W1204 22:33:44.849951 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b00720a_09b8_4a70_a585_65807c6fbd56.slice/crio-ad6bde4cfc95d966366f56bf33b36d505e88c315ece1173b5c657c95d08c3861 WatchSource:0}: Error finding container ad6bde4cfc95d966366f56bf33b36d505e88c315ece1173b5c657c95d08c3861: Status 404 returned error can't find the container with id ad6bde4cfc95d966366f56bf33b36d505e88c315ece1173b5c657c95d08c3861 Dec 04 22:33:44.909047 master-0 kubenswrapper[33572]: I1204 22:33:44.908967 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mk92t" event={"ID":"477cfefc-eaf1-4469-b639-74b14e997c6b","Type":"ContainerStarted","Data":"0c0ed96f9181880f38309da0b097be8bc396bc997a26ef4a2f3cb5cfc06238e5"} Dec 04 22:33:44.911821 master-0 kubenswrapper[33572]: I1204 22:33:44.911738 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3b00720a-09b8-4a70-a585-65807c6fbd56","Type":"ContainerStarted","Data":"ad6bde4cfc95d966366f56bf33b36d505e88c315ece1173b5c657c95d08c3861"} Dec 04 22:33:44.973568 master-0 kubenswrapper[33572]: I1204 22:33:44.973292 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-tnf6q"] Dec 04 22:33:44.974093 master-0 kubenswrapper[33572]: E1204 22:33:44.973929 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa144a4-7310-414a-a941-f9e08aa63084" containerName="init" Dec 04 22:33:44.974093 master-0 kubenswrapper[33572]: I1204 22:33:44.973947 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa144a4-7310-414a-a941-f9e08aa63084" containerName="init" Dec 04 22:33:44.974093 master-0 kubenswrapper[33572]: E1204 22:33:44.973977 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6051b2e6-2761-4a90-a2eb-8b437af545b5" containerName="init" Dec 04 22:33:44.974093 master-0 kubenswrapper[33572]: I1204 22:33:44.973986 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6051b2e6-2761-4a90-a2eb-8b437af545b5" containerName="init" Dec 04 22:33:44.974290 master-0 kubenswrapper[33572]: I1204 22:33:44.974263 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="6051b2e6-2761-4a90-a2eb-8b437af545b5" containerName="init" Dec 04 22:33:44.974338 master-0 kubenswrapper[33572]: I1204 22:33:44.974307 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa144a4-7310-414a-a941-f9e08aa63084" containerName="init" Dec 04 22:33:44.975818 master-0 kubenswrapper[33572]: I1204 22:33:44.975735 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:44.980592 master-0 kubenswrapper[33572]: I1204 22:33:44.980472 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 04 22:33:45.002189 master-0 kubenswrapper[33572]: I1204 22:33:45.002102 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tnf6q"] Dec 04 22:33:45.062564 master-0 kubenswrapper[33572]: I1204 22:33:45.062495 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e0220383-d3f8-4a3f-9923-570792aa3a13-ovs-rundir\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.062808 master-0 kubenswrapper[33572]: I1204 22:33:45.062598 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0220383-d3f8-4a3f-9923-570792aa3a13-config\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.062808 master-0 kubenswrapper[33572]: I1204 22:33:45.062686 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0220383-d3f8-4a3f-9923-570792aa3a13-combined-ca-bundle\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.062808 master-0 kubenswrapper[33572]: I1204 22:33:45.062760 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0220383-d3f8-4a3f-9923-570792aa3a13-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.062808 master-0 kubenswrapper[33572]: I1204 22:33:45.062786 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e0220383-d3f8-4a3f-9923-570792aa3a13-ovn-rundir\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.063001 master-0 kubenswrapper[33572]: I1204 22:33:45.062817 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqbb\" (UniqueName: \"kubernetes.io/projected/e0220383-d3f8-4a3f-9923-570792aa3a13-kube-api-access-xbqbb\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.164460 master-0 kubenswrapper[33572]: I1204 22:33:45.164344 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e0220383-d3f8-4a3f-9923-570792aa3a13-ovs-rundir\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.164460 master-0 kubenswrapper[33572]: I1204 22:33:45.164434 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e0220383-d3f8-4a3f-9923-570792aa3a13-config\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.164687 master-0 kubenswrapper[33572]: I1204 22:33:45.164540 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0220383-d3f8-4a3f-9923-570792aa3a13-combined-ca-bundle\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.164687 master-0 kubenswrapper[33572]: I1204 22:33:45.164620 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0220383-d3f8-4a3f-9923-570792aa3a13-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.164687 master-0 kubenswrapper[33572]: I1204 22:33:45.164663 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e0220383-d3f8-4a3f-9923-570792aa3a13-ovn-rundir\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.164778 master-0 kubenswrapper[33572]: I1204 22:33:45.164693 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqbb\" (UniqueName: \"kubernetes.io/projected/e0220383-d3f8-4a3f-9923-570792aa3a13-kube-api-access-xbqbb\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.165063 master-0 kubenswrapper[33572]: I1204 22:33:45.165029 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e0220383-d3f8-4a3f-9923-570792aa3a13-ovn-rundir\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.166515 master-0 kubenswrapper[33572]: I1204 22:33:45.166128 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e0220383-d3f8-4a3f-9923-570792aa3a13-ovs-rundir\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.166515 master-0 kubenswrapper[33572]: I1204 22:33:45.166249 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0220383-d3f8-4a3f-9923-570792aa3a13-config\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.169145 master-0 kubenswrapper[33572]: I1204 22:33:45.169114 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0220383-d3f8-4a3f-9923-570792aa3a13-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.171653 master-0 kubenswrapper[33572]: I1204 22:33:45.171617 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0220383-d3f8-4a3f-9923-570792aa3a13-combined-ca-bundle\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.183345 master-0 kubenswrapper[33572]: I1204 22:33:45.183293 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqbb\" (UniqueName: \"kubernetes.io/projected/e0220383-d3f8-4a3f-9923-570792aa3a13-kube-api-access-xbqbb\") pod \"ovn-controller-metrics-tnf6q\" (UID: \"e0220383-d3f8-4a3f-9923-570792aa3a13\") " pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.290382 master-0 kubenswrapper[33572]: I1204 22:33:45.290310 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-xxnr5"] Dec 04 22:33:45.305541 master-0 kubenswrapper[33572]: I1204 22:33:45.305008 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-tnf6q" Dec 04 22:33:45.339206 master-0 kubenswrapper[33572]: I1204 22:33:45.337124 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b448889f-cx6r6"] Dec 04 22:33:45.342077 master-0 kubenswrapper[33572]: I1204 22:33:45.342044 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.345212 master-0 kubenswrapper[33572]: I1204 22:33:45.344706 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 04 22:33:45.372728 master-0 kubenswrapper[33572]: I1204 22:33:45.370623 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-config\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.372728 master-0 kubenswrapper[33572]: I1204 22:33:45.370692 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-dns-svc\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.372728 master-0 kubenswrapper[33572]: I1204 22:33:45.371068 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48gb\" (UniqueName: \"kubernetes.io/projected/b449fa95-3750-4776-9c64-c2caf6d07fa5-kube-api-access-g48gb\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.372728 master-0 kubenswrapper[33572]: I1204 22:33:45.371332 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-ovsdbserver-sb\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.373631 master-0 kubenswrapper[33572]: I1204 22:33:45.373591 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b448889f-cx6r6"] Dec 04 22:33:45.476135 master-0 kubenswrapper[33572]: I1204 22:33:45.475868 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-config\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.476135 master-0 kubenswrapper[33572]: I1204 22:33:45.476016 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-dns-svc\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.476448 master-0 kubenswrapper[33572]: I1204 22:33:45.476385 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g48gb\" (UniqueName: \"kubernetes.io/projected/b449fa95-3750-4776-9c64-c2caf6d07fa5-kube-api-access-g48gb\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.477233 master-0 kubenswrapper[33572]: I1204 22:33:45.477187 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-ovsdbserver-sb\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.477233 master-0 kubenswrapper[33572]: I1204 22:33:45.477203 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-config\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.483647 master-0 kubenswrapper[33572]: I1204 22:33:45.483147 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-dns-svc\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.486678 master-0 kubenswrapper[33572]: I1204 22:33:45.486628 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-ovsdbserver-sb\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.490232 master-0 kubenswrapper[33572]: I1204 22:33:45.490178 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 22:33:45.502060 master-0 kubenswrapper[33572]: I1204 22:33:45.501725 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48gb\" (UniqueName: \"kubernetes.io/projected/b449fa95-3750-4776-9c64-c2caf6d07fa5-kube-api-access-g48gb\") pod \"dnsmasq-dns-54b448889f-cx6r6\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:45.679634 master-0 kubenswrapper[33572]: I1204 22:33:45.679545 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:46.945100 master-0 kubenswrapper[33572]: I1204 22:33:46.945038 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c","Type":"ContainerStarted","Data":"814428ed1131f63c3fb980581e986ab97f0a1e94efa4e70a13e98dc4b910c168"} Dec 04 22:33:46.950843 master-0 kubenswrapper[33572]: I1204 22:33:46.945212 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" podUID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" containerName="dnsmasq-dns" containerID="cri-o://8cb4753b03872ab047f2684fe1272bd0ea0c2912cf120d2e6afbfe21862c4afe" gracePeriod=10 Dec 04 22:33:47.174624 master-0 kubenswrapper[33572]: I1204 22:33:47.174472 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tnf6q"] Dec 04 22:33:47.300147 master-0 kubenswrapper[33572]: I1204 22:33:47.300081 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b448889f-cx6r6"] Dec 04 22:33:47.961937 master-0 kubenswrapper[33572]: I1204 22:33:47.961861 33572 generic.go:334] "Generic (PLEG): container finished" podID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" containerID="8cb4753b03872ab047f2684fe1272bd0ea0c2912cf120d2e6afbfe21862c4afe" exitCode=0 Dec 04 22:33:47.962927 master-0 kubenswrapper[33572]: I1204 22:33:47.961947 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" event={"ID":"13ad3678-93e9-4633-a3d8-0ca651d28bf2","Type":"ContainerDied","Data":"8cb4753b03872ab047f2684fe1272bd0ea0c2912cf120d2e6afbfe21862c4afe"} Dec 04 22:33:47.963903 master-0 kubenswrapper[33572]: I1204 22:33:47.963810 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"feb294eb-fff6-4a11-8e4c-b80bc0a64045","Type":"ContainerStarted","Data":"20cee4651b7e1066e9336a88812bb94c1d884a95bc85cc724b4dd6e67ce304fd"} Dec 04 22:33:47.964003 master-0 kubenswrapper[33572]: I1204 22:33:47.963960 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 04 22:33:47.987145 master-0 kubenswrapper[33572]: I1204 22:33:47.987021 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.843004913 podStartE2EDuration="21.987005859s" podCreationTimestamp="2025-12-04 22:33:26 +0000 UTC" firstStartedPulling="2025-12-04 22:33:42.631223048 +0000 UTC m=+886.358748697" lastFinishedPulling="2025-12-04 22:33:46.775223994 +0000 UTC m=+890.502749643" observedRunningTime="2025-12-04 22:33:47.986012992 +0000 UTC m=+891.713538651" watchObservedRunningTime="2025-12-04 22:33:47.987005859 +0000 UTC m=+891.714531508" Dec 04 22:33:48.979100 master-0 kubenswrapper[33572]: I1204 22:33:48.979041 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d1338e8-405c-4439-ae7d-02034960a5c5","Type":"ContainerStarted","Data":"4115e7db102073368d69ec1f7682282b2a553bc4eba90cce35c1dc2cb12172a7"} Dec 04 22:33:48.981671 master-0 kubenswrapper[33572]: I1204 22:33:48.981606 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"077ecfc2-ed81-4de5-993c-0f1084df9734","Type":"ContainerStarted","Data":"a3f0c1d136cea32b8f4e971882ddc52d40a18227ee3ad022f0524e308fee4383"} Dec 04 22:33:50.271414 master-0 kubenswrapper[33572]: W1204 22:33:50.271329 33572 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0220383_d3f8_4a3f_9923_570792aa3a13.slice/crio-c2b9e2f36d4442646b7fc6da7debe5b77bc0d1f2f7fb34019caf11eac2058abc WatchSource:0}: Error finding container c2b9e2f36d4442646b7fc6da7debe5b77bc0d1f2f7fb34019caf11eac2058abc: Status 404 returned error can't find the container with id c2b9e2f36d4442646b7fc6da7debe5b77bc0d1f2f7fb34019caf11eac2058abc Dec 04 22:33:50.319423 master-0 kubenswrapper[33572]: W1204 22:33:50.319359 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb449fa95_3750_4776_9c64_c2caf6d07fa5.slice/crio-f77015ef5194e3c9fe3e7c149bd8f9e2c4894d576e472c148e849c246c34863b WatchSource:0}: Error finding container f77015ef5194e3c9fe3e7c149bd8f9e2c4894d576e472c148e849c246c34863b: Status 404 returned error can't find the container with id f77015ef5194e3c9fe3e7c149bd8f9e2c4894d576e472c148e849c246c34863b Dec 04 22:33:50.849086 master-0 kubenswrapper[33572]: I1204 22:33:50.849041 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:50.996572 master-0 kubenswrapper[33572]: I1204 22:33:50.995685 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-config\") pod \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " Dec 04 22:33:50.996572 master-0 kubenswrapper[33572]: I1204 22:33:50.995886 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbsjj\" (UniqueName: \"kubernetes.io/projected/13ad3678-93e9-4633-a3d8-0ca651d28bf2-kube-api-access-lbsjj\") pod \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " Dec 04 22:33:50.996856 master-0 kubenswrapper[33572]: I1204 22:33:50.996629 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-dns-svc\") pod \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\" (UID: \"13ad3678-93e9-4633-a3d8-0ca651d28bf2\") " Dec 04 22:33:51.008993 master-0 kubenswrapper[33572]: I1204 22:33:51.008948 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ad3678-93e9-4633-a3d8-0ca651d28bf2-kube-api-access-lbsjj" (OuterVolumeSpecName: "kube-api-access-lbsjj") pod "13ad3678-93e9-4633-a3d8-0ca651d28bf2" (UID: "13ad3678-93e9-4633-a3d8-0ca651d28bf2"). InnerVolumeSpecName "kube-api-access-lbsjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:33:51.010140 master-0 kubenswrapper[33572]: I1204 22:33:51.010102 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" event={"ID":"13ad3678-93e9-4633-a3d8-0ca651d28bf2","Type":"ContainerDied","Data":"4ddf7e50a9f995f043ee25147d8524cdf1775071aef35fc7831f35cce747be5e"} Dec 04 22:33:51.010237 master-0 kubenswrapper[33572]: I1204 22:33:51.010159 33572 scope.go:117] "RemoveContainer" containerID="8cb4753b03872ab047f2684fe1272bd0ea0c2912cf120d2e6afbfe21862c4afe" Dec 04 22:33:51.010304 master-0 kubenswrapper[33572]: I1204 22:33:51.010281 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-xxnr5" Dec 04 22:33:51.014689 master-0 kubenswrapper[33572]: I1204 22:33:51.013737 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tnf6q" event={"ID":"e0220383-d3f8-4a3f-9923-570792aa3a13","Type":"ContainerStarted","Data":"c2b9e2f36d4442646b7fc6da7debe5b77bc0d1f2f7fb34019caf11eac2058abc"} Dec 04 22:33:51.015962 master-0 kubenswrapper[33572]: I1204 22:33:51.015642 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" event={"ID":"b449fa95-3750-4776-9c64-c2caf6d07fa5","Type":"ContainerStarted","Data":"f77015ef5194e3c9fe3e7c149bd8f9e2c4894d576e472c148e849c246c34863b"} Dec 04 22:33:51.064789 master-0 kubenswrapper[33572]: I1204 22:33:51.064664 33572 scope.go:117] "RemoveContainer" containerID="779e49b0391065149888853f93e62cfc4f49526580c8ae7920ab50b6735fe22d" Dec 04 22:33:51.098825 master-0 kubenswrapper[33572]: I1204 22:33:51.098788 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbsjj\" (UniqueName: \"kubernetes.io/projected/13ad3678-93e9-4633-a3d8-0ca651d28bf2-kube-api-access-lbsjj\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:51.122195 master-0 kubenswrapper[33572]: I1204 22:33:51.122138 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-config" (OuterVolumeSpecName: "config") pod "13ad3678-93e9-4633-a3d8-0ca651d28bf2" (UID: "13ad3678-93e9-4633-a3d8-0ca651d28bf2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:33:51.179229 master-0 kubenswrapper[33572]: I1204 22:33:51.179163 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13ad3678-93e9-4633-a3d8-0ca651d28bf2" (UID: "13ad3678-93e9-4633-a3d8-0ca651d28bf2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:33:51.203684 master-0 kubenswrapper[33572]: I1204 22:33:51.203152 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:51.203684 master-0 kubenswrapper[33572]: I1204 22:33:51.203191 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ad3678-93e9-4633-a3d8-0ca651d28bf2-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:51.376982 master-0 kubenswrapper[33572]: I1204 22:33:51.376869 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-xxnr5"] Dec 04 22:33:51.389653 master-0 kubenswrapper[33572]: I1204 22:33:51.389334 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-xxnr5"] Dec 04 22:33:51.679492 master-0 kubenswrapper[33572]: I1204 22:33:51.679447 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 04 22:33:52.056295 master-0 kubenswrapper[33572]: I1204 22:33:52.056143 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7" event={"ID":"b7512b9d-7adc-4af3-a29f-b154799338cb","Type":"ContainerStarted","Data":"ac78b21f127ab20a3c73fb8308344d6a16e18b6ee7ccd9f98154479c9201a1ac"} Dec 04 22:33:52.056690 master-0 kubenswrapper[33572]: I1204 22:33:52.056655 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rk7z7" Dec 04 22:33:52.060454 master-0 kubenswrapper[33572]: I1204 22:33:52.060342 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c","Type":"ContainerStarted","Data":"8c48edf1bf62d221581f8aa6991a138177df65a93a75955d2c6f60f7d47fe9ae"} Dec 04 22:33:52.062591 master-0 kubenswrapper[33572]: I1204 22:33:52.062434 33572 generic.go:334] "Generic (PLEG): container finished" podID="b449fa95-3750-4776-9c64-c2caf6d07fa5" containerID="adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b" exitCode=0 Dec 04 22:33:52.062591 master-0 kubenswrapper[33572]: I1204 22:33:52.062469 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" event={"ID":"b449fa95-3750-4776-9c64-c2caf6d07fa5","Type":"ContainerDied","Data":"adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b"} Dec 04 22:33:52.065995 master-0 kubenswrapper[33572]: I1204 22:33:52.064679 33572 generic.go:334] "Generic (PLEG): container finished" podID="477cfefc-eaf1-4469-b639-74b14e997c6b" containerID="97571f168415670e21430eea2e599e0d6eb6eb9195139841c57e0c1964c383fb" exitCode=0 Dec 04 22:33:52.065995 master-0 kubenswrapper[33572]: I1204 22:33:52.064753 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mk92t" event={"ID":"477cfefc-eaf1-4469-b639-74b14e997c6b","Type":"ContainerDied","Data":"97571f168415670e21430eea2e599e0d6eb6eb9195139841c57e0c1964c383fb"} Dec 04 22:33:52.066755 master-0 kubenswrapper[33572]: I1204 22:33:52.066730 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3b00720a-09b8-4a70-a585-65807c6fbd56","Type":"ContainerStarted","Data":"cff0092e493a7e67cca46f7fc9574a90be30130a98e24d539a9bff29284febd9"} Dec 04 22:33:52.070514 master-0 kubenswrapper[33572]: I1204 22:33:52.069387 33572 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5f0454e-05e4-4510-bbf1-b273079b0f1d","Type":"ContainerStarted","Data":"fd4848ab8331061910e1eab3d07787435b6c32ae1274d7cda48b15d917557f7c"} Dec 04 22:33:52.080529 master-0 kubenswrapper[33572]: I1204 22:33:52.077123 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a95bf966-679b-4bad-8693-f1becf39685d","Type":"ContainerStarted","Data":"8d338d7d98eedb4ec1f67820cfabb507aa5e884f98c4801230cab93d45df8002"} Dec 04 22:33:52.080529 master-0 kubenswrapper[33572]: I1204 22:33:52.077584 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rk7z7" podStartSLOduration=10.90810541 podStartE2EDuration="18.077565582s" podCreationTimestamp="2025-12-04 22:33:34 +0000 UTC" firstStartedPulling="2025-12-04 22:33:43.658436394 +0000 UTC m=+887.385962043" lastFinishedPulling="2025-12-04 22:33:50.827896546 +0000 UTC m=+894.555422215" observedRunningTime="2025-12-04 22:33:52.073583192 +0000 UTC m=+895.801108851" watchObservedRunningTime="2025-12-04 22:33:52.077565582 +0000 UTC m=+895.805091231" Dec 04 22:33:52.435753 master-0 kubenswrapper[33572]: I1204 22:33:52.435707 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:33:52.540757 master-0 kubenswrapper[33572]: I1204 22:33:52.540692 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" path="/var/lib/kubelet/pods/13ad3678-93e9-4633-a3d8-0ca651d28bf2/volumes" Dec 04 22:33:53.089994 master-0 kubenswrapper[33572]: I1204 22:33:53.089916 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mk92t" event={"ID":"477cfefc-eaf1-4469-b639-74b14e997c6b","Type":"ContainerStarted","Data":"653f8efeabb9433800cad8893357f4ae353758573aeb69235b0160583ed03042"} Dec 04 22:33:54.112277 master-0 kubenswrapper[33572]: I1204 22:33:54.112214 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" event={"ID":"b449fa95-3750-4776-9c64-c2caf6d07fa5","Type":"ContainerStarted","Data":"850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b"} Dec 04 22:33:54.114469 master-0 kubenswrapper[33572]: I1204 22:33:54.113254 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:54.141704 master-0 kubenswrapper[33572]: I1204 22:33:54.139869 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" podStartSLOduration=9.13984534 podStartE2EDuration="9.13984534s" podCreationTimestamp="2025-12-04 22:33:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:33:54.13695214 +0000 UTC m=+897.864477799" watchObservedRunningTime="2025-12-04 22:33:54.13984534 +0000 UTC m=+897.867371019" Dec 04 22:33:55.128957 master-0 kubenswrapper[33572]: I1204 22:33:55.128898 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tnf6q" event={"ID":"e0220383-d3f8-4a3f-9923-570792aa3a13","Type":"ContainerStarted","Data":"3f0ac7b0b790baed6976ab91596cff425e3c3220e9566b1c86f703a2d2d3fc83"} Dec 04 22:33:55.133188 master-0 kubenswrapper[33572]: I1204 22:33:55.133152 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"bd01e232-2fcf-4ca1-9859-e6f6d7f8b68c","Type":"ContainerStarted","Data":"f335221508288ab40517cfd65006d1cb2d6f5896cd43ee5c05d379395913df11"} Dec 04 22:33:55.136527 master-0 kubenswrapper[33572]: I1204 22:33:55.136471 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mk92t" event={"ID":"477cfefc-eaf1-4469-b639-74b14e997c6b","Type":"ContainerStarted","Data":"52c153e394cb1a3392c0ef0060d2864dcf199d1610b03adc4300efbb9a12da99"} Dec 04 22:33:55.136746 master-0 kubenswrapper[33572]: I1204 22:33:55.136686 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:55.136785 master-0 kubenswrapper[33572]: I1204 22:33:55.136771 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:33:55.139523 master-0 kubenswrapper[33572]: I1204 22:33:55.139436 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3b00720a-09b8-4a70-a585-65807c6fbd56","Type":"ContainerStarted","Data":"a8878b7680f3a5bfc5ec37a579b0759ad83925866750148d8af310ebbde9965d"} Dec 04 22:33:55.169036 master-0 kubenswrapper[33572]: I1204 22:33:55.168951 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-tnf6q" podStartSLOduration=7.433720762 podStartE2EDuration="11.168924677s" podCreationTimestamp="2025-12-04 22:33:44 +0000 UTC" firstStartedPulling="2025-12-04 22:33:50.309146896 +0000 UTC m=+894.036672565" lastFinishedPulling="2025-12-04 22:33:54.044350791 +0000 UTC m=+897.771876480" observedRunningTime="2025-12-04 22:33:55.152413949 +0000 UTC m=+898.879939618" watchObservedRunningTime="2025-12-04 22:33:55.168924677 +0000 UTC m=+898.896450326" Dec 04 22:33:55.178627 master-0 kubenswrapper[33572]: I1204 22:33:55.177995 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.0000339 podStartE2EDuration="21.177969918s" podCreationTimestamp="2025-12-04 22:33:34 +0000 UTC" firstStartedPulling="2025-12-04 22:33:44.854427211 +0000 UTC m=+888.581952860" lastFinishedPulling="2025-12-04 22:33:54.032363229 +0000 UTC m=+897.759888878" observedRunningTime="2025-12-04 22:33:55.176960849 +0000 UTC m=+898.904486538" watchObservedRunningTime="2025-12-04 22:33:55.177969918 +0000 UTC m=+898.905495577" Dec 04 22:33:55.223712 master-0 kubenswrapper[33572]: I1204 22:33:55.223608 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-mk92t" podStartSLOduration=14.270266877 podStartE2EDuration="21.223585063s" podCreationTimestamp="2025-12-04 22:33:34 +0000 UTC" firstStartedPulling="2025-12-04 22:33:43.874375644 +0000 UTC m=+887.601901293" lastFinishedPulling="2025-12-04 22:33:50.82769383 +0000 UTC m=+894.555219479" observedRunningTime="2025-12-04 22:33:55.215719245 +0000 UTC m=+898.943244974" watchObservedRunningTime="2025-12-04 22:33:55.223585063 +0000 UTC m=+898.951110722" Dec 04 22:33:55.253309 master-0 kubenswrapper[33572]: I1204 22:33:55.253179 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.886445361 podStartE2EDuration="21.253146704s" podCreationTimestamp="2025-12-04 22:33:34 +0000 UTC" firstStartedPulling="2025-12-04 22:33:46.678248684 +0000 UTC m=+890.405774323" lastFinishedPulling="2025-12-04 22:33:54.044949977 +0000 UTC 
m=+897.772475666" observedRunningTime="2025-12-04 22:33:55.245423749 +0000 UTC m=+898.972949418" watchObservedRunningTime="2025-12-04 22:33:55.253146704 +0000 UTC m=+898.980672393" Dec 04 22:33:55.530524 master-0 kubenswrapper[33572]: I1204 22:33:55.528748 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b448889f-cx6r6"] Dec 04 22:33:55.597760 master-0 kubenswrapper[33572]: I1204 22:33:55.597342 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57545c5d5f-szxns"] Dec 04 22:33:55.599523 master-0 kubenswrapper[33572]: E1204 22:33:55.598222 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" containerName="init" Dec 04 22:33:55.599523 master-0 kubenswrapper[33572]: I1204 22:33:55.598244 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" containerName="init" Dec 04 22:33:55.599523 master-0 kubenswrapper[33572]: E1204 22:33:55.598264 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" containerName="dnsmasq-dns" Dec 04 22:33:55.599523 master-0 kubenswrapper[33572]: I1204 22:33:55.598270 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" containerName="dnsmasq-dns" Dec 04 22:33:55.599523 master-0 kubenswrapper[33572]: I1204 22:33:55.598463 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ad3678-93e9-4633-a3d8-0ca651d28bf2" containerName="dnsmasq-dns" Dec 04 22:33:55.599773 master-0 kubenswrapper[33572]: I1204 22:33:55.599627 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.607530 master-0 kubenswrapper[33572]: I1204 22:33:55.603158 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 04 22:33:55.611059 master-0 kubenswrapper[33572]: I1204 22:33:55.608793 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57545c5d5f-szxns"] Dec 04 22:33:55.739250 master-0 kubenswrapper[33572]: I1204 22:33:55.739181 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-nb\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.739522 master-0 kubenswrapper[33572]: I1204 22:33:55.739365 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7lpg\" (UniqueName: \"kubernetes.io/projected/d5a1b01b-53d3-4689-8713-170466122740-kube-api-access-b7lpg\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.739654 master-0 kubenswrapper[33572]: I1204 22:33:55.739587 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-config\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.739825 master-0 kubenswrapper[33572]: I1204 22:33:55.739793 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-sb\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.740068 master-0 kubenswrapper[33572]: I1204 22:33:55.740029 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-dns-svc\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.841437 master-0 kubenswrapper[33572]: I1204 22:33:55.841302 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-nb\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.841437 master-0 kubenswrapper[33572]: I1204 22:33:55.841368 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7lpg\" (UniqueName: \"kubernetes.io/projected/d5a1b01b-53d3-4689-8713-170466122740-kube-api-access-b7lpg\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.841714 master-0 kubenswrapper[33572]: I1204 22:33:55.841542 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-config\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.841714 master-0 kubenswrapper[33572]: I1204 22:33:55.841604 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-sb\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.841714 master-0 kubenswrapper[33572]: I1204 22:33:55.841666 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-dns-svc\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.842881 master-0 kubenswrapper[33572]: I1204 22:33:55.842840 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-config\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.843036 master-0 kubenswrapper[33572]: I1204 22:33:55.843000 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-sb\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.843101 master-0 kubenswrapper[33572]: I1204 22:33:55.843000 33572 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-dns-svc\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.843277 master-0 kubenswrapper[33572]: I1204 22:33:55.843229 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-nb\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.857644 master-0 kubenswrapper[33572]: I1204 22:33:55.857581 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7lpg\" (UniqueName: \"kubernetes.io/projected/d5a1b01b-53d3-4689-8713-170466122740-kube-api-access-b7lpg\") pod \"dnsmasq-dns-57545c5d5f-szxns\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:55.915754 master-0 kubenswrapper[33572]: I1204 22:33:55.915645 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:56.177374 master-0 kubenswrapper[33572]: I1204 22:33:56.177250 33572 generic.go:334] "Generic (PLEG): container finished" podID="f5f0454e-05e4-4510-bbf1-b273079b0f1d" containerID="fd4848ab8331061910e1eab3d07787435b6c32ae1274d7cda48b15d917557f7c" exitCode=0 Dec 04 22:33:56.177990 master-0 kubenswrapper[33572]: I1204 22:33:56.177331 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5f0454e-05e4-4510-bbf1-b273079b0f1d","Type":"ContainerDied","Data":"fd4848ab8331061910e1eab3d07787435b6c32ae1274d7cda48b15d917557f7c"} Dec 04 22:33:56.189748 master-0 kubenswrapper[33572]: I1204 22:33:56.189687 33572 generic.go:334] "Generic (PLEG): container finished" podID="a95bf966-679b-4bad-8693-f1becf39685d" containerID="8d338d7d98eedb4ec1f67820cfabb507aa5e884f98c4801230cab93d45df8002" exitCode=0 Dec 04 22:33:56.189993 master-0 kubenswrapper[33572]: I1204 22:33:56.189930 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a95bf966-679b-4bad-8693-f1becf39685d","Type":"ContainerDied","Data":"8d338d7d98eedb4ec1f67820cfabb507aa5e884f98c4801230cab93d45df8002"} Dec 04 22:33:56.424122 master-0 kubenswrapper[33572]: I1204 22:33:56.424055 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:56.438732 master-0 kubenswrapper[33572]: I1204 22:33:56.436765 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57545c5d5f-szxns"] Dec 04 22:33:56.439685 master-0 kubenswrapper[33572]: W1204 22:33:56.439646 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5a1b01b_53d3_4689_8713_170466122740.slice/crio-00e4abdf0126f856691bc8261fa9c3f9fdc5b1c24ae98f514e9e98f1bba69f82 WatchSource:0}: Error finding container 00e4abdf0126f856691bc8261fa9c3f9fdc5b1c24ae98f514e9e98f1bba69f82: Status 404 returned error can't find the container with id 00e4abdf0126f856691bc8261fa9c3f9fdc5b1c24ae98f514e9e98f1bba69f82 Dec 04 22:33:56.473367 master-0 kubenswrapper[33572]: I1204 22:33:56.473298 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" 
Dec 04 22:33:57.201180 master-0 kubenswrapper[33572]: I1204 22:33:57.201112 33572 generic.go:334] "Generic (PLEG): container finished" podID="d5a1b01b-53d3-4689-8713-170466122740" containerID="01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6" exitCode=0 Dec 04 22:33:57.201750 master-0 kubenswrapper[33572]: I1204 22:33:57.201191 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" event={"ID":"d5a1b01b-53d3-4689-8713-170466122740","Type":"ContainerDied","Data":"01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6"} Dec 04 22:33:57.201750 master-0 kubenswrapper[33572]: I1204 22:33:57.201220 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" event={"ID":"d5a1b01b-53d3-4689-8713-170466122740","Type":"ContainerStarted","Data":"00e4abdf0126f856691bc8261fa9c3f9fdc5b1c24ae98f514e9e98f1bba69f82"} Dec 04 22:33:57.204890 master-0 kubenswrapper[33572]: I1204 22:33:57.204845 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5f0454e-05e4-4510-bbf1-b273079b0f1d","Type":"ContainerStarted","Data":"e19cbc3ca934c83f022346076701c04429e01b3081501b5d01b1309876ffa56c"} Dec 04 22:33:57.206919 master-0 kubenswrapper[33572]: I1204 22:33:57.206832 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a95bf966-679b-4bad-8693-f1becf39685d","Type":"ContainerStarted","Data":"1dbc9aa5185c6dc039f4d168268a03775688e78c5d98b0e70e1b64288a6afc8c"} Dec 04 22:33:57.207009 master-0 kubenswrapper[33572]: I1204 22:33:57.206967 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" podUID="b449fa95-3750-4776-9c64-c2caf6d07fa5" containerName="dnsmasq-dns" containerID="cri-o://850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b" gracePeriod=10 Dec 04 22:33:57.207357 master-0 kubenswrapper[33572]: I1204 22:33:57.207316 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:57.263933 master-0 kubenswrapper[33572]: I1204 22:33:57.263823 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.112107279 podStartE2EDuration="33.263801159s" podCreationTimestamp="2025-12-04 22:33:24 +0000 UTC" firstStartedPulling="2025-12-04 22:33:43.677246425 +0000 UTC m=+887.404772074" lastFinishedPulling="2025-12-04 22:33:50.828940305 +0000 UTC m=+894.556465954" observedRunningTime="2025-12-04 22:33:57.259795378 +0000 UTC m=+900.987321027" watchObservedRunningTime="2025-12-04 22:33:57.263801159 +0000 UTC m=+900.991326808" Dec 04 22:33:57.274486 master-0 kubenswrapper[33572]: I1204 22:33:57.272911 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 04 22:33:57.283897 master-0 kubenswrapper[33572]: I1204 22:33:57.283834 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.14169013 podStartE2EDuration="34.283815995s" podCreationTimestamp="2025-12-04 22:33:23 +0000 UTC" firstStartedPulling="2025-12-04 22:33:43.687471069 +0000 UTC m=+887.414996718" lastFinishedPulling="2025-12-04 22:33:50.829596933 +0000 UTC m=+894.557122583" observedRunningTime="2025-12-04 22:33:57.279089264 +0000 UTC m=+901.006614913" watchObservedRunningTime="2025-12-04 22:33:57.283815995 +0000 UTC 
m=+901.011341644" Dec 04 22:33:57.468480 master-0 kubenswrapper[33572]: E1204 22:33:57.468337 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb449fa95_3750_4776_9c64_c2caf6d07fa5.slice/crio-850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:33:57.695113 master-0 kubenswrapper[33572]: I1204 22:33:57.695000 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:57.813528 master-0 kubenswrapper[33572]: I1204 22:33:57.813419 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-config\") pod \"b449fa95-3750-4776-9c64-c2caf6d07fa5\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " Dec 04 22:33:57.813852 master-0 kubenswrapper[33572]: I1204 22:33:57.813608 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g48gb\" (UniqueName: \"kubernetes.io/projected/b449fa95-3750-4776-9c64-c2caf6d07fa5-kube-api-access-g48gb\") pod \"b449fa95-3750-4776-9c64-c2caf6d07fa5\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " Dec 04 22:33:57.813852 master-0 kubenswrapper[33572]: I1204 22:33:57.813679 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-ovsdbserver-sb\") pod \"b449fa95-3750-4776-9c64-c2caf6d07fa5\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " Dec 04 22:33:57.813852 master-0 kubenswrapper[33572]: I1204 22:33:57.813740 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-dns-svc\") pod \"b449fa95-3750-4776-9c64-c2caf6d07fa5\" (UID: \"b449fa95-3750-4776-9c64-c2caf6d07fa5\") " Dec 04 22:33:57.818636 master-0 kubenswrapper[33572]: I1204 22:33:57.818592 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b449fa95-3750-4776-9c64-c2caf6d07fa5-kube-api-access-g48gb" (OuterVolumeSpecName: "kube-api-access-g48gb") pod "b449fa95-3750-4776-9c64-c2caf6d07fa5" (UID: "b449fa95-3750-4776-9c64-c2caf6d07fa5"). InnerVolumeSpecName "kube-api-access-g48gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:33:57.865994 master-0 kubenswrapper[33572]: I1204 22:33:57.865928 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b449fa95-3750-4776-9c64-c2caf6d07fa5" (UID: "b449fa95-3750-4776-9c64-c2caf6d07fa5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:33:57.869373 master-0 kubenswrapper[33572]: I1204 22:33:57.869292 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b449fa95-3750-4776-9c64-c2caf6d07fa5" (UID: "b449fa95-3750-4776-9c64-c2caf6d07fa5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:33:57.903908 master-0 kubenswrapper[33572]: I1204 22:33:57.897700 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-config" (OuterVolumeSpecName: "config") pod "b449fa95-3750-4776-9c64-c2caf6d07fa5" (UID: "b449fa95-3750-4776-9c64-c2caf6d07fa5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:33:57.915278 master-0 kubenswrapper[33572]: I1204 22:33:57.915209 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:57.917058 master-0 kubenswrapper[33572]: I1204 22:33:57.916992 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:57.917058 master-0 kubenswrapper[33572]: I1204 22:33:57.917053 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g48gb\" (UniqueName: \"kubernetes.io/projected/b449fa95-3750-4776-9c64-c2caf6d07fa5-kube-api-access-g48gb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:57.917153 master-0 kubenswrapper[33572]: I1204 22:33:57.917072 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:57.917153 master-0 kubenswrapper[33572]: I1204 22:33:57.917093 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b449fa95-3750-4776-9c64-c2caf6d07fa5-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:33:57.957589 master-0 kubenswrapper[33572]: I1204 22:33:57.957524 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:58.222176 master-0 kubenswrapper[33572]: I1204 22:33:58.221643 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" event={"ID":"d5a1b01b-53d3-4689-8713-170466122740","Type":"ContainerStarted","Data":"1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8"} Dec 04 22:33:58.222176 master-0 kubenswrapper[33572]: I1204 22:33:58.221797 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:33:58.224251 master-0 kubenswrapper[33572]: I1204 22:33:58.224221 33572 generic.go:334] "Generic (PLEG): container finished" podID="b449fa95-3750-4776-9c64-c2caf6d07fa5" containerID="850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b" exitCode=0 Dec 04 22:33:58.224337 master-0 kubenswrapper[33572]: I1204 22:33:58.224273 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" Dec 04 22:33:58.224390 master-0 kubenswrapper[33572]: I1204 22:33:58.224326 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" event={"ID":"b449fa95-3750-4776-9c64-c2caf6d07fa5","Type":"ContainerDied","Data":"850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b"} Dec 04 22:33:58.224441 master-0 kubenswrapper[33572]: I1204 22:33:58.224409 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b448889f-cx6r6" event={"ID":"b449fa95-3750-4776-9c64-c2caf6d07fa5","Type":"ContainerDied","Data":"f77015ef5194e3c9fe3e7c149bd8f9e2c4894d576e472c148e849c246c34863b"} Dec 04 22:33:58.224490 master-0 kubenswrapper[33572]: I1204 22:33:58.224443 33572 scope.go:117] "RemoveContainer" containerID="850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b" Dec 04 22:33:58.224845 master-0 kubenswrapper[33572]: I1204 22:33:58.224811 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:58.253105 master-0 kubenswrapper[33572]: I1204 22:33:58.253015 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" podStartSLOduration=3.252995329 podStartE2EDuration="3.252995329s" podCreationTimestamp="2025-12-04 22:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:33:58.246362826 +0000 UTC m=+901.973888475" watchObservedRunningTime="2025-12-04 22:33:58.252995329 +0000 UTC m=+901.980520978" Dec 04 22:33:58.255668 master-0 kubenswrapper[33572]: I1204 22:33:58.255618 33572 scope.go:117] "RemoveContainer" containerID="adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b" Dec 04 22:33:58.274162 master-0 kubenswrapper[33572]: I1204 22:33:58.274121 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b448889f-cx6r6"] Dec 04 22:33:58.274321 master-0 kubenswrapper[33572]: I1204 22:33:58.274197 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 04 22:33:58.283344 master-0 kubenswrapper[33572]: I1204 22:33:58.283273 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b448889f-cx6r6"] Dec 04 22:33:58.315285 master-0 kubenswrapper[33572]: I1204 22:33:58.315225 33572 scope.go:117] "RemoveContainer" containerID="850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b" Dec 04 22:33:58.315786 master-0 kubenswrapper[33572]: E1204 22:33:58.315748 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b\": container with ID starting with 850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b not found: ID does not exist" containerID="850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b" Dec 04 22:33:58.315887 master-0 kubenswrapper[33572]: I1204 22:33:58.315780 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b"} err="failed to get container status \"850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b\": rpc error: code = NotFound desc = could not find container 
\"850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b\": container with ID starting with 850a0d98bd8e2f33a3d459dcd1efe29799cb6f62fabdd02811ed430c50e29f3b not found: ID does not exist" Dec 04 22:33:58.315887 master-0 kubenswrapper[33572]: I1204 22:33:58.315802 33572 scope.go:117] "RemoveContainer" containerID="adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b" Dec 04 22:33:58.316171 master-0 kubenswrapper[33572]: E1204 22:33:58.316147 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b\": container with ID starting with adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b not found: ID does not exist" containerID="adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b" Dec 04 22:33:58.316234 master-0 kubenswrapper[33572]: I1204 22:33:58.316177 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b"} err="failed to get container status \"adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b\": rpc error: code = NotFound desc = could not find container \"adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b\": container with ID starting with adc82813582267f9a717ad59aa007ccf526cde965a0aa5f34449cfd78f3b213b not found: ID does not exist" Dec 04 22:33:58.546380 master-0 kubenswrapper[33572]: I1204 22:33:58.546314 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b449fa95-3750-4776-9c64-c2caf6d07fa5" path="/var/lib/kubelet/pods/b449fa95-3750-4776-9c64-c2caf6d07fa5/volumes" Dec 04 22:33:58.557299 master-0 kubenswrapper[33572]: I1204 22:33:58.556396 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57545c5d5f-szxns"] Dec 04 22:33:58.596522 master-0 kubenswrapper[33572]: I1204 22:33:58.595029 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5789dc4cf-2rbv6"] Dec 04 22:33:58.596522 master-0 kubenswrapper[33572]: E1204 22:33:58.595596 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b449fa95-3750-4776-9c64-c2caf6d07fa5" containerName="dnsmasq-dns" Dec 04 22:33:58.596522 master-0 kubenswrapper[33572]: I1204 22:33:58.595613 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b449fa95-3750-4776-9c64-c2caf6d07fa5" containerName="dnsmasq-dns" Dec 04 22:33:58.596522 master-0 kubenswrapper[33572]: E1204 22:33:58.595642 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b449fa95-3750-4776-9c64-c2caf6d07fa5" containerName="init" Dec 04 22:33:58.596522 master-0 kubenswrapper[33572]: I1204 22:33:58.595650 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b449fa95-3750-4776-9c64-c2caf6d07fa5" containerName="init" Dec 04 22:33:58.596522 master-0 kubenswrapper[33572]: I1204 22:33:58.595926 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b449fa95-3750-4776-9c64-c2caf6d07fa5" containerName="dnsmasq-dns" Dec 04 22:33:58.611527 master-0 kubenswrapper[33572]: I1204 22:33:58.609294 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.649100 master-0 kubenswrapper[33572]: I1204 22:33:58.648764 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5789dc4cf-2rbv6"] Dec 04 22:33:58.751514 master-0 kubenswrapper[33572]: I1204 22:33:58.748774 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-nb\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.751514 master-0 kubenswrapper[33572]: I1204 22:33:58.748850 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-dns-svc\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.751514 master-0 kubenswrapper[33572]: I1204 22:33:58.750237 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kww6g\" (UniqueName: \"kubernetes.io/projected/0797bdd1-388c-4686-982d-ce11deae84c9-kube-api-access-kww6g\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.751514 master-0 kubenswrapper[33572]: I1204 22:33:58.750346 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-config\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.751514 master-0 kubenswrapper[33572]: I1204 22:33:58.750403 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-sb\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.759094 master-0 kubenswrapper[33572]: I1204 22:33:58.759023 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 04 22:33:58.760887 master-0 kubenswrapper[33572]: I1204 22:33:58.760851 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 22:33:58.763067 master-0 kubenswrapper[33572]: I1204 22:33:58.763014 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 04 22:33:58.763162 master-0 kubenswrapper[33572]: I1204 22:33:58.763146 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 04 22:33:58.763262 master-0 kubenswrapper[33572]: I1204 22:33:58.763241 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 04 22:33:58.767837 master-0 kubenswrapper[33572]: I1204 22:33:58.767797 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 22:33:58.852256 master-0 kubenswrapper[33572]: I1204 22:33:58.852137 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddb5e10-d309-4f90-be98-ce86057661f1-config\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.852256 master-0 kubenswrapper[33572]: I1204 22:33:58.852194 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.852483 master-0 kubenswrapper[33572]: I1204 22:33:58.852260 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6v5v\" (UniqueName: \"kubernetes.io/projected/1ddb5e10-d309-4f90-be98-ce86057661f1-kube-api-access-g6v5v\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.852483 master-0 kubenswrapper[33572]: I1204 22:33:58.852282 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ddb5e10-d309-4f90-be98-ce86057661f1-scripts\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.852483 master-0 kubenswrapper[33572]: I1204 22:33:58.852312 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.852483 master-0 kubenswrapper[33572]: I1204 22:33:58.852379 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-nb\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.852483 master-0 kubenswrapper[33572]: I1204 22:33:58.852408 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.853986 master-0 
kubenswrapper[33572]: I1204 22:33:58.852486 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-dns-svc\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.853986 master-0 kubenswrapper[33572]: I1204 22:33:58.852531 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ddb5e10-d309-4f90-be98-ce86057661f1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.853986 master-0 kubenswrapper[33572]: I1204 22:33:58.852575 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kww6g\" (UniqueName: \"kubernetes.io/projected/0797bdd1-388c-4686-982d-ce11deae84c9-kube-api-access-kww6g\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.853986 master-0 kubenswrapper[33572]: I1204 22:33:58.852623 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-config\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.853986 master-0 kubenswrapper[33572]: I1204 22:33:58.852653 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-sb\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.853986 master-0 kubenswrapper[33572]: I1204 22:33:58.853827 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-sb\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.854400 master-0 kubenswrapper[33572]: I1204 22:33:58.854378 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-nb\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.857555 master-0 kubenswrapper[33572]: I1204 22:33:58.855063 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-dns-svc\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.857555 master-0 kubenswrapper[33572]: I1204 22:33:58.855620 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-config\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.871701 master-0 kubenswrapper[33572]: I1204 
22:33:58.871661 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kww6g\" (UniqueName: \"kubernetes.io/projected/0797bdd1-388c-4686-982d-ce11deae84c9-kube-api-access-kww6g\") pod \"dnsmasq-dns-5789dc4cf-2rbv6\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:58.953844 master-0 kubenswrapper[33572]: I1204 22:33:58.953777 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ddb5e10-d309-4f90-be98-ce86057661f1-scripts\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.953844 master-0 kubenswrapper[33572]: I1204 22:33:58.953843 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.954248 master-0 kubenswrapper[33572]: I1204 22:33:58.953906 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.954248 master-0 kubenswrapper[33572]: I1204 22:33:58.953938 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ddb5e10-d309-4f90-be98-ce86057661f1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.954579 master-0 kubenswrapper[33572]: I1204 22:33:58.954521 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddb5e10-d309-4f90-be98-ce86057661f1-config\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.954698 master-0 kubenswrapper[33572]: I1204 22:33:58.954611 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.954698 master-0 kubenswrapper[33572]: I1204 22:33:58.954626 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ddb5e10-d309-4f90-be98-ce86057661f1-scripts\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.954834 master-0 kubenswrapper[33572]: I1204 22:33:58.954788 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6v5v\" (UniqueName: \"kubernetes.io/projected/1ddb5e10-d309-4f90-be98-ce86057661f1-kube-api-access-g6v5v\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.954834 master-0 kubenswrapper[33572]: I1204 22:33:58.954800 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1ddb5e10-d309-4f90-be98-ce86057661f1-ovn-rundir\") 
pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.955369 master-0 kubenswrapper[33572]: I1204 22:33:58.955335 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddb5e10-d309-4f90-be98-ce86057661f1-config\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.958337 master-0 kubenswrapper[33572]: I1204 22:33:58.957906 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.958455 master-0 kubenswrapper[33572]: I1204 22:33:58.958360 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.958609 master-0 kubenswrapper[33572]: I1204 22:33:58.958560 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddb5e10-d309-4f90-be98-ce86057661f1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.972927 master-0 kubenswrapper[33572]: I1204 22:33:58.972868 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6v5v\" (UniqueName: \"kubernetes.io/projected/1ddb5e10-d309-4f90-be98-ce86057661f1-kube-api-access-g6v5v\") pod \"ovn-northd-0\" (UID: \"1ddb5e10-d309-4f90-be98-ce86057661f1\") " pod="openstack/ovn-northd-0" Dec 04 22:33:58.994458 master-0 kubenswrapper[33572]: I1204 22:33:58.994393 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:33:59.082051 master-0 kubenswrapper[33572]: I1204 22:33:59.081990 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 22:33:59.612099 master-0 kubenswrapper[33572]: I1204 22:33:59.612049 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5789dc4cf-2rbv6"] Dec 04 22:33:59.665838 master-0 kubenswrapper[33572]: I1204 22:33:59.665783 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 22:33:59.666670 master-0 kubenswrapper[33572]: W1204 22:33:59.666632 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ddb5e10_d309_4f90_be98_ce86057661f1.slice/crio-9e449a94909b1a3560b323e0fc51a7e1abfad4d2766edf5906ace771364688cb WatchSource:0}: Error finding container 9e449a94909b1a3560b323e0fc51a7e1abfad4d2766edf5906ace771364688cb: Status 404 returned error can't find the container with id 9e449a94909b1a3560b323e0fc51a7e1abfad4d2766edf5906ace771364688cb Dec 04 22:33:59.671595 master-0 kubenswrapper[33572]: E1204 22:33:59.671344 33572 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.32.10:33474->192.168.32.10:37473: read tcp 192.168.32.10:33474->192.168.32.10:37473: read: connection reset by peer Dec 04 22:34:00.263058 master-0 kubenswrapper[33572]: I1204 22:34:00.262843 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1ddb5e10-d309-4f90-be98-ce86057661f1","Type":"ContainerStarted","Data":"9e449a94909b1a3560b323e0fc51a7e1abfad4d2766edf5906ace771364688cb"} Dec 04 22:34:00.266915 master-0 kubenswrapper[33572]: I1204 22:34:00.266839 33572 generic.go:334] "Generic (PLEG): container finished" podID="0797bdd1-388c-4686-982d-ce11deae84c9" containerID="ee07aae4310cb02d674cb636221f915e8a92620e0383a42c6e782c8c34272891" exitCode=0 Dec 04 22:34:00.267323 master-0 kubenswrapper[33572]: I1204 22:34:00.267135 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" podUID="d5a1b01b-53d3-4689-8713-170466122740" containerName="dnsmasq-dns" containerID="cri-o://1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8" gracePeriod=10 Dec 04 22:34:00.268697 master-0 kubenswrapper[33572]: I1204 22:34:00.268637 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" event={"ID":"0797bdd1-388c-4686-982d-ce11deae84c9","Type":"ContainerDied","Data":"ee07aae4310cb02d674cb636221f915e8a92620e0383a42c6e782c8c34272891"} Dec 04 22:34:00.268839 master-0 kubenswrapper[33572]: I1204 22:34:00.268724 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" event={"ID":"0797bdd1-388c-4686-982d-ce11deae84c9","Type":"ContainerStarted","Data":"438ca4995d6d21e65fd08cf2b07037f869568400e627146f43547dc9656630ce"} Dec 04 22:34:00.602290 master-0 kubenswrapper[33572]: I1204 22:34:00.602197 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 04 22:34:00.612994 master-0 kubenswrapper[33572]: I1204 22:34:00.612921 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 22:34:00.615265 master-0 kubenswrapper[33572]: I1204 22:34:00.615170 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 04 22:34:00.617958 master-0 kubenswrapper[33572]: I1204 22:34:00.616207 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 04 22:34:00.617958 master-0 kubenswrapper[33572]: I1204 22:34:00.616424 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 04 22:34:00.750622 master-0 kubenswrapper[33572]: I1204 22:34:00.749719 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 22:34:00.793071 master-0 kubenswrapper[33572]: I1204 22:34:00.792967 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 04 22:34:00.793677 master-0 kubenswrapper[33572]: I1204 22:34:00.793628 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 04 22:34:01.045402 master-0 kubenswrapper[33572]: I1204 22:34:01.045279 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:34:01.121527 master-0 kubenswrapper[33572]: I1204 22:34:01.120849 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7lpg\" (UniqueName: \"kubernetes.io/projected/d5a1b01b-53d3-4689-8713-170466122740-kube-api-access-b7lpg\") pod \"d5a1b01b-53d3-4689-8713-170466122740\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " Dec 04 22:34:01.121527 master-0 kubenswrapper[33572]: I1204 22:34:01.120975 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-dns-svc\") pod \"d5a1b01b-53d3-4689-8713-170466122740\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " Dec 04 22:34:01.121527 master-0 kubenswrapper[33572]: I1204 22:34:01.121024 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-nb\") pod \"d5a1b01b-53d3-4689-8713-170466122740\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " Dec 04 22:34:01.121527 master-0 kubenswrapper[33572]: I1204 22:34:01.121054 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-sb\") pod \"d5a1b01b-53d3-4689-8713-170466122740\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " Dec 04 22:34:01.121527 master-0 kubenswrapper[33572]: I1204 22:34:01.121078 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-config\") pod \"d5a1b01b-53d3-4689-8713-170466122740\" (UID: \"d5a1b01b-53d3-4689-8713-170466122740\") " Dec 04 22:34:01.122026 master-0 kubenswrapper[33572]: I1204 22:34:01.121866 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-lock\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.122026 
master-0 kubenswrapper[33572]: I1204 22:34:01.121919 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.122026 master-0 kubenswrapper[33572]: I1204 22:34:01.121968 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-cache\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.122129 master-0 kubenswrapper[33572]: I1204 22:34:01.122044 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c76430ca-e122-4616-94e7-49135eb7cad4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c39c75f7-1a6b-4a48-ac08-54ead83a5fe4\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.122129 master-0 kubenswrapper[33572]: I1204 22:34:01.122082 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46s69\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-kube-api-access-46s69\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.135541 master-0 kubenswrapper[33572]: I1204 22:34:01.125463 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a1b01b-53d3-4689-8713-170466122740-kube-api-access-b7lpg" (OuterVolumeSpecName: "kube-api-access-b7lpg") pod "d5a1b01b-53d3-4689-8713-170466122740" (UID: "d5a1b01b-53d3-4689-8713-170466122740"). InnerVolumeSpecName "kube-api-access-b7lpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:01.173752 master-0 kubenswrapper[33572]: I1204 22:34:01.173026 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5a1b01b-53d3-4689-8713-170466122740" (UID: "d5a1b01b-53d3-4689-8713-170466122740"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:01.179520 master-0 kubenswrapper[33572]: I1204 22:34:01.179434 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5a1b01b-53d3-4689-8713-170466122740" (UID: "d5a1b01b-53d3-4689-8713-170466122740"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:01.180361 master-0 kubenswrapper[33572]: I1204 22:34:01.180245 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-config" (OuterVolumeSpecName: "config") pod "d5a1b01b-53d3-4689-8713-170466122740" (UID: "d5a1b01b-53d3-4689-8713-170466122740"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:01.202064 master-0 kubenswrapper[33572]: I1204 22:34:01.200495 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5a1b01b-53d3-4689-8713-170466122740" (UID: "d5a1b01b-53d3-4689-8713-170466122740"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.223641 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c76430ca-e122-4616-94e7-49135eb7cad4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c39c75f7-1a6b-4a48-ac08-54ead83a5fe4\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.223709 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46s69\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-kube-api-access-46s69\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.223881 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-lock\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.223984 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.224049 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-cache\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.224149 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7lpg\" (UniqueName: \"kubernetes.io/projected/d5a1b01b-53d3-4689-8713-170466122740-kube-api-access-b7lpg\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.224168 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.224180 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.224190 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:01.225573 master-0 
kubenswrapper[33572]: I1204 22:34:01.224201 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a1b01b-53d3-4689-8713-170466122740-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.224818 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-cache\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: E1204 22:34:01.224982 33572 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: E1204 22:34:01.225010 33572 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: I1204 22:34:01.225011 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-lock\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.225573 master-0 kubenswrapper[33572]: E1204 22:34:01.225062 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift podName:d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:34:01.725040314 +0000 UTC m=+905.452566053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift") pod "swift-storage-0" (UID: "d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee") : configmap "swift-ring-files" not found Dec 04 22:34:01.227313 master-0 kubenswrapper[33572]: I1204 22:34:01.226658 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 22:34:01.227313 master-0 kubenswrapper[33572]: I1204 22:34:01.226696 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c76430ca-e122-4616-94e7-49135eb7cad4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c39c75f7-1a6b-4a48-ac08-54ead83a5fe4\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/d3dc8c0a0faae33ead6b58d2a1bb6e3a95a1e6cd3f1a53cb46b0293d6c7d04ca/globalmount\"" pod="openstack/swift-storage-0" Dec 04 22:34:01.296242 master-0 kubenswrapper[33572]: I1204 22:34:01.296108 33572 generic.go:334] "Generic (PLEG): container finished" podID="d5a1b01b-53d3-4689-8713-170466122740" containerID="1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8" exitCode=0 Dec 04 22:34:01.296439 master-0 kubenswrapper[33572]: I1204 22:34:01.296237 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" Dec 04 22:34:01.296439 master-0 kubenswrapper[33572]: I1204 22:34:01.296240 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" event={"ID":"d5a1b01b-53d3-4689-8713-170466122740","Type":"ContainerDied","Data":"1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8"} Dec 04 22:34:01.296439 master-0 kubenswrapper[33572]: I1204 22:34:01.296394 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57545c5d5f-szxns" event={"ID":"d5a1b01b-53d3-4689-8713-170466122740","Type":"ContainerDied","Data":"00e4abdf0126f856691bc8261fa9c3f9fdc5b1c24ae98f514e9e98f1bba69f82"} Dec 04 22:34:01.296439 master-0 kubenswrapper[33572]: I1204 22:34:01.296425 33572 scope.go:117] "RemoveContainer" containerID="1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8" Dec 04 22:34:01.300887 master-0 kubenswrapper[33572]: I1204 22:34:01.300849 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" event={"ID":"0797bdd1-388c-4686-982d-ce11deae84c9","Type":"ContainerStarted","Data":"a77e2f6b23a1dd5188388200f1d1c1d2bc318e534b37f612da2a01f0daa29b1b"} Dec 04 22:34:01.300887 master-0 kubenswrapper[33572]: I1204 22:34:01.300882 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:34:01.742080 master-0 kubenswrapper[33572]: I1204 22:34:01.741944 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.742920 master-0 kubenswrapper[33572]: E1204 22:34:01.742202 33572 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 22:34:01.742920 master-0 kubenswrapper[33572]: E1204 22:34:01.742259 33572 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 22:34:01.742920 master-0 kubenswrapper[33572]: E1204 22:34:01.742357 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift podName:d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:34:02.742335305 +0000 UTC m=+906.469860964 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift") pod "swift-storage-0" (UID: "d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee") : configmap "swift-ring-files" not found Dec 04 22:34:01.838212 master-0 kubenswrapper[33572]: I1204 22:34:01.838144 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46s69\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-kube-api-access-46s69\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:01.876020 master-0 kubenswrapper[33572]: I1204 22:34:01.875644 33572 scope.go:117] "RemoveContainer" containerID="01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6" Dec 04 22:34:01.898418 master-0 kubenswrapper[33572]: I1204 22:34:01.898336 33572 scope.go:117] "RemoveContainer" containerID="1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8" Dec 04 22:34:01.899387 master-0 kubenswrapper[33572]: E1204 22:34:01.899333 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8\": container with ID starting with 1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8 not found: ID does not exist" containerID="1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8" Dec 04 22:34:01.899466 master-0 kubenswrapper[33572]: I1204 22:34:01.899388 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8"} err="failed to get container status \"1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8\": rpc error: code = NotFound desc = could not find container \"1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8\": container with ID starting with 1edf9f1ae53cdf3e4526590c2ff0bc6e0c2ccdee0887d73c09449383e21054c8 not found: ID does not exist" Dec 04 22:34:01.899466 master-0 kubenswrapper[33572]: I1204 22:34:01.899414 33572 scope.go:117] "RemoveContainer" containerID="01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6" Dec 04 22:34:01.899842 master-0 kubenswrapper[33572]: E1204 22:34:01.899797 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6\": container with ID starting with 01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6 not found: ID does not exist" containerID="01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6" Dec 04 22:34:01.899908 master-0 kubenswrapper[33572]: I1204 22:34:01.899868 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6"} err="failed to get container status \"01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6\": rpc error: code = NotFound desc = could not find container \"01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6\": container with ID starting with 01676529fdf90831d67596b25610c1820eb1eff35cd98506c01dbfdc38f845b6 not found: ID does not exist" Dec 04 22:34:02.448550 master-0 kubenswrapper[33572]: I1204 22:34:02.445052 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" podStartSLOduration=4.445014087 podStartE2EDuration="4.445014087s" podCreationTimestamp="2025-12-04 22:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:02.430032022 +0000 UTC m=+906.157557711" watchObservedRunningTime="2025-12-04 22:34:02.445014087 +0000 UTC m=+906.172539786" Dec 04 22:34:02.616895 master-0 kubenswrapper[33572]: I1204 22:34:02.616822 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c76430ca-e122-4616-94e7-49135eb7cad4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c39c75f7-1a6b-4a48-ac08-54ead83a5fe4\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:02.763662 master-0 kubenswrapper[33572]: I1204 22:34:02.763418 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:02.763662 master-0 kubenswrapper[33572]: E1204 22:34:02.763619 33572 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 22:34:02.763662 master-0 kubenswrapper[33572]: E1204 22:34:02.763651 33572 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 22:34:02.764742 master-0 kubenswrapper[33572]: E1204 22:34:02.763737 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift podName:d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:34:04.763710918 +0000 UTC m=+908.491236577 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift") pod "swift-storage-0" (UID: "d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee") : configmap "swift-ring-files" not found Dec 04 22:34:03.651092 master-0 kubenswrapper[33572]: I1204 22:34:03.651014 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2vjbq"] Dec 04 22:34:03.651670 master-0 kubenswrapper[33572]: E1204 22:34:03.651635 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a1b01b-53d3-4689-8713-170466122740" containerName="init" Dec 04 22:34:03.651670 master-0 kubenswrapper[33572]: I1204 22:34:03.651668 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a1b01b-53d3-4689-8713-170466122740" containerName="init" Dec 04 22:34:03.651819 master-0 kubenswrapper[33572]: E1204 22:34:03.651739 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a1b01b-53d3-4689-8713-170466122740" containerName="dnsmasq-dns" Dec 04 22:34:03.651819 master-0 kubenswrapper[33572]: I1204 22:34:03.651751 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a1b01b-53d3-4689-8713-170466122740" containerName="dnsmasq-dns" Dec 04 22:34:03.652088 master-0 kubenswrapper[33572]: I1204 22:34:03.652058 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a1b01b-53d3-4689-8713-170466122740" containerName="dnsmasq-dns" Dec 04 22:34:03.653125 master-0 kubenswrapper[33572]: I1204 22:34:03.653089 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.655839 master-0 kubenswrapper[33572]: I1204 22:34:03.655793 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 04 22:34:03.655958 master-0 kubenswrapper[33572]: I1204 22:34:03.655794 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 22:34:03.656029 master-0 kubenswrapper[33572]: I1204 22:34:03.655971 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 04 22:34:03.677151 master-0 kubenswrapper[33572]: I1204 22:34:03.677106 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 22:34:03.677305 master-0 kubenswrapper[33572]: I1204 22:34:03.677170 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 04 22:34:03.784846 master-0 kubenswrapper[33572]: I1204 22:34:03.784756 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-dispersionconf\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.784846 master-0 kubenswrapper[33572]: I1204 22:34:03.784845 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-ring-data-devices\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.785949 master-0 kubenswrapper[33572]: I1204 22:34:03.785033 33572 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-swiftconf\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.785949 master-0 kubenswrapper[33572]: I1204 22:34:03.785439 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-scripts\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.785949 master-0 kubenswrapper[33572]: I1204 22:34:03.785592 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtz9s\" (UniqueName: \"kubernetes.io/projected/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-kube-api-access-qtz9s\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.785949 master-0 kubenswrapper[33572]: I1204 22:34:03.785681 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-etc-swift\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.785949 master-0 kubenswrapper[33572]: I1204 22:34:03.785738 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-combined-ca-bundle\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.798694 master-0 kubenswrapper[33572]: I1204 22:34:03.798638 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 22:34:03.888076 master-0 kubenswrapper[33572]: I1204 22:34:03.887979 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-ring-data-devices\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.888076 master-0 kubenswrapper[33572]: I1204 22:34:03.888095 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-swiftconf\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.888497 master-0 kubenswrapper[33572]: I1204 22:34:03.888213 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-scripts\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.888497 master-0 kubenswrapper[33572]: I1204 22:34:03.888245 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtz9s\" (UniqueName: 
\"kubernetes.io/projected/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-kube-api-access-qtz9s\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.888497 master-0 kubenswrapper[33572]: I1204 22:34:03.888281 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-etc-swift\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.888497 master-0 kubenswrapper[33572]: I1204 22:34:03.888319 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-combined-ca-bundle\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.888497 master-0 kubenswrapper[33572]: I1204 22:34:03.888364 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-dispersionconf\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.889021 master-0 kubenswrapper[33572]: I1204 22:34:03.888960 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-ring-data-devices\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.890644 master-0 kubenswrapper[33572]: I1204 22:34:03.889593 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-etc-swift\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.890644 master-0 kubenswrapper[33572]: I1204 22:34:03.889788 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-scripts\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.891652 master-0 kubenswrapper[33572]: I1204 22:34:03.891609 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-swiftconf\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.892903 master-0 kubenswrapper[33572]: I1204 22:34:03.892858 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-combined-ca-bundle\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:03.909602 master-0 kubenswrapper[33572]: I1204 22:34:03.909408 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-dispersionconf\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:04.165199 master-0 kubenswrapper[33572]: I1204 22:34:04.165025 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2vjbq"] Dec 04 22:34:04.462005 master-0 kubenswrapper[33572]: I1204 22:34:04.461822 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 22:34:04.809954 master-0 kubenswrapper[33572]: I1204 22:34:04.809872 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:04.810754 master-0 kubenswrapper[33572]: E1204 22:34:04.810081 33572 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 22:34:04.810754 master-0 kubenswrapper[33572]: E1204 22:34:04.810101 33572 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 22:34:04.810754 master-0 kubenswrapper[33572]: E1204 22:34:04.810162 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift podName:d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:34:08.810141566 +0000 UTC m=+912.537667215 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift") pod "swift-storage-0" (UID: "d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee") : configmap "swift-ring-files" not found Dec 04 22:34:04.870972 master-0 kubenswrapper[33572]: I1204 22:34:04.870838 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtz9s\" (UniqueName: \"kubernetes.io/projected/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-kube-api-access-qtz9s\") pod \"swift-ring-rebalance-2vjbq\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:04.871418 master-0 kubenswrapper[33572]: I1204 22:34:04.871394 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:04.876594 master-0 kubenswrapper[33572]: I1204 22:34:04.876114 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57545c5d5f-szxns"] Dec 04 22:34:05.253118 master-0 kubenswrapper[33572]: I1204 22:34:05.249695 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57545c5d5f-szxns"] Dec 04 22:34:05.820644 master-0 kubenswrapper[33572]: I1204 22:34:05.820582 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2vjbq"] Dec 04 22:34:05.826243 master-0 kubenswrapper[33572]: W1204 22:34:05.826194 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6914cf0c_7fd5_47d0_8a01_fc58a6c83abb.slice/crio-02ecbd2b0fc96c3b0369559f435470eed3f42bb8d3765827c131939b9e33a9ca WatchSource:0}: Error finding container 02ecbd2b0fc96c3b0369559f435470eed3f42bb8d3765827c131939b9e33a9ca: Status 404 returned error can't find the container with id 02ecbd2b0fc96c3b0369559f435470eed3f42bb8d3765827c131939b9e33a9ca Dec 04 22:34:06.369974 master-0 kubenswrapper[33572]: I1204 22:34:06.369917 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1ddb5e10-d309-4f90-be98-ce86057661f1","Type":"ContainerStarted","Data":"41e7eee3cb1b7393e89fe171eb50ebd0323bc1ccfc4e7ae9433a8a93ae8969a8"} Dec 04 22:34:06.369974 master-0 kubenswrapper[33572]: I1204 22:34:06.369968 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1ddb5e10-d309-4f90-be98-ce86057661f1","Type":"ContainerStarted","Data":"32096da4b282598023fee67879cbaef96a10f99ea517c7b049e64ac299700bd0"} Dec 04 22:34:06.370235 master-0 kubenswrapper[33572]: I1204 22:34:06.370078 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 04 22:34:06.371965 master-0 kubenswrapper[33572]: I1204 22:34:06.371906 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2vjbq" event={"ID":"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb","Type":"ContainerStarted","Data":"02ecbd2b0fc96c3b0369559f435470eed3f42bb8d3765827c131939b9e33a9ca"} Dec 04 22:34:06.400626 master-0 kubenswrapper[33572]: I1204 22:34:06.400495 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.796895498 podStartE2EDuration="8.400475413s" podCreationTimestamp="2025-12-04 22:33:58 +0000 UTC" firstStartedPulling="2025-12-04 22:33:59.683986725 +0000 UTC m=+903.411512394" lastFinishedPulling="2025-12-04 22:34:05.28756664 +0000 UTC m=+909.015092309" observedRunningTime="2025-12-04 22:34:06.39607573 +0000 UTC m=+910.123601379" watchObservedRunningTime="2025-12-04 22:34:06.400475413 +0000 UTC m=+910.128001062" Dec 04 22:34:06.553930 master-0 kubenswrapper[33572]: I1204 22:34:06.553584 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a1b01b-53d3-4689-8713-170466122740" path="/var/lib/kubelet/pods/d5a1b01b-53d3-4689-8713-170466122740/volumes" Dec 04 22:34:07.052862 master-0 kubenswrapper[33572]: I1204 22:34:07.052812 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 22:34:07.146535 master-0 kubenswrapper[33572]: I1204 22:34:07.146480 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 04 
22:34:08.831020 master-0 kubenswrapper[33572]: I1204 22:34:08.830965 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:08.831525 master-0 kubenswrapper[33572]: E1204 22:34:08.831220 33572 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 22:34:08.831525 master-0 kubenswrapper[33572]: E1204 22:34:08.831237 33572 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 22:34:08.831525 master-0 kubenswrapper[33572]: E1204 22:34:08.831285 33572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift podName:d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee nodeName:}" failed. No retries permitted until 2025-12-04 22:34:16.831270104 +0000 UTC m=+920.558795753 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift") pod "swift-storage-0" (UID: "d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee") : configmap "swift-ring-files" not found Dec 04 22:34:08.996883 master-0 kubenswrapper[33572]: I1204 22:34:08.996794 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:34:09.108470 master-0 kubenswrapper[33572]: I1204 22:34:09.101796 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-2kwsl"] Dec 04 22:34:09.108470 master-0 kubenswrapper[33572]: I1204 22:34:09.102046 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" podUID="9210fc14-13be-430c-a269-be48b495a428" containerName="dnsmasq-dns" containerID="cri-o://d91ca408628685973ee5f939d43797e6800953db5f0ac1bd2220d2cd13c1e724" gracePeriod=10 Dec 04 22:34:09.415581 master-0 kubenswrapper[33572]: I1204 22:34:09.415148 33572 generic.go:334] "Generic (PLEG): container finished" podID="9210fc14-13be-430c-a269-be48b495a428" containerID="d91ca408628685973ee5f939d43797e6800953db5f0ac1bd2220d2cd13c1e724" exitCode=0 Dec 04 22:34:09.415581 master-0 kubenswrapper[33572]: I1204 22:34:09.415190 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" event={"ID":"9210fc14-13be-430c-a269-be48b495a428","Type":"ContainerDied","Data":"d91ca408628685973ee5f939d43797e6800953db5f0ac1bd2220d2cd13c1e724"} Dec 04 22:34:09.683667 master-0 kubenswrapper[33572]: I1204 22:34:09.683558 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:34:09.757316 master-0 kubenswrapper[33572]: I1204 22:34:09.756981 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-dns-svc\") pod \"9210fc14-13be-430c-a269-be48b495a428\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " Dec 04 22:34:09.757316 master-0 kubenswrapper[33572]: I1204 22:34:09.757227 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhc5w\" (UniqueName: \"kubernetes.io/projected/9210fc14-13be-430c-a269-be48b495a428-kube-api-access-nhc5w\") pod \"9210fc14-13be-430c-a269-be48b495a428\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " Dec 04 22:34:09.757316 master-0 kubenswrapper[33572]: I1204 22:34:09.757248 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-config\") pod \"9210fc14-13be-430c-a269-be48b495a428\" (UID: \"9210fc14-13be-430c-a269-be48b495a428\") " Dec 04 22:34:09.770848 master-0 kubenswrapper[33572]: I1204 22:34:09.762831 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9210fc14-13be-430c-a269-be48b495a428-kube-api-access-nhc5w" (OuterVolumeSpecName: "kube-api-access-nhc5w") pod "9210fc14-13be-430c-a269-be48b495a428" (UID: "9210fc14-13be-430c-a269-be48b495a428"). InnerVolumeSpecName "kube-api-access-nhc5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:09.805021 master-0 kubenswrapper[33572]: I1204 22:34:09.804943 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-config" (OuterVolumeSpecName: "config") pod "9210fc14-13be-430c-a269-be48b495a428" (UID: "9210fc14-13be-430c-a269-be48b495a428"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:09.806360 master-0 kubenswrapper[33572]: I1204 22:34:09.806292 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9210fc14-13be-430c-a269-be48b495a428" (UID: "9210fc14-13be-430c-a269-be48b495a428"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:09.859681 master-0 kubenswrapper[33572]: I1204 22:34:09.859622 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhc5w\" (UniqueName: \"kubernetes.io/projected/9210fc14-13be-430c-a269-be48b495a428-kube-api-access-nhc5w\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:09.859681 master-0 kubenswrapper[33572]: I1204 22:34:09.859659 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:09.859681 master-0 kubenswrapper[33572]: I1204 22:34:09.859672 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9210fc14-13be-430c-a269-be48b495a428-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:10.445217 master-0 kubenswrapper[33572]: I1204 22:34:10.445158 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" Dec 04 22:34:10.445217 master-0 kubenswrapper[33572]: I1204 22:34:10.445175 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bff68687-2kwsl" event={"ID":"9210fc14-13be-430c-a269-be48b495a428","Type":"ContainerDied","Data":"d4051d793b183763bff507fccbc9b1f09deaf49d1d873c128fd1d96a94e961cc"} Dec 04 22:34:10.445491 master-0 kubenswrapper[33572]: I1204 22:34:10.445257 33572 scope.go:117] "RemoveContainer" containerID="d91ca408628685973ee5f939d43797e6800953db5f0ac1bd2220d2cd13c1e724" Dec 04 22:34:10.448658 master-0 kubenswrapper[33572]: I1204 22:34:10.448169 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2vjbq" event={"ID":"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb","Type":"ContainerStarted","Data":"907de62b16faa4b9b56e7adeed53ba1d1bd70102326b3845ae2d9d9816e4aba3"} Dec 04 22:34:10.474040 master-0 kubenswrapper[33572]: I1204 22:34:10.473945 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2vjbq" podStartSLOduration=5.112546905 podStartE2EDuration="8.473924051s" podCreationTimestamp="2025-12-04 22:34:02 +0000 UTC" firstStartedPulling="2025-12-04 22:34:05.828833335 +0000 UTC m=+909.556359004" lastFinishedPulling="2025-12-04 22:34:09.190210501 +0000 UTC m=+912.917736150" observedRunningTime="2025-12-04 22:34:10.473057747 +0000 UTC m=+914.200583456" watchObservedRunningTime="2025-12-04 22:34:10.473924051 +0000 UTC m=+914.201449740" Dec 04 22:34:10.476306 master-0 kubenswrapper[33572]: I1204 22:34:10.476248 33572 scope.go:117] "RemoveContainer" containerID="7de6d891a052880605bc1258a0f158129737cf872a2328a68ccfc82bf4d26184" Dec 04 22:34:10.509812 master-0 kubenswrapper[33572]: I1204 22:34:10.509448 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-2kwsl"] Dec 04 22:34:10.519185 master-0 kubenswrapper[33572]: I1204 22:34:10.519143 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-2kwsl"] Dec 04 22:34:10.545101 master-0 kubenswrapper[33572]: I1204 22:34:10.545021 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9210fc14-13be-430c-a269-be48b495a428" path="/var/lib/kubelet/pods/9210fc14-13be-430c-a269-be48b495a428/volumes" Dec 04 22:34:11.961349 master-0 kubenswrapper[33572]: I1204 22:34:11.961271 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f80f-account-create-update-mtpsx"] Dec 04 22:34:11.962429 master-0 kubenswrapper[33572]: E1204 22:34:11.962377 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9210fc14-13be-430c-a269-be48b495a428" containerName="init" Dec 04 22:34:11.962429 master-0 kubenswrapper[33572]: I1204 22:34:11.962398 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9210fc14-13be-430c-a269-be48b495a428" containerName="init" Dec 04 22:34:11.962610 master-0 kubenswrapper[33572]: E1204 22:34:11.962482 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9210fc14-13be-430c-a269-be48b495a428" containerName="dnsmasq-dns" Dec 04 22:34:11.962610 master-0 kubenswrapper[33572]: I1204 22:34:11.962492 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9210fc14-13be-430c-a269-be48b495a428" containerName="dnsmasq-dns" Dec 04 22:34:11.963125 master-0 kubenswrapper[33572]: I1204 22:34:11.963091 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9210fc14-13be-430c-a269-be48b495a428" 
containerName="dnsmasq-dns" Dec 04 22:34:11.964280 master-0 kubenswrapper[33572]: I1204 22:34:11.964247 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:11.971301 master-0 kubenswrapper[33572]: I1204 22:34:11.971226 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 22:34:11.990466 master-0 kubenswrapper[33572]: I1204 22:34:11.990400 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nn74s"] Dec 04 22:34:12.002922 master-0 kubenswrapper[33572]: I1204 22:34:11.999330 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nn74s" Dec 04 22:34:12.009617 master-0 kubenswrapper[33572]: I1204 22:34:12.009564 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f80f-account-create-update-mtpsx"] Dec 04 22:34:12.027844 master-0 kubenswrapper[33572]: I1204 22:34:12.027790 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nn74s"] Dec 04 22:34:12.122876 master-0 kubenswrapper[33572]: I1204 22:34:12.122796 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-operator-scripts\") pod \"glance-db-create-nn74s\" (UID: \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\") " pod="openstack/glance-db-create-nn74s" Dec 04 22:34:12.122876 master-0 kubenswrapper[33572]: I1204 22:34:12.122873 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mg8k\" (UniqueName: \"kubernetes.io/projected/3418162d-92a3-429f-b73f-13e85aab7f44-kube-api-access-6mg8k\") pod \"glance-f80f-account-create-update-mtpsx\" (UID: \"3418162d-92a3-429f-b73f-13e85aab7f44\") " pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:12.123159 master-0 kubenswrapper[33572]: I1204 22:34:12.122935 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3418162d-92a3-429f-b73f-13e85aab7f44-operator-scripts\") pod \"glance-f80f-account-create-update-mtpsx\" (UID: \"3418162d-92a3-429f-b73f-13e85aab7f44\") " pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:12.123458 master-0 kubenswrapper[33572]: I1204 22:34:12.123189 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvdll\" (UniqueName: \"kubernetes.io/projected/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-kube-api-access-xvdll\") pod \"glance-db-create-nn74s\" (UID: \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\") " pod="openstack/glance-db-create-nn74s" Dec 04 22:34:12.225610 master-0 kubenswrapper[33572]: I1204 22:34:12.225461 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvdll\" (UniqueName: \"kubernetes.io/projected/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-kube-api-access-xvdll\") pod \"glance-db-create-nn74s\" (UID: \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\") " pod="openstack/glance-db-create-nn74s" Dec 04 22:34:12.225610 master-0 kubenswrapper[33572]: I1204 22:34:12.225601 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-operator-scripts\") pod \"glance-db-create-nn74s\" (UID: \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\") " pod="openstack/glance-db-create-nn74s" Dec 04 22:34:12.225862 master-0 kubenswrapper[33572]: I1204 22:34:12.225643 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mg8k\" (UniqueName: \"kubernetes.io/projected/3418162d-92a3-429f-b73f-13e85aab7f44-kube-api-access-6mg8k\") pod \"glance-f80f-account-create-update-mtpsx\" (UID: \"3418162d-92a3-429f-b73f-13e85aab7f44\") " pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:12.225862 master-0 kubenswrapper[33572]: I1204 22:34:12.225698 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3418162d-92a3-429f-b73f-13e85aab7f44-operator-scripts\") pod \"glance-f80f-account-create-update-mtpsx\" (UID: \"3418162d-92a3-429f-b73f-13e85aab7f44\") " pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:12.226332 master-0 kubenswrapper[33572]: I1204 22:34:12.226291 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-operator-scripts\") pod \"glance-db-create-nn74s\" (UID: \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\") " pod="openstack/glance-db-create-nn74s" Dec 04 22:34:12.226943 master-0 kubenswrapper[33572]: I1204 22:34:12.226908 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3418162d-92a3-429f-b73f-13e85aab7f44-operator-scripts\") pod \"glance-f80f-account-create-update-mtpsx\" (UID: \"3418162d-92a3-429f-b73f-13e85aab7f44\") " pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:12.251856 master-0 kubenswrapper[33572]: I1204 22:34:12.251792 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvdll\" (UniqueName: \"kubernetes.io/projected/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-kube-api-access-xvdll\") pod \"glance-db-create-nn74s\" (UID: \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\") " pod="openstack/glance-db-create-nn74s" Dec 04 22:34:12.253606 master-0 kubenswrapper[33572]: I1204 22:34:12.253553 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mg8k\" (UniqueName: \"kubernetes.io/projected/3418162d-92a3-429f-b73f-13e85aab7f44-kube-api-access-6mg8k\") pod \"glance-f80f-account-create-update-mtpsx\" (UID: \"3418162d-92a3-429f-b73f-13e85aab7f44\") " pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:12.290024 master-0 kubenswrapper[33572]: I1204 22:34:12.289970 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:12.330973 master-0 kubenswrapper[33572]: I1204 22:34:12.330897 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nn74s" Dec 04 22:34:12.832560 master-0 kubenswrapper[33572]: I1204 22:34:12.832362 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f80f-account-create-update-mtpsx"] Dec 04 22:34:12.978994 master-0 kubenswrapper[33572]: W1204 22:34:12.978892 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d9f3fad_681e_4cf1_8224_99ae62df7ad9.slice/crio-c5772130806bec5c8636423e4141bc374980fc1b6dcd42f5c300b96a07bb7b76 WatchSource:0}: Error finding container c5772130806bec5c8636423e4141bc374980fc1b6dcd42f5c300b96a07bb7b76: Status 404 returned error can't find the container with id c5772130806bec5c8636423e4141bc374980fc1b6dcd42f5c300b96a07bb7b76 Dec 04 22:34:12.980486 master-0 kubenswrapper[33572]: I1204 22:34:12.979431 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nn74s"] Dec 04 22:34:13.494114 master-0 kubenswrapper[33572]: I1204 22:34:13.494009 33572 generic.go:334] "Generic (PLEG): container finished" podID="3418162d-92a3-429f-b73f-13e85aab7f44" containerID="c7d104dae7313232538706504d7586995262ea9fea00a7509ae95447fd126569" exitCode=0 Dec 04 22:34:13.494114 master-0 kubenswrapper[33572]: I1204 22:34:13.494078 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f80f-account-create-update-mtpsx" event={"ID":"3418162d-92a3-429f-b73f-13e85aab7f44","Type":"ContainerDied","Data":"c7d104dae7313232538706504d7586995262ea9fea00a7509ae95447fd126569"} Dec 04 22:34:13.494429 master-0 kubenswrapper[33572]: I1204 22:34:13.494159 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f80f-account-create-update-mtpsx" event={"ID":"3418162d-92a3-429f-b73f-13e85aab7f44","Type":"ContainerStarted","Data":"ffc47432cd331416fd5d64c68b80b692ed9dd5e1e38336e68318aec1ae2387c7"} Dec 04 22:34:13.498713 master-0 kubenswrapper[33572]: I1204 22:34:13.498676 33572 generic.go:334] "Generic (PLEG): container finished" podID="7d9f3fad-681e-4cf1-8224-99ae62df7ad9" containerID="823485e52d9ee5d9d0786946d18b7c233caf1cf514e6f0afb6f9b0adfb248c1b" exitCode=0 Dec 04 22:34:13.498806 master-0 kubenswrapper[33572]: I1204 22:34:13.498722 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nn74s" event={"ID":"7d9f3fad-681e-4cf1-8224-99ae62df7ad9","Type":"ContainerDied","Data":"823485e52d9ee5d9d0786946d18b7c233caf1cf514e6f0afb6f9b0adfb248c1b"} Dec 04 22:34:13.498806 master-0 kubenswrapper[33572]: I1204 22:34:13.498754 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nn74s" event={"ID":"7d9f3fad-681e-4cf1-8224-99ae62df7ad9","Type":"ContainerStarted","Data":"c5772130806bec5c8636423e4141bc374980fc1b6dcd42f5c300b96a07bb7b76"} Dec 04 22:34:15.124067 master-0 kubenswrapper[33572]: I1204 22:34:15.124008 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nn74s" Dec 04 22:34:15.132222 master-0 kubenswrapper[33572]: I1204 22:34:15.132169 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:15.305245 master-0 kubenswrapper[33572]: I1204 22:34:15.305170 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mg8k\" (UniqueName: \"kubernetes.io/projected/3418162d-92a3-429f-b73f-13e85aab7f44-kube-api-access-6mg8k\") pod \"3418162d-92a3-429f-b73f-13e85aab7f44\" (UID: \"3418162d-92a3-429f-b73f-13e85aab7f44\") " Dec 04 22:34:15.305492 master-0 kubenswrapper[33572]: I1204 22:34:15.305461 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-operator-scripts\") pod \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\" (UID: \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\") " Dec 04 22:34:15.305780 master-0 kubenswrapper[33572]: I1204 22:34:15.305747 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3418162d-92a3-429f-b73f-13e85aab7f44-operator-scripts\") pod \"3418162d-92a3-429f-b73f-13e85aab7f44\" (UID: \"3418162d-92a3-429f-b73f-13e85aab7f44\") " Dec 04 22:34:15.305830 master-0 kubenswrapper[33572]: I1204 22:34:15.305808 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvdll\" (UniqueName: \"kubernetes.io/projected/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-kube-api-access-xvdll\") pod \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\" (UID: \"7d9f3fad-681e-4cf1-8224-99ae62df7ad9\") " Dec 04 22:34:15.306717 master-0 kubenswrapper[33572]: I1204 22:34:15.306631 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d9f3fad-681e-4cf1-8224-99ae62df7ad9" (UID: "7d9f3fad-681e-4cf1-8224-99ae62df7ad9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:15.306963 master-0 kubenswrapper[33572]: I1204 22:34:15.306882 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3418162d-92a3-429f-b73f-13e85aab7f44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3418162d-92a3-429f-b73f-13e85aab7f44" (UID: "3418162d-92a3-429f-b73f-13e85aab7f44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:15.309777 master-0 kubenswrapper[33572]: I1204 22:34:15.309748 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3418162d-92a3-429f-b73f-13e85aab7f44-kube-api-access-6mg8k" (OuterVolumeSpecName: "kube-api-access-6mg8k") pod "3418162d-92a3-429f-b73f-13e85aab7f44" (UID: "3418162d-92a3-429f-b73f-13e85aab7f44"). InnerVolumeSpecName "kube-api-access-6mg8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:15.309929 master-0 kubenswrapper[33572]: I1204 22:34:15.309800 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-kube-api-access-xvdll" (OuterVolumeSpecName: "kube-api-access-xvdll") pod "7d9f3fad-681e-4cf1-8224-99ae62df7ad9" (UID: "7d9f3fad-681e-4cf1-8224-99ae62df7ad9"). InnerVolumeSpecName "kube-api-access-xvdll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:15.408358 master-0 kubenswrapper[33572]: I1204 22:34:15.408282 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3418162d-92a3-429f-b73f-13e85aab7f44-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:15.408358 master-0 kubenswrapper[33572]: I1204 22:34:15.408335 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvdll\" (UniqueName: \"kubernetes.io/projected/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-kube-api-access-xvdll\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:15.408358 master-0 kubenswrapper[33572]: I1204 22:34:15.408355 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mg8k\" (UniqueName: \"kubernetes.io/projected/3418162d-92a3-429f-b73f-13e85aab7f44-kube-api-access-6mg8k\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:15.408358 master-0 kubenswrapper[33572]: I1204 22:34:15.408372 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d9f3fad-681e-4cf1-8224-99ae62df7ad9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:15.527228 master-0 kubenswrapper[33572]: I1204 22:34:15.527134 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f80f-account-create-update-mtpsx" Dec 04 22:34:15.527228 master-0 kubenswrapper[33572]: I1204 22:34:15.527170 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f80f-account-create-update-mtpsx" event={"ID":"3418162d-92a3-429f-b73f-13e85aab7f44","Type":"ContainerDied","Data":"ffc47432cd331416fd5d64c68b80b692ed9dd5e1e38336e68318aec1ae2387c7"} Dec 04 22:34:15.527228 master-0 kubenswrapper[33572]: I1204 22:34:15.527241 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffc47432cd331416fd5d64c68b80b692ed9dd5e1e38336e68318aec1ae2387c7" Dec 04 22:34:15.530146 master-0 kubenswrapper[33572]: I1204 22:34:15.530016 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nn74s" event={"ID":"7d9f3fad-681e-4cf1-8224-99ae62df7ad9","Type":"ContainerDied","Data":"c5772130806bec5c8636423e4141bc374980fc1b6dcd42f5c300b96a07bb7b76"} Dec 04 22:34:15.530146 master-0 kubenswrapper[33572]: I1204 22:34:15.530090 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5772130806bec5c8636423e4141bc374980fc1b6dcd42f5c300b96a07bb7b76" Dec 04 22:34:15.530146 master-0 kubenswrapper[33572]: I1204 22:34:15.530135 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nn74s" Dec 04 22:34:16.256843 master-0 kubenswrapper[33572]: I1204 22:34:16.256790 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4n4f2"] Dec 04 22:34:16.257365 master-0 kubenswrapper[33572]: E1204 22:34:16.257230 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3418162d-92a3-429f-b73f-13e85aab7f44" containerName="mariadb-account-create-update" Dec 04 22:34:16.257365 master-0 kubenswrapper[33572]: I1204 22:34:16.257246 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3418162d-92a3-429f-b73f-13e85aab7f44" containerName="mariadb-account-create-update" Dec 04 22:34:16.257365 master-0 kubenswrapper[33572]: E1204 22:34:16.257306 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9f3fad-681e-4cf1-8224-99ae62df7ad9" containerName="mariadb-database-create" Dec 04 22:34:16.257365 master-0 kubenswrapper[33572]: I1204 22:34:16.257313 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9f3fad-681e-4cf1-8224-99ae62df7ad9" containerName="mariadb-database-create" Dec 04 22:34:16.257529 master-0 kubenswrapper[33572]: I1204 22:34:16.257488 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9f3fad-681e-4cf1-8224-99ae62df7ad9" containerName="mariadb-database-create" Dec 04 22:34:16.257568 master-0 kubenswrapper[33572]: I1204 22:34:16.257540 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="3418162d-92a3-429f-b73f-13e85aab7f44" containerName="mariadb-account-create-update" Dec 04 22:34:16.258335 master-0 kubenswrapper[33572]: I1204 22:34:16.258306 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:16.273575 master-0 kubenswrapper[33572]: I1204 22:34:16.269187 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4n4f2"] Dec 04 22:34:16.365986 master-0 kubenswrapper[33572]: I1204 22:34:16.365888 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a06d-account-create-update-pk8fp"] Dec 04 22:34:16.367620 master-0 kubenswrapper[33572]: I1204 22:34:16.367584 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:16.370019 master-0 kubenswrapper[33572]: I1204 22:34:16.369975 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 22:34:16.372007 master-0 kubenswrapper[33572]: I1204 22:34:16.371964 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a06d-account-create-update-pk8fp"] Dec 04 22:34:16.437066 master-0 kubenswrapper[33572]: I1204 22:34:16.437016 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95a2cfd-787d-4d6a-94b6-38938be00535-operator-scripts\") pod \"keystone-db-create-4n4f2\" (UID: \"a95a2cfd-787d-4d6a-94b6-38938be00535\") " pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:16.437268 master-0 kubenswrapper[33572]: I1204 22:34:16.437083 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzzb6\" (UniqueName: \"kubernetes.io/projected/a95a2cfd-787d-4d6a-94b6-38938be00535-kube-api-access-zzzb6\") pod \"keystone-db-create-4n4f2\" (UID: \"a95a2cfd-787d-4d6a-94b6-38938be00535\") " pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:16.539155 master-0 kubenswrapper[33572]: I1204 22:34:16.539078 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4fg4\" (UniqueName: \"kubernetes.io/projected/b52dac30-a647-4928-a2c3-849bab534073-kube-api-access-d4fg4\") pod \"keystone-a06d-account-create-update-pk8fp\" (UID: \"b52dac30-a647-4928-a2c3-849bab534073\") " pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:16.539337 master-0 kubenswrapper[33572]: I1204 22:34:16.539204 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95a2cfd-787d-4d6a-94b6-38938be00535-operator-scripts\") pod \"keystone-db-create-4n4f2\" (UID: \"a95a2cfd-787d-4d6a-94b6-38938be00535\") " pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:16.539539 master-0 kubenswrapper[33572]: I1204 22:34:16.539455 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b52dac30-a647-4928-a2c3-849bab534073-operator-scripts\") pod \"keystone-a06d-account-create-update-pk8fp\" (UID: \"b52dac30-a647-4928-a2c3-849bab534073\") " pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:16.539671 master-0 kubenswrapper[33572]: I1204 22:34:16.539634 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzzb6\" (UniqueName: \"kubernetes.io/projected/a95a2cfd-787d-4d6a-94b6-38938be00535-kube-api-access-zzzb6\") pod \"keystone-db-create-4n4f2\" (UID: \"a95a2cfd-787d-4d6a-94b6-38938be00535\") " pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:16.540195 master-0 kubenswrapper[33572]: I1204 22:34:16.540175 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95a2cfd-787d-4d6a-94b6-38938be00535-operator-scripts\") pod \"keystone-db-create-4n4f2\" (UID: \"a95a2cfd-787d-4d6a-94b6-38938be00535\") " pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:16.544360 master-0 kubenswrapper[33572]: I1204 22:34:16.544310 33572 generic.go:334] "Generic (PLEG): container finished" 
podID="6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" containerID="907de62b16faa4b9b56e7adeed53ba1d1bd70102326b3845ae2d9d9816e4aba3" exitCode=0 Dec 04 22:34:16.550877 master-0 kubenswrapper[33572]: I1204 22:34:16.550605 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2vjbq" event={"ID":"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb","Type":"ContainerDied","Data":"907de62b16faa4b9b56e7adeed53ba1d1bd70102326b3845ae2d9d9816e4aba3"} Dec 04 22:34:16.568777 master-0 kubenswrapper[33572]: I1204 22:34:16.568732 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzzb6\" (UniqueName: \"kubernetes.io/projected/a95a2cfd-787d-4d6a-94b6-38938be00535-kube-api-access-zzzb6\") pod \"keystone-db-create-4n4f2\" (UID: \"a95a2cfd-787d-4d6a-94b6-38938be00535\") " pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:16.583259 master-0 kubenswrapper[33572]: I1204 22:34:16.583170 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:16.642457 master-0 kubenswrapper[33572]: I1204 22:34:16.642377 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4fg4\" (UniqueName: \"kubernetes.io/projected/b52dac30-a647-4928-a2c3-849bab534073-kube-api-access-d4fg4\") pod \"keystone-a06d-account-create-update-pk8fp\" (UID: \"b52dac30-a647-4928-a2c3-849bab534073\") " pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:16.642764 master-0 kubenswrapper[33572]: I1204 22:34:16.642652 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b52dac30-a647-4928-a2c3-849bab534073-operator-scripts\") pod \"keystone-a06d-account-create-update-pk8fp\" (UID: \"b52dac30-a647-4928-a2c3-849bab534073\") " pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:16.644386 master-0 kubenswrapper[33572]: I1204 22:34:16.644272 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b52dac30-a647-4928-a2c3-849bab534073-operator-scripts\") pod \"keystone-a06d-account-create-update-pk8fp\" (UID: \"b52dac30-a647-4928-a2c3-849bab534073\") " pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:16.662296 master-0 kubenswrapper[33572]: I1204 22:34:16.662209 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4fg4\" (UniqueName: \"kubernetes.io/projected/b52dac30-a647-4928-a2c3-849bab534073-kube-api-access-d4fg4\") pod \"keystone-a06d-account-create-update-pk8fp\" (UID: \"b52dac30-a647-4928-a2c3-849bab534073\") " pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:16.717153 master-0 kubenswrapper[33572]: I1204 22:34:16.717041 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4jgvt"] Dec 04 22:34:16.718794 master-0 kubenswrapper[33572]: I1204 22:34:16.718757 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:16.737565 master-0 kubenswrapper[33572]: I1204 22:34:16.736210 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4jgvt"] Dec 04 22:34:16.737565 master-0 kubenswrapper[33572]: I1204 22:34:16.736758 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:16.819267 master-0 kubenswrapper[33572]: I1204 22:34:16.819091 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c742-account-create-update-22tqm"] Dec 04 22:34:16.824360 master-0 kubenswrapper[33572]: I1204 22:34:16.824306 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:16.826659 master-0 kubenswrapper[33572]: I1204 22:34:16.826601 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 22:34:16.846400 master-0 kubenswrapper[33572]: I1204 22:34:16.846229 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:16.846493 master-0 kubenswrapper[33572]: I1204 22:34:16.846428 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5gbc\" (UniqueName: \"kubernetes.io/projected/036de423-2892-492c-a9d4-d38ea619f55c-kube-api-access-l5gbc\") pod \"placement-db-create-4jgvt\" (UID: \"036de423-2892-492c-a9d4-d38ea619f55c\") " pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:16.846493 master-0 kubenswrapper[33572]: I1204 22:34:16.846466 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036de423-2892-492c-a9d4-d38ea619f55c-operator-scripts\") pod \"placement-db-create-4jgvt\" (UID: \"036de423-2892-492c-a9d4-d38ea619f55c\") " pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:16.848381 master-0 kubenswrapper[33572]: I1204 22:34:16.848333 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c742-account-create-update-22tqm"] Dec 04 22:34:16.854207 master-0 kubenswrapper[33572]: I1204 22:34:16.853979 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee-etc-swift\") pod \"swift-storage-0\" (UID: \"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee\") " pod="openstack/swift-storage-0" Dec 04 22:34:16.857361 master-0 kubenswrapper[33572]: I1204 22:34:16.857313 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 22:34:16.948327 master-0 kubenswrapper[33572]: I1204 22:34:16.948244 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t47kc\" (UniqueName: \"kubernetes.io/projected/793ae801-c2fd-4d81-82b1-b72e577668a4-kube-api-access-t47kc\") pod \"placement-c742-account-create-update-22tqm\" (UID: \"793ae801-c2fd-4d81-82b1-b72e577668a4\") " pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:16.950741 master-0 kubenswrapper[33572]: I1204 22:34:16.950700 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/793ae801-c2fd-4d81-82b1-b72e577668a4-operator-scripts\") pod \"placement-c742-account-create-update-22tqm\" (UID: \"793ae801-c2fd-4d81-82b1-b72e577668a4\") " pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:16.950873 master-0 kubenswrapper[33572]: I1204 22:34:16.950831 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5gbc\" (UniqueName: \"kubernetes.io/projected/036de423-2892-492c-a9d4-d38ea619f55c-kube-api-access-l5gbc\") pod \"placement-db-create-4jgvt\" (UID: \"036de423-2892-492c-a9d4-d38ea619f55c\") " pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:16.950873 master-0 kubenswrapper[33572]: I1204 22:34:16.950864 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036de423-2892-492c-a9d4-d38ea619f55c-operator-scripts\") pod \"placement-db-create-4jgvt\" (UID: \"036de423-2892-492c-a9d4-d38ea619f55c\") " pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:16.951639 master-0 kubenswrapper[33572]: I1204 22:34:16.951608 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036de423-2892-492c-a9d4-d38ea619f55c-operator-scripts\") pod \"placement-db-create-4jgvt\" (UID: \"036de423-2892-492c-a9d4-d38ea619f55c\") " pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:16.972486 master-0 kubenswrapper[33572]: I1204 22:34:16.972430 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5gbc\" (UniqueName: \"kubernetes.io/projected/036de423-2892-492c-a9d4-d38ea619f55c-kube-api-access-l5gbc\") pod \"placement-db-create-4jgvt\" (UID: \"036de423-2892-492c-a9d4-d38ea619f55c\") " pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:17.053381 master-0 kubenswrapper[33572]: I1204 22:34:17.053299 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/793ae801-c2fd-4d81-82b1-b72e577668a4-operator-scripts\") pod \"placement-c742-account-create-update-22tqm\" (UID: \"793ae801-c2fd-4d81-82b1-b72e577668a4\") " pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:17.053730 master-0 kubenswrapper[33572]: I1204 22:34:17.053682 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t47kc\" (UniqueName: \"kubernetes.io/projected/793ae801-c2fd-4d81-82b1-b72e577668a4-kube-api-access-t47kc\") pod \"placement-c742-account-create-update-22tqm\" (UID: \"793ae801-c2fd-4d81-82b1-b72e577668a4\") " pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:17.055383 master-0 kubenswrapper[33572]: I1204 22:34:17.055292 33572 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/793ae801-c2fd-4d81-82b1-b72e577668a4-operator-scripts\") pod \"placement-c742-account-create-update-22tqm\" (UID: \"793ae801-c2fd-4d81-82b1-b72e577668a4\") " pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:17.057654 master-0 kubenswrapper[33572]: I1204 22:34:17.057595 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:17.074981 master-0 kubenswrapper[33572]: I1204 22:34:17.074737 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t47kc\" (UniqueName: \"kubernetes.io/projected/793ae801-c2fd-4d81-82b1-b72e577668a4-kube-api-access-t47kc\") pod \"placement-c742-account-create-update-22tqm\" (UID: \"793ae801-c2fd-4d81-82b1-b72e577668a4\") " pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:17.139702 master-0 kubenswrapper[33572]: I1204 22:34:17.139624 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4n4f2"] Dec 04 22:34:17.152489 master-0 kubenswrapper[33572]: I1204 22:34:17.152422 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qbxvp"] Dec 04 22:34:17.154163 master-0 kubenswrapper[33572]: I1204 22:34:17.154116 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.156782 master-0 kubenswrapper[33572]: I1204 22:34:17.155588 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:17.157008 master-0 kubenswrapper[33572]: I1204 22:34:17.156953 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-7675d-config-data" Dec 04 22:34:17.169903 master-0 kubenswrapper[33572]: I1204 22:34:17.169822 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qbxvp"] Dec 04 22:34:17.259515 master-0 kubenswrapper[33572]: I1204 22:34:17.259450 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-config-data\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.264440 master-0 kubenswrapper[33572]: I1204 22:34:17.259534 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-combined-ca-bundle\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.264440 master-0 kubenswrapper[33572]: I1204 22:34:17.259575 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzj2\" (UniqueName: \"kubernetes.io/projected/13c75f01-7399-4114-96b7-9435a6ba089b-kube-api-access-grzj2\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.264440 master-0 kubenswrapper[33572]: I1204 22:34:17.259686 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-db-sync-config-data\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.310678 master-0 kubenswrapper[33572]: I1204 22:34:17.310626 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a06d-account-create-update-pk8fp"] Dec 04 22:34:17.351935 master-0 kubenswrapper[33572]: I1204 22:34:17.351868 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 22:34:17.361818 master-0 kubenswrapper[33572]: I1204 22:34:17.361755 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzj2\" (UniqueName: \"kubernetes.io/projected/13c75f01-7399-4114-96b7-9435a6ba089b-kube-api-access-grzj2\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.362381 master-0 kubenswrapper[33572]: I1204 22:34:17.362328 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-db-sync-config-data\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.362801 master-0 kubenswrapper[33572]: I1204 22:34:17.362766 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-config-data\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.364064 master-0 kubenswrapper[33572]: I1204 22:34:17.363489 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-combined-ca-bundle\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.372358 master-0 kubenswrapper[33572]: I1204 22:34:17.367011 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-combined-ca-bundle\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.372358 master-0 kubenswrapper[33572]: I1204 22:34:17.367268 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-config-data\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.372358 master-0 kubenswrapper[33572]: I1204 22:34:17.367307 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-db-sync-config-data\") pod \"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.382526 master-0 kubenswrapper[33572]: I1204 22:34:17.382473 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzj2\" (UniqueName: \"kubernetes.io/projected/13c75f01-7399-4114-96b7-9435a6ba089b-kube-api-access-grzj2\") pod 
\"glance-db-sync-qbxvp\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.508979 master-0 kubenswrapper[33572]: I1204 22:34:17.508898 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:17.555424 master-0 kubenswrapper[33572]: I1204 22:34:17.555318 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a06d-account-create-update-pk8fp" event={"ID":"b52dac30-a647-4928-a2c3-849bab534073","Type":"ContainerStarted","Data":"203e6d9e055cb578d4e071e7d202699c3ba107aa9608ee0c6e906b73833ec6b5"} Dec 04 22:34:17.558701 master-0 kubenswrapper[33572]: I1204 22:34:17.558654 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4n4f2" event={"ID":"a95a2cfd-787d-4d6a-94b6-38938be00535","Type":"ContainerStarted","Data":"ade358974030732d6f6c32378d336f49dc4370ee70236beddd36048f301cbb18"} Dec 04 22:34:17.558808 master-0 kubenswrapper[33572]: I1204 22:34:17.558713 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4n4f2" event={"ID":"a95a2cfd-787d-4d6a-94b6-38938be00535","Type":"ContainerStarted","Data":"a79e17a5160629d5d8c1b519375ea1e6394ea7604069d407c8afe1be71597b89"} Dec 04 22:34:17.561455 master-0 kubenswrapper[33572]: I1204 22:34:17.561408 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"3dd09176e81ee8be8e4934c61cd9f0f0fe951c9b105b1d6cb0b6effac9a7b272"} Dec 04 22:34:17.594410 master-0 kubenswrapper[33572]: I1204 22:34:17.594246 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-4n4f2" podStartSLOduration=1.594223189 podStartE2EDuration="1.594223189s" podCreationTimestamp="2025-12-04 22:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:17.589005414 +0000 UTC m=+921.316531063" watchObservedRunningTime="2025-12-04 22:34:17.594223189 +0000 UTC m=+921.321748838" Dec 04 22:34:17.666151 master-0 kubenswrapper[33572]: I1204 22:34:17.666063 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4jgvt"] Dec 04 22:34:17.676688 master-0 kubenswrapper[33572]: I1204 22:34:17.676608 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c742-account-create-update-22tqm"] Dec 04 22:34:18.024538 master-0 kubenswrapper[33572]: I1204 22:34:18.019878 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:18.178551 master-0 kubenswrapper[33572]: I1204 22:34:18.178399 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-scripts\") pod \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " Dec 04 22:34:18.178551 master-0 kubenswrapper[33572]: I1204 22:34:18.178465 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-etc-swift\") pod \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " Dec 04 22:34:18.178551 master-0 kubenswrapper[33572]: I1204 22:34:18.178541 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-dispersionconf\") pod \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " Dec 04 22:34:18.178810 master-0 kubenswrapper[33572]: I1204 22:34:18.178566 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-swiftconf\") pod \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " Dec 04 22:34:18.178810 master-0 kubenswrapper[33572]: I1204 22:34:18.178601 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-combined-ca-bundle\") pod \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " Dec 04 22:34:18.178810 master-0 kubenswrapper[33572]: I1204 22:34:18.178685 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-ring-data-devices\") pod \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " Dec 04 22:34:18.178810 master-0 kubenswrapper[33572]: I1204 22:34:18.178716 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtz9s\" (UniqueName: \"kubernetes.io/projected/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-kube-api-access-qtz9s\") pod \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\" (UID: \"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb\") " Dec 04 22:34:18.180444 master-0 kubenswrapper[33572]: I1204 22:34:18.180176 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" (UID: "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:18.180917 master-0 kubenswrapper[33572]: I1204 22:34:18.180834 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" (UID: "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:34:18.187784 master-0 kubenswrapper[33572]: I1204 22:34:18.187431 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-kube-api-access-qtz9s" (OuterVolumeSpecName: "kube-api-access-qtz9s") pod "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" (UID: "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb"). InnerVolumeSpecName "kube-api-access-qtz9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:18.190936 master-0 kubenswrapper[33572]: I1204 22:34:18.190848 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" (UID: "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:18.200832 master-0 kubenswrapper[33572]: I1204 22:34:18.200773 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qbxvp"] Dec 04 22:34:18.212916 master-0 kubenswrapper[33572]: I1204 22:34:18.212590 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-scripts" (OuterVolumeSpecName: "scripts") pod "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" (UID: "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:18.213844 master-0 kubenswrapper[33572]: I1204 22:34:18.213777 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" (UID: "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:18.217117 master-0 kubenswrapper[33572]: W1204 22:34:18.217072 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13c75f01_7399_4114_96b7_9435a6ba089b.slice/crio-023868215ee1c48075a12fc29c17dd2c0751522b9c071b6d823ef01813330a2f WatchSource:0}: Error finding container 023868215ee1c48075a12fc29c17dd2c0751522b9c071b6d823ef01813330a2f: Status 404 returned error can't find the container with id 023868215ee1c48075a12fc29c17dd2c0751522b9c071b6d823ef01813330a2f Dec 04 22:34:18.230852 master-0 kubenswrapper[33572]: I1204 22:34:18.230787 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" (UID: "6914cf0c-7fd5-47d0-8a01-fc58a6c83abb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:18.281926 master-0 kubenswrapper[33572]: I1204 22:34:18.281857 33572 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:18.281926 master-0 kubenswrapper[33572]: I1204 22:34:18.281918 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtz9s\" (UniqueName: \"kubernetes.io/projected/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-kube-api-access-qtz9s\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:18.282478 master-0 kubenswrapper[33572]: I1204 22:34:18.281939 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:18.282478 master-0 kubenswrapper[33572]: I1204 22:34:18.281950 33572 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-etc-swift\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:18.282478 master-0 kubenswrapper[33572]: I1204 22:34:18.281960 33572 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-dispersionconf\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:18.282478 master-0 kubenswrapper[33572]: I1204 22:34:18.281971 33572 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-swiftconf\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:18.282478 master-0 kubenswrapper[33572]: I1204 22:34:18.281981 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6914cf0c-7fd5-47d0-8a01-fc58a6c83abb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:18.609545 master-0 kubenswrapper[33572]: I1204 22:34:18.609283 33572 generic.go:334] "Generic (PLEG): container finished" podID="a95a2cfd-787d-4d6a-94b6-38938be00535" containerID="ade358974030732d6f6c32378d336f49dc4370ee70236beddd36048f301cbb18" exitCode=0 Dec 04 22:34:18.609545 master-0 kubenswrapper[33572]: I1204 22:34:18.609340 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4n4f2" event={"ID":"a95a2cfd-787d-4d6a-94b6-38938be00535","Type":"ContainerDied","Data":"ade358974030732d6f6c32378d336f49dc4370ee70236beddd36048f301cbb18"} Dec 04 22:34:18.611574 master-0 kubenswrapper[33572]: I1204 22:34:18.611531 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qbxvp" event={"ID":"13c75f01-7399-4114-96b7-9435a6ba089b","Type":"ContainerStarted","Data":"023868215ee1c48075a12fc29c17dd2c0751522b9c071b6d823ef01813330a2f"} Dec 04 22:34:18.613227 master-0 kubenswrapper[33572]: I1204 22:34:18.612799 33572 generic.go:334] "Generic (PLEG): container finished" podID="793ae801-c2fd-4d81-82b1-b72e577668a4" containerID="6e66a7b6018782d15f4a77164d01c739390756258f36da8cfc42f73c273c24bf" exitCode=0 Dec 04 22:34:18.613227 master-0 kubenswrapper[33572]: I1204 22:34:18.612885 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c742-account-create-update-22tqm" 
event={"ID":"793ae801-c2fd-4d81-82b1-b72e577668a4","Type":"ContainerDied","Data":"6e66a7b6018782d15f4a77164d01c739390756258f36da8cfc42f73c273c24bf"} Dec 04 22:34:18.613227 master-0 kubenswrapper[33572]: I1204 22:34:18.612917 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c742-account-create-update-22tqm" event={"ID":"793ae801-c2fd-4d81-82b1-b72e577668a4","Type":"ContainerStarted","Data":"57ded11c278f6caa079597022b70a4edadc8bf56a21248475e941303c27cbf29"} Dec 04 22:34:18.621256 master-0 kubenswrapper[33572]: I1204 22:34:18.620833 33572 generic.go:334] "Generic (PLEG): container finished" podID="036de423-2892-492c-a9d4-d38ea619f55c" containerID="42139ff7bbf274e84e342d108accf059c395a55f884268c0d278fd9ac303b7e9" exitCode=0 Dec 04 22:34:18.621256 master-0 kubenswrapper[33572]: I1204 22:34:18.620920 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4jgvt" event={"ID":"036de423-2892-492c-a9d4-d38ea619f55c","Type":"ContainerDied","Data":"42139ff7bbf274e84e342d108accf059c395a55f884268c0d278fd9ac303b7e9"} Dec 04 22:34:18.621256 master-0 kubenswrapper[33572]: I1204 22:34:18.620959 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4jgvt" event={"ID":"036de423-2892-492c-a9d4-d38ea619f55c","Type":"ContainerStarted","Data":"be8879f70e453987208d2c5da549ea6e44b7ea0d307e5e1d60905167b557853a"} Dec 04 22:34:18.622686 master-0 kubenswrapper[33572]: I1204 22:34:18.622629 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2vjbq" event={"ID":"6914cf0c-7fd5-47d0-8a01-fc58a6c83abb","Type":"ContainerDied","Data":"02ecbd2b0fc96c3b0369559f435470eed3f42bb8d3765827c131939b9e33a9ca"} Dec 04 22:34:18.622686 master-0 kubenswrapper[33572]: I1204 22:34:18.622661 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ecbd2b0fc96c3b0369559f435470eed3f42bb8d3765827c131939b9e33a9ca" Dec 04 22:34:18.622960 master-0 kubenswrapper[33572]: I1204 22:34:18.622706 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2vjbq" Dec 04 22:34:18.626018 master-0 kubenswrapper[33572]: I1204 22:34:18.625970 33572 generic.go:334] "Generic (PLEG): container finished" podID="b52dac30-a647-4928-a2c3-849bab534073" containerID="d76d7f47dc8ce011a7120baf31993be5c69c3cdf49281070022020d1a3fd0584" exitCode=0 Dec 04 22:34:18.626018 master-0 kubenswrapper[33572]: I1204 22:34:18.626005 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a06d-account-create-update-pk8fp" event={"ID":"b52dac30-a647-4928-a2c3-849bab534073","Type":"ContainerDied","Data":"d76d7f47dc8ce011a7120baf31993be5c69c3cdf49281070022020d1a3fd0584"} Dec 04 22:34:18.722190 master-0 kubenswrapper[33572]: E1204 22:34:18.721151 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6914cf0c_7fd5_47d0_8a01_fc58a6c83abb.slice\": RecentStats: unable to find data in memory cache]" Dec 04 22:34:19.162844 master-0 kubenswrapper[33572]: I1204 22:34:19.162691 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 04 22:34:19.642462 master-0 kubenswrapper[33572]: I1204 22:34:19.642400 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"9c3632194a9cad35d559faa6769fee3c83a42f73e5a502314f504952c6f215a7"} Dec 04 22:34:19.642890 master-0 kubenswrapper[33572]: I1204 22:34:19.642478 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"60d5af23c42dee45bbd06e5408514004e84a751a5aa286c142d51f3b19fc19af"} Dec 04 22:34:19.642890 master-0 kubenswrapper[33572]: I1204 22:34:19.642526 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"5e8aa63c9b318152c6257d0b043c7c4cd94bbef9adbaf633dcd9003153a75c8a"} Dec 04 22:34:19.642890 master-0 kubenswrapper[33572]: I1204 22:34:19.642548 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"8873504d97f7f898e1d2ae625bfcfaa709a4fe2728a73f144869c8144a763d0e"} Dec 04 22:34:20.958130 master-0 kubenswrapper[33572]: I1204 22:34:20.918774 33572 generic.go:334] "Generic (PLEG): container finished" podID="1d1338e8-405c-4439-ae7d-02034960a5c5" containerID="4115e7db102073368d69ec1f7682282b2a553bc4eba90cce35c1dc2cb12172a7" exitCode=0 Dec 04 22:34:20.984798 master-0 kubenswrapper[33572]: I1204 22:34:20.980039 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d1338e8-405c-4439-ae7d-02034960a5c5","Type":"ContainerDied","Data":"4115e7db102073368d69ec1f7682282b2a553bc4eba90cce35c1dc2cb12172a7"} Dec 04 22:34:21.359373 master-0 kubenswrapper[33572]: I1204 22:34:21.359336 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:21.365132 master-0 kubenswrapper[33572]: I1204 22:34:21.365075 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:21.373280 master-0 kubenswrapper[33572]: I1204 22:34:21.373195 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:21.474231 master-0 kubenswrapper[33572]: I1204 22:34:21.474164 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzzb6\" (UniqueName: \"kubernetes.io/projected/a95a2cfd-787d-4d6a-94b6-38938be00535-kube-api-access-zzzb6\") pod \"a95a2cfd-787d-4d6a-94b6-38938be00535\" (UID: \"a95a2cfd-787d-4d6a-94b6-38938be00535\") " Dec 04 22:34:21.474423 master-0 kubenswrapper[33572]: I1204 22:34:21.474325 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95a2cfd-787d-4d6a-94b6-38938be00535-operator-scripts\") pod \"a95a2cfd-787d-4d6a-94b6-38938be00535\" (UID: \"a95a2cfd-787d-4d6a-94b6-38938be00535\") " Dec 04 22:34:21.474473 master-0 kubenswrapper[33572]: I1204 22:34:21.474428 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b52dac30-a647-4928-a2c3-849bab534073-operator-scripts\") pod \"b52dac30-a647-4928-a2c3-849bab534073\" (UID: \"b52dac30-a647-4928-a2c3-849bab534073\") " Dec 04 22:34:21.474604 master-0 kubenswrapper[33572]: I1204 22:34:21.474585 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5gbc\" (UniqueName: \"kubernetes.io/projected/036de423-2892-492c-a9d4-d38ea619f55c-kube-api-access-l5gbc\") pod \"036de423-2892-492c-a9d4-d38ea619f55c\" (UID: \"036de423-2892-492c-a9d4-d38ea619f55c\") " Dec 04 22:34:21.474662 master-0 kubenswrapper[33572]: I1204 22:34:21.474632 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036de423-2892-492c-a9d4-d38ea619f55c-operator-scripts\") pod \"036de423-2892-492c-a9d4-d38ea619f55c\" (UID: \"036de423-2892-492c-a9d4-d38ea619f55c\") " Dec 04 22:34:21.474701 master-0 kubenswrapper[33572]: I1204 22:34:21.474668 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4fg4\" (UniqueName: \"kubernetes.io/projected/b52dac30-a647-4928-a2c3-849bab534073-kube-api-access-d4fg4\") pod \"b52dac30-a647-4928-a2c3-849bab534073\" (UID: \"b52dac30-a647-4928-a2c3-849bab534073\") " Dec 04 22:34:21.474872 master-0 kubenswrapper[33572]: I1204 22:34:21.474830 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95a2cfd-787d-4d6a-94b6-38938be00535-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a95a2cfd-787d-4d6a-94b6-38938be00535" (UID: "a95a2cfd-787d-4d6a-94b6-38938be00535"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:21.475200 master-0 kubenswrapper[33572]: I1204 22:34:21.475166 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95a2cfd-787d-4d6a-94b6-38938be00535-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:21.475247 master-0 kubenswrapper[33572]: I1204 22:34:21.475167 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b52dac30-a647-4928-a2c3-849bab534073-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b52dac30-a647-4928-a2c3-849bab534073" (UID: "b52dac30-a647-4928-a2c3-849bab534073"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:21.475247 master-0 kubenswrapper[33572]: I1204 22:34:21.475229 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/036de423-2892-492c-a9d4-d38ea619f55c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "036de423-2892-492c-a9d4-d38ea619f55c" (UID: "036de423-2892-492c-a9d4-d38ea619f55c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:21.477966 master-0 kubenswrapper[33572]: I1204 22:34:21.477924 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036de423-2892-492c-a9d4-d38ea619f55c-kube-api-access-l5gbc" (OuterVolumeSpecName: "kube-api-access-l5gbc") pod "036de423-2892-492c-a9d4-d38ea619f55c" (UID: "036de423-2892-492c-a9d4-d38ea619f55c"). InnerVolumeSpecName "kube-api-access-l5gbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:21.478283 master-0 kubenswrapper[33572]: I1204 22:34:21.478232 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b52dac30-a647-4928-a2c3-849bab534073-kube-api-access-d4fg4" (OuterVolumeSpecName: "kube-api-access-d4fg4") pod "b52dac30-a647-4928-a2c3-849bab534073" (UID: "b52dac30-a647-4928-a2c3-849bab534073"). InnerVolumeSpecName "kube-api-access-d4fg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:21.479677 master-0 kubenswrapper[33572]: I1204 22:34:21.479648 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a95a2cfd-787d-4d6a-94b6-38938be00535-kube-api-access-zzzb6" (OuterVolumeSpecName: "kube-api-access-zzzb6") pod "a95a2cfd-787d-4d6a-94b6-38938be00535" (UID: "a95a2cfd-787d-4d6a-94b6-38938be00535"). InnerVolumeSpecName "kube-api-access-zzzb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:21.576946 master-0 kubenswrapper[33572]: I1204 22:34:21.576871 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b52dac30-a647-4928-a2c3-849bab534073-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:21.576946 master-0 kubenswrapper[33572]: I1204 22:34:21.576929 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5gbc\" (UniqueName: \"kubernetes.io/projected/036de423-2892-492c-a9d4-d38ea619f55c-kube-api-access-l5gbc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:21.576946 master-0 kubenswrapper[33572]: I1204 22:34:21.576946 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/036de423-2892-492c-a9d4-d38ea619f55c-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:21.576946 master-0 kubenswrapper[33572]: I1204 22:34:21.576957 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4fg4\" (UniqueName: \"kubernetes.io/projected/b52dac30-a647-4928-a2c3-849bab534073-kube-api-access-d4fg4\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:21.577379 master-0 kubenswrapper[33572]: I1204 22:34:21.576968 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzzb6\" (UniqueName: \"kubernetes.io/projected/a95a2cfd-787d-4d6a-94b6-38938be00535-kube-api-access-zzzb6\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:21.598888 master-0 kubenswrapper[33572]: I1204 22:34:21.598856 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:21.678757 master-0 kubenswrapper[33572]: I1204 22:34:21.678706 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/793ae801-c2fd-4d81-82b1-b72e577668a4-operator-scripts\") pod \"793ae801-c2fd-4d81-82b1-b72e577668a4\" (UID: \"793ae801-c2fd-4d81-82b1-b72e577668a4\") " Dec 04 22:34:21.679541 master-0 kubenswrapper[33572]: I1204 22:34:21.679432 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/793ae801-c2fd-4d81-82b1-b72e577668a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "793ae801-c2fd-4d81-82b1-b72e577668a4" (UID: "793ae801-c2fd-4d81-82b1-b72e577668a4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:21.679675 master-0 kubenswrapper[33572]: I1204 22:34:21.679638 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t47kc\" (UniqueName: \"kubernetes.io/projected/793ae801-c2fd-4d81-82b1-b72e577668a4-kube-api-access-t47kc\") pod \"793ae801-c2fd-4d81-82b1-b72e577668a4\" (UID: \"793ae801-c2fd-4d81-82b1-b72e577668a4\") " Dec 04 22:34:21.680872 master-0 kubenswrapper[33572]: I1204 22:34:21.680837 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/793ae801-c2fd-4d81-82b1-b72e577668a4-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:21.689897 master-0 kubenswrapper[33572]: I1204 22:34:21.689824 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/793ae801-c2fd-4d81-82b1-b72e577668a4-kube-api-access-t47kc" (OuterVolumeSpecName: "kube-api-access-t47kc") pod "793ae801-c2fd-4d81-82b1-b72e577668a4" (UID: "793ae801-c2fd-4d81-82b1-b72e577668a4"). InnerVolumeSpecName "kube-api-access-t47kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:21.783017 master-0 kubenswrapper[33572]: I1204 22:34:21.782962 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t47kc\" (UniqueName: \"kubernetes.io/projected/793ae801-c2fd-4d81-82b1-b72e577668a4-kube-api-access-t47kc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:21.936099 master-0 kubenswrapper[33572]: I1204 22:34:21.936062 33572 generic.go:334] "Generic (PLEG): container finished" podID="077ecfc2-ed81-4de5-993c-0f1084df9734" containerID="a3f0c1d136cea32b8f4e971882ddc52d40a18227ee3ad022f0524e308fee4383" exitCode=0 Dec 04 22:34:21.936201 master-0 kubenswrapper[33572]: I1204 22:34:21.936126 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"077ecfc2-ed81-4de5-993c-0f1084df9734","Type":"ContainerDied","Data":"a3f0c1d136cea32b8f4e971882ddc52d40a18227ee3ad022f0524e308fee4383"} Dec 04 22:34:21.941533 master-0 kubenswrapper[33572]: I1204 22:34:21.941301 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a06d-account-create-update-pk8fp" event={"ID":"b52dac30-a647-4928-a2c3-849bab534073","Type":"ContainerDied","Data":"203e6d9e055cb578d4e071e7d202699c3ba107aa9608ee0c6e906b73833ec6b5"} Dec 04 22:34:21.941533 master-0 kubenswrapper[33572]: I1204 22:34:21.941351 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="203e6d9e055cb578d4e071e7d202699c3ba107aa9608ee0c6e906b73833ec6b5" Dec 04 22:34:21.941533 master-0 kubenswrapper[33572]: I1204 22:34:21.941420 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a06d-account-create-update-pk8fp" Dec 04 22:34:21.945116 master-0 kubenswrapper[33572]: I1204 22:34:21.945058 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4n4f2" event={"ID":"a95a2cfd-787d-4d6a-94b6-38938be00535","Type":"ContainerDied","Data":"a79e17a5160629d5d8c1b519375ea1e6394ea7604069d407c8afe1be71597b89"} Dec 04 22:34:21.945116 master-0 kubenswrapper[33572]: I1204 22:34:21.945112 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a79e17a5160629d5d8c1b519375ea1e6394ea7604069d407c8afe1be71597b89" Dec 04 22:34:21.945270 master-0 kubenswrapper[33572]: I1204 22:34:21.945170 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4n4f2" Dec 04 22:34:21.952182 master-0 kubenswrapper[33572]: I1204 22:34:21.952135 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1d1338e8-405c-4439-ae7d-02034960a5c5","Type":"ContainerStarted","Data":"1fb5be21226d245c7c994a8fee42c7f2fe7c792c270bf48b1ba69d2dd8f94d62"} Dec 04 22:34:21.953094 master-0 kubenswrapper[33572]: I1204 22:34:21.953056 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:34:21.955030 master-0 kubenswrapper[33572]: I1204 22:34:21.955008 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c742-account-create-update-22tqm" Dec 04 22:34:21.955147 master-0 kubenswrapper[33572]: I1204 22:34:21.955013 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c742-account-create-update-22tqm" event={"ID":"793ae801-c2fd-4d81-82b1-b72e577668a4","Type":"ContainerDied","Data":"57ded11c278f6caa079597022b70a4edadc8bf56a21248475e941303c27cbf29"} Dec 04 22:34:21.955199 master-0 kubenswrapper[33572]: I1204 22:34:21.955158 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57ded11c278f6caa079597022b70a4edadc8bf56a21248475e941303c27cbf29" Dec 04 22:34:21.959717 master-0 kubenswrapper[33572]: I1204 22:34:21.959691 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"0a831eee4067eca3ffc833fbd0b4695864d7d66e0e9a889c5afaf6c40d407b71"} Dec 04 22:34:21.962471 master-0 kubenswrapper[33572]: I1204 22:34:21.962441 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4jgvt" event={"ID":"036de423-2892-492c-a9d4-d38ea619f55c","Type":"ContainerDied","Data":"be8879f70e453987208d2c5da549ea6e44b7ea0d307e5e1d60905167b557853a"} Dec 04 22:34:21.962544 master-0 kubenswrapper[33572]: I1204 22:34:21.962475 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8879f70e453987208d2c5da549ea6e44b7ea0d307e5e1d60905167b557853a" Dec 04 22:34:21.962577 master-0 kubenswrapper[33572]: I1204 22:34:21.962552 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4jgvt" Dec 04 22:34:22.006681 master-0 kubenswrapper[33572]: I1204 22:34:22.006475 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.876559861 podStartE2EDuration="1m0.006449065s" podCreationTimestamp="2025-12-04 22:33:22 +0000 UTC" firstStartedPulling="2025-12-04 22:33:42.634474788 +0000 UTC m=+886.362000437" lastFinishedPulling="2025-12-04 22:33:46.764363992 +0000 UTC m=+890.491889641" observedRunningTime="2025-12-04 22:34:21.99691814 +0000 UTC m=+925.724443819" watchObservedRunningTime="2025-12-04 22:34:22.006449065 +0000 UTC m=+925.733974744" Dec 04 22:34:22.978552 master-0 kubenswrapper[33572]: I1204 22:34:22.978415 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"b7c186391385e9f63d8f2eb2fd10217e04452e6e9c27d6c59092195bb21bf3ca"} Dec 04 22:34:22.980575 master-0 kubenswrapper[33572]: I1204 22:34:22.980445 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"077ecfc2-ed81-4de5-993c-0f1084df9734","Type":"ContainerStarted","Data":"427ccf412157be4eb332b5af6a4635ab0a3cd35e6fb47746b9f7bdb339501443"} Dec 04 22:34:23.384071 master-0 kubenswrapper[33572]: I1204 22:34:23.383986 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=58.277501343 podStartE2EDuration="1m1.383970967s" podCreationTimestamp="2025-12-04 22:33:22 +0000 UTC" firstStartedPulling="2025-12-04 22:33:43.669304025 +0000 UTC m=+887.396829674" lastFinishedPulling="2025-12-04 22:33:46.775773649 +0000 UTC m=+890.503299298" observedRunningTime="2025-12-04 22:34:23.360297281 +0000 UTC m=+927.087822930" watchObservedRunningTime="2025-12-04 22:34:23.383970967 +0000 UTC m=+927.111496616" Dec 04 22:34:23.999784 master-0 kubenswrapper[33572]: I1204 22:34:23.999650 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"4c874300683385f3a4997d9f0abc9151837e769df65d809302cfe27e4f3e1a9d"} Dec 04 22:34:23.999784 master-0 kubenswrapper[33572]: I1204 22:34:23.999721 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"c20decc7d43f6985a022ad48b9f3a4e2fc63e7e8e6fc58ff26cbdd3b0fb54c80"} Dec 04 22:34:24.983246 master-0 kubenswrapper[33572]: I1204 22:34:24.983132 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rk7z7" podUID="b7512b9d-7adc-4af3-a29f-b154799338cb" containerName="ovn-controller" probeResult="failure" output=< Dec 04 22:34:24.983246 master-0 kubenswrapper[33572]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 22:34:24.983246 master-0 kubenswrapper[33572]: > Dec 04 22:34:25.080099 master-0 kubenswrapper[33572]: I1204 22:34:25.080038 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:34:25.084916 master-0 kubenswrapper[33572]: I1204 22:34:25.084862 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mk92t" Dec 04 22:34:25.337760 master-0 kubenswrapper[33572]: I1204 22:34:25.337692 33572 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rk7z7-config-pmf49"] Dec 04 22:34:25.338122 master-0 kubenswrapper[33572]: E1204 22:34:25.338095 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793ae801-c2fd-4d81-82b1-b72e577668a4" containerName="mariadb-account-create-update" Dec 04 22:34:25.338122 master-0 kubenswrapper[33572]: I1204 22:34:25.338114 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="793ae801-c2fd-4d81-82b1-b72e577668a4" containerName="mariadb-account-create-update" Dec 04 22:34:25.338305 master-0 kubenswrapper[33572]: E1204 22:34:25.338132 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b52dac30-a647-4928-a2c3-849bab534073" containerName="mariadb-account-create-update" Dec 04 22:34:25.338305 master-0 kubenswrapper[33572]: I1204 22:34:25.338139 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b52dac30-a647-4928-a2c3-849bab534073" containerName="mariadb-account-create-update" Dec 04 22:34:25.338305 master-0 kubenswrapper[33572]: E1204 22:34:25.338169 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95a2cfd-787d-4d6a-94b6-38938be00535" containerName="mariadb-database-create" Dec 04 22:34:25.338305 master-0 kubenswrapper[33572]: I1204 22:34:25.338176 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95a2cfd-787d-4d6a-94b6-38938be00535" containerName="mariadb-database-create" Dec 04 22:34:25.338305 master-0 kubenswrapper[33572]: E1204 22:34:25.338190 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036de423-2892-492c-a9d4-d38ea619f55c" containerName="mariadb-database-create" Dec 04 22:34:25.338305 master-0 kubenswrapper[33572]: I1204 22:34:25.338196 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="036de423-2892-492c-a9d4-d38ea619f55c" containerName="mariadb-database-create" Dec 04 22:34:25.338305 master-0 kubenswrapper[33572]: E1204 22:34:25.338206 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" containerName="swift-ring-rebalance" Dec 04 22:34:25.338305 master-0 kubenswrapper[33572]: I1204 22:34:25.338213 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" containerName="swift-ring-rebalance" Dec 04 22:34:25.339128 master-0 kubenswrapper[33572]: I1204 22:34:25.338598 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="793ae801-c2fd-4d81-82b1-b72e577668a4" containerName="mariadb-account-create-update" Dec 04 22:34:25.339128 master-0 kubenswrapper[33572]: I1204 22:34:25.338620 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="036de423-2892-492c-a9d4-d38ea619f55c" containerName="mariadb-database-create" Dec 04 22:34:25.339128 master-0 kubenswrapper[33572]: I1204 22:34:25.338653 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="6914cf0c-7fd5-47d0-8a01-fc58a6c83abb" containerName="swift-ring-rebalance" Dec 04 22:34:25.339128 master-0 kubenswrapper[33572]: I1204 22:34:25.338665 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="a95a2cfd-787d-4d6a-94b6-38938be00535" containerName="mariadb-database-create" Dec 04 22:34:25.339128 master-0 kubenswrapper[33572]: I1204 22:34:25.338675 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b52dac30-a647-4928-a2c3-849bab534073" containerName="mariadb-account-create-update" Dec 04 22:34:25.339699 master-0 kubenswrapper[33572]: I1204 22:34:25.339307 33572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.342429 master-0 kubenswrapper[33572]: I1204 22:34:25.341646 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 22:34:25.376914 master-0 kubenswrapper[33572]: I1204 22:34:25.376854 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.376914 master-0 kubenswrapper[33572]: I1204 22:34:25.376934 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run-ovn\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.377273 master-0 kubenswrapper[33572]: I1204 22:34:25.376955 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-log-ovn\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.377273 master-0 kubenswrapper[33572]: I1204 22:34:25.377040 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpn7q\" (UniqueName: \"kubernetes.io/projected/f54c116e-f8d0-42e0-8481-5d4ff73af74a-kube-api-access-rpn7q\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.377273 master-0 kubenswrapper[33572]: I1204 22:34:25.377093 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-scripts\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.377273 master-0 kubenswrapper[33572]: I1204 22:34:25.377146 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-additional-scripts\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.377273 master-0 kubenswrapper[33572]: I1204 22:34:25.377248 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rk7z7-config-pmf49"] Dec 04 22:34:25.478477 master-0 kubenswrapper[33572]: I1204 22:34:25.478390 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run-ovn\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.478477 master-0 kubenswrapper[33572]: I1204 22:34:25.478482 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-log-ovn\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.478931 master-0 kubenswrapper[33572]: I1204 22:34:25.478596 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpn7q\" (UniqueName: \"kubernetes.io/projected/f54c116e-f8d0-42e0-8481-5d4ff73af74a-kube-api-access-rpn7q\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.478931 master-0 kubenswrapper[33572]: I1204 22:34:25.478656 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-scripts\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.478931 master-0 kubenswrapper[33572]: I1204 22:34:25.478711 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-additional-scripts\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.478931 master-0 kubenswrapper[33572]: I1204 22:34:25.478741 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run-ovn\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.478931 master-0 kubenswrapper[33572]: I1204 22:34:25.478769 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.478931 master-0 kubenswrapper[33572]: I1204 22:34:25.478868 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.480404 master-0 kubenswrapper[33572]: I1204 22:34:25.480353 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-log-ovn\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.484636 master-0 kubenswrapper[33572]: I1204 22:34:25.484255 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-scripts\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 
04 22:34:25.484931 master-0 kubenswrapper[33572]: I1204 22:34:25.484885 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-additional-scripts\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.497961 master-0 kubenswrapper[33572]: I1204 22:34:25.497925 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpn7q\" (UniqueName: \"kubernetes.io/projected/f54c116e-f8d0-42e0-8481-5d4ff73af74a-kube-api-access-rpn7q\") pod \"ovn-controller-rk7z7-config-pmf49\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:25.656421 master-0 kubenswrapper[33572]: I1204 22:34:25.656297 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:29.751834 master-0 kubenswrapper[33572]: I1204 22:34:29.751769 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 22:34:29.985136 master-0 kubenswrapper[33572]: I1204 22:34:29.985059 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rk7z7" podUID="b7512b9d-7adc-4af3-a29f-b154799338cb" containerName="ovn-controller" probeResult="failure" output=< Dec 04 22:34:29.985136 master-0 kubenswrapper[33572]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 22:34:29.985136 master-0 kubenswrapper[33572]: > Dec 04 22:34:32.049672 master-0 kubenswrapper[33572]: I1204 22:34:32.049632 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rk7z7-config-pmf49"] Dec 04 22:34:32.108407 master-0 kubenswrapper[33572]: I1204 22:34:32.108348 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7-config-pmf49" event={"ID":"f54c116e-f8d0-42e0-8481-5d4ff73af74a","Type":"ContainerStarted","Data":"02c0f455693b6f16b3b4fa2ea231d9807e2b1ef202cf93bc1a81fc0f15b131fd"} Dec 04 22:34:32.113712 master-0 kubenswrapper[33572]: I1204 22:34:32.113637 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"d03ca2bfbde8b5ca402f9f1c5c5e1407f1efc1b6d908379db672712360a3345c"} Dec 04 22:34:32.113712 master-0 kubenswrapper[33572]: I1204 22:34:32.113680 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"7aab892a1310a22b89148366903d0dce7f63d986d318b79e2e7ab1bada704e5c"} Dec 04 22:34:33.127333 master-0 kubenswrapper[33572]: I1204 22:34:33.127277 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qbxvp" event={"ID":"13c75f01-7399-4114-96b7-9435a6ba089b","Type":"ContainerStarted","Data":"0d636103e4ae963e2e654a6eae6dddd7b9985787031941a9ef697cd6da609156"} Dec 04 22:34:33.137265 master-0 kubenswrapper[33572]: I1204 22:34:33.137193 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"d9f9044a14c70545ed71942ec707fb0b024a2d74abc944963bb55052a64b8465"} Dec 04 22:34:33.137265 master-0 kubenswrapper[33572]: I1204 22:34:33.137267 
33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"65ef2ee92eb464dc37bf272aaeab5558cb0e0712b5920b29babb53467be53529"} Dec 04 22:34:33.137576 master-0 kubenswrapper[33572]: I1204 22:34:33.137288 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"af29575f3afe60dcd198fd029126e16d4c393093c5e34b4f0eeb8ae2a1c2daeb"} Dec 04 22:34:33.137576 master-0 kubenswrapper[33572]: I1204 22:34:33.137306 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"e8f221c5b880a890285b395f9c896277a2e2c424148ba85161c1f166526883b3"} Dec 04 22:34:33.137576 master-0 kubenswrapper[33572]: I1204 22:34:33.137324 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d2b1ee7c-8aa9-4e49-a875-7ba2ffb4b5ee","Type":"ContainerStarted","Data":"c6309d1bb6263c58072fda084cf1be09b3fd53a66a81f9cb7ca711ea716df8fa"} Dec 04 22:34:33.142202 master-0 kubenswrapper[33572]: I1204 22:34:33.142145 33572 generic.go:334] "Generic (PLEG): container finished" podID="f54c116e-f8d0-42e0-8481-5d4ff73af74a" containerID="83ac0904d547478fbfb25940f7fe81c541748b3e02f60b2cc361e5ac58be87c9" exitCode=0 Dec 04 22:34:33.142311 master-0 kubenswrapper[33572]: I1204 22:34:33.142205 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7-config-pmf49" event={"ID":"f54c116e-f8d0-42e0-8481-5d4ff73af74a","Type":"ContainerDied","Data":"83ac0904d547478fbfb25940f7fe81c541748b3e02f60b2cc361e5ac58be87c9"} Dec 04 22:34:33.175633 master-0 kubenswrapper[33572]: I1204 22:34:33.175424 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qbxvp" podStartSLOduration=2.762898619 podStartE2EDuration="16.175395364s" podCreationTimestamp="2025-12-04 22:34:17 +0000 UTC" firstStartedPulling="2025-12-04 22:34:18.223965278 +0000 UTC m=+921.951490927" lastFinishedPulling="2025-12-04 22:34:31.636462023 +0000 UTC m=+935.363987672" observedRunningTime="2025-12-04 22:34:33.163642477 +0000 UTC m=+936.891168166" watchObservedRunningTime="2025-12-04 22:34:33.175395364 +0000 UTC m=+936.902921053" Dec 04 22:34:33.240652 master-0 kubenswrapper[33572]: I1204 22:34:33.240540 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.992307353 podStartE2EDuration="35.24048595s" podCreationTimestamp="2025-12-04 22:33:58 +0000 UTC" firstStartedPulling="2025-12-04 22:34:17.362681296 +0000 UTC m=+921.090206945" lastFinishedPulling="2025-12-04 22:34:31.610859893 +0000 UTC m=+935.338385542" observedRunningTime="2025-12-04 22:34:33.220202037 +0000 UTC m=+936.947727736" watchObservedRunningTime="2025-12-04 22:34:33.24048595 +0000 UTC m=+936.968011639" Dec 04 22:34:33.649668 master-0 kubenswrapper[33572]: I1204 22:34:33.649600 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c6f8c9899-t2px9"] Dec 04 22:34:33.651385 master-0 kubenswrapper[33572]: I1204 22:34:33.651352 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.654206 master-0 kubenswrapper[33572]: I1204 22:34:33.654151 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 04 22:34:33.669469 master-0 kubenswrapper[33572]: I1204 22:34:33.669398 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c6f8c9899-t2px9"] Dec 04 22:34:33.775035 master-0 kubenswrapper[33572]: I1204 22:34:33.774958 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-svc\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.775252 master-0 kubenswrapper[33572]: I1204 22:34:33.775045 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.775252 master-0 kubenswrapper[33572]: I1204 22:34:33.775083 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-config\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.775252 master-0 kubenswrapper[33572]: I1204 22:34:33.775135 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.775252 master-0 kubenswrapper[33572]: I1204 22:34:33.775173 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wn6z\" (UniqueName: \"kubernetes.io/projected/9d6a1d9e-6849-435b-aee9-caf23bce31bb-kube-api-access-7wn6z\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.775252 master-0 kubenswrapper[33572]: I1204 22:34:33.775189 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.877225 master-0 kubenswrapper[33572]: I1204 22:34:33.877156 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.877531 master-0 kubenswrapper[33572]: I1204 22:34:33.877437 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wn6z\" (UniqueName: 
\"kubernetes.io/projected/9d6a1d9e-6849-435b-aee9-caf23bce31bb-kube-api-access-7wn6z\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.877608 master-0 kubenswrapper[33572]: I1204 22:34:33.877556 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.877869 master-0 kubenswrapper[33572]: I1204 22:34:33.877843 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-svc\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.878077 master-0 kubenswrapper[33572]: I1204 22:34:33.878037 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.878205 master-0 kubenswrapper[33572]: I1204 22:34:33.878177 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-config\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.878430 master-0 kubenswrapper[33572]: I1204 22:34:33.878387 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.878736 master-0 kubenswrapper[33572]: I1204 22:34:33.878703 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-svc\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.878827 master-0 kubenswrapper[33572]: I1204 22:34:33.878790 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.879174 master-0 kubenswrapper[33572]: I1204 22:34:33.879136 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.879260 master-0 kubenswrapper[33572]: I1204 22:34:33.879232 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-config\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:33.891988 master-0 kubenswrapper[33572]: I1204 22:34:33.891951 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wn6z\" (UniqueName: \"kubernetes.io/projected/9d6a1d9e-6849-435b-aee9-caf23bce31bb-kube-api-access-7wn6z\") pod \"dnsmasq-dns-5c6f8c9899-t2px9\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:34.023782 master-0 kubenswrapper[33572]: I1204 22:34:34.023724 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:34.489791 master-0 kubenswrapper[33572]: I1204 22:34:34.489694 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c6f8c9899-t2px9"] Dec 04 22:34:34.510260 master-0 kubenswrapper[33572]: W1204 22:34:34.510190 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d6a1d9e_6849_435b_aee9_caf23bce31bb.slice/crio-ff6d4e507f9376dfc3c933e8ab1fec8384f97096a9a39993d72629b0acc30c05 WatchSource:0}: Error finding container ff6d4e507f9376dfc3c933e8ab1fec8384f97096a9a39993d72629b0acc30c05: Status 404 returned error can't find the container with id ff6d4e507f9376dfc3c933e8ab1fec8384f97096a9a39993d72629b0acc30c05 Dec 04 22:34:34.601581 master-0 kubenswrapper[33572]: I1204 22:34:34.601537 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:34.698218 master-0 kubenswrapper[33572]: I1204 22:34:34.698171 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpn7q\" (UniqueName: \"kubernetes.io/projected/f54c116e-f8d0-42e0-8481-5d4ff73af74a-kube-api-access-rpn7q\") pod \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " Dec 04 22:34:34.698320 master-0 kubenswrapper[33572]: I1204 22:34:34.698226 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-log-ovn\") pod \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " Dec 04 22:34:34.698320 master-0 kubenswrapper[33572]: I1204 22:34:34.698295 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-scripts\") pod \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " Dec 04 22:34:34.698320 master-0 kubenswrapper[33572]: I1204 22:34:34.698312 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run\") pod \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " Dec 04 22:34:34.698452 master-0 kubenswrapper[33572]: I1204 22:34:34.698427 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-additional-scripts\") pod \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\" (UID: 
\"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " Dec 04 22:34:34.698580 master-0 kubenswrapper[33572]: I1204 22:34:34.698562 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run-ovn\") pod \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\" (UID: \"f54c116e-f8d0-42e0-8481-5d4ff73af74a\") " Dec 04 22:34:34.698889 master-0 kubenswrapper[33572]: I1204 22:34:34.698826 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run" (OuterVolumeSpecName: "var-run") pod "f54c116e-f8d0-42e0-8481-5d4ff73af74a" (UID: "f54c116e-f8d0-42e0-8481-5d4ff73af74a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:34:34.698955 master-0 kubenswrapper[33572]: I1204 22:34:34.698904 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f54c116e-f8d0-42e0-8481-5d4ff73af74a" (UID: "f54c116e-f8d0-42e0-8481-5d4ff73af74a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:34:34.699067 master-0 kubenswrapper[33572]: I1204 22:34:34.699033 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f54c116e-f8d0-42e0-8481-5d4ff73af74a" (UID: "f54c116e-f8d0-42e0-8481-5d4ff73af74a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:34:34.699455 master-0 kubenswrapper[33572]: I1204 22:34:34.699430 33572 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:34.699525 master-0 kubenswrapper[33572]: I1204 22:34:34.699455 33572 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:34.699576 master-0 kubenswrapper[33572]: I1204 22:34:34.699493 33572 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f54c116e-f8d0-42e0-8481-5d4ff73af74a-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:34.700741 master-0 kubenswrapper[33572]: I1204 22:34:34.700689 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f54c116e-f8d0-42e0-8481-5d4ff73af74a" (UID: "f54c116e-f8d0-42e0-8481-5d4ff73af74a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:34.702085 master-0 kubenswrapper[33572]: I1204 22:34:34.702032 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-scripts" (OuterVolumeSpecName: "scripts") pod "f54c116e-f8d0-42e0-8481-5d4ff73af74a" (UID: "f54c116e-f8d0-42e0-8481-5d4ff73af74a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:34.705713 master-0 kubenswrapper[33572]: I1204 22:34:34.705642 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54c116e-f8d0-42e0-8481-5d4ff73af74a-kube-api-access-rpn7q" (OuterVolumeSpecName: "kube-api-access-rpn7q") pod "f54c116e-f8d0-42e0-8481-5d4ff73af74a" (UID: "f54c116e-f8d0-42e0-8481-5d4ff73af74a"). InnerVolumeSpecName "kube-api-access-rpn7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:34.802013 master-0 kubenswrapper[33572]: I1204 22:34:34.801931 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpn7q\" (UniqueName: \"kubernetes.io/projected/f54c116e-f8d0-42e0-8481-5d4ff73af74a-kube-api-access-rpn7q\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:34.802013 master-0 kubenswrapper[33572]: I1204 22:34:34.802010 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:34.802276 master-0 kubenswrapper[33572]: I1204 22:34:34.802029 33572 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f54c116e-f8d0-42e0-8481-5d4ff73af74a-additional-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:34.990449 master-0 kubenswrapper[33572]: I1204 22:34:34.990386 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rk7z7" Dec 04 22:34:35.172583 master-0 kubenswrapper[33572]: I1204 22:34:35.172482 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7-config-pmf49" event={"ID":"f54c116e-f8d0-42e0-8481-5d4ff73af74a","Type":"ContainerDied","Data":"02c0f455693b6f16b3b4fa2ea231d9807e2b1ef202cf93bc1a81fc0f15b131fd"} Dec 04 22:34:35.172583 master-0 kubenswrapper[33572]: I1204 22:34:35.172556 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02c0f455693b6f16b3b4fa2ea231d9807e2b1ef202cf93bc1a81fc0f15b131fd" Dec 04 22:34:35.172583 master-0 kubenswrapper[33572]: I1204 22:34:35.172557 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rk7z7-config-pmf49" Dec 04 22:34:35.174649 master-0 kubenswrapper[33572]: I1204 22:34:35.174590 33572 generic.go:334] "Generic (PLEG): container finished" podID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" containerID="6fbf6f3ece7242bb6a6b324c838350c3db0ead36498a57a5fccc127dc570d169" exitCode=0 Dec 04 22:34:35.174649 master-0 kubenswrapper[33572]: I1204 22:34:35.174647 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" event={"ID":"9d6a1d9e-6849-435b-aee9-caf23bce31bb","Type":"ContainerDied","Data":"6fbf6f3ece7242bb6a6b324c838350c3db0ead36498a57a5fccc127dc570d169"} Dec 04 22:34:35.174858 master-0 kubenswrapper[33572]: I1204 22:34:35.174674 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" event={"ID":"9d6a1d9e-6849-435b-aee9-caf23bce31bb","Type":"ContainerStarted","Data":"ff6d4e507f9376dfc3c933e8ab1fec8384f97096a9a39993d72629b0acc30c05"} Dec 04 22:34:35.755354 master-0 kubenswrapper[33572]: I1204 22:34:35.755268 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rk7z7-config-pmf49"] Dec 04 22:34:35.769000 master-0 kubenswrapper[33572]: I1204 22:34:35.768934 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rk7z7-config-pmf49"] Dec 04 22:34:35.844987 master-0 kubenswrapper[33572]: I1204 22:34:35.844932 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rk7z7-config-glczx"] Dec 04 22:34:35.892405 master-0 kubenswrapper[33572]: E1204 22:34:35.845444 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54c116e-f8d0-42e0-8481-5d4ff73af74a" containerName="ovn-config" Dec 04 22:34:35.892405 master-0 kubenswrapper[33572]: I1204 22:34:35.845463 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54c116e-f8d0-42e0-8481-5d4ff73af74a" containerName="ovn-config" Dec 04 22:34:35.892405 master-0 kubenswrapper[33572]: I1204 22:34:35.845715 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54c116e-f8d0-42e0-8481-5d4ff73af74a" containerName="ovn-config" Dec 04 22:34:35.892405 master-0 kubenswrapper[33572]: I1204 22:34:35.846331 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:35.892405 master-0 kubenswrapper[33572]: I1204 22:34:35.847914 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 22:34:35.892405 master-0 kubenswrapper[33572]: I1204 22:34:35.871630 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rk7z7-config-glczx"] Dec 04 22:34:35.923997 master-0 kubenswrapper[33572]: I1204 22:34:35.923931 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg2m4\" (UniqueName: \"kubernetes.io/projected/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-kube-api-access-jg2m4\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:35.924256 master-0 kubenswrapper[33572]: I1204 22:34:35.924010 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:35.924256 master-0 kubenswrapper[33572]: I1204 22:34:35.924115 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run-ovn\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:35.924256 master-0 kubenswrapper[33572]: I1204 22:34:35.924150 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-scripts\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:35.924256 master-0 kubenswrapper[33572]: I1204 22:34:35.924174 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-log-ovn\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:35.924256 master-0 kubenswrapper[33572]: I1204 22:34:35.924204 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-additional-scripts\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.025968 master-0 kubenswrapper[33572]: I1204 22:34:36.025898 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.026262 master-0 kubenswrapper[33572]: I1204 22:34:36.026054 33572 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.026262 master-0 kubenswrapper[33572]: I1204 22:34:36.026150 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run-ovn\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.026358 master-0 kubenswrapper[33572]: I1204 22:34:36.026316 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-scripts\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.026431 master-0 kubenswrapper[33572]: I1204 22:34:36.026342 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run-ovn\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.026536 master-0 kubenswrapper[33572]: I1204 22:34:36.026492 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-log-ovn\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.026609 master-0 kubenswrapper[33572]: I1204 22:34:36.026591 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-additional-scripts\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.026770 master-0 kubenswrapper[33572]: I1204 22:34:36.026708 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg2m4\" (UniqueName: \"kubernetes.io/projected/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-kube-api-access-jg2m4\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.026891 master-0 kubenswrapper[33572]: I1204 22:34:36.026707 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-log-ovn\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.028119 master-0 kubenswrapper[33572]: I1204 22:34:36.028082 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-additional-scripts\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 
22:34:36.033254 master-0 kubenswrapper[33572]: I1204 22:34:36.033187 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-scripts\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.049975 master-0 kubenswrapper[33572]: I1204 22:34:36.049914 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg2m4\" (UniqueName: \"kubernetes.io/projected/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-kube-api-access-jg2m4\") pod \"ovn-controller-rk7z7-config-glczx\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.190182 master-0 kubenswrapper[33572]: I1204 22:34:36.190088 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" event={"ID":"9d6a1d9e-6849-435b-aee9-caf23bce31bb","Type":"ContainerStarted","Data":"9170bbe0cf510948e1a4873299e247decf36ca3049c92c66997524dafe4d60a0"} Dec 04 22:34:36.190899 master-0 kubenswrapper[33572]: I1204 22:34:36.190848 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:36.205597 master-0 kubenswrapper[33572]: I1204 22:34:36.205549 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:36.226371 master-0 kubenswrapper[33572]: I1204 22:34:36.226240 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" podStartSLOduration=3.226217655 podStartE2EDuration="3.226217655s" podCreationTimestamp="2025-12-04 22:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:36.220323261 +0000 UTC m=+939.947848910" watchObservedRunningTime="2025-12-04 22:34:36.226217655 +0000 UTC m=+939.953743304" Dec 04 22:34:36.539249 master-0 kubenswrapper[33572]: I1204 22:34:36.539159 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54c116e-f8d0-42e0-8481-5d4ff73af74a" path="/var/lib/kubelet/pods/f54c116e-f8d0-42e0-8481-5d4ff73af74a/volumes" Dec 04 22:34:36.761717 master-0 kubenswrapper[33572]: I1204 22:34:36.761564 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rk7z7-config-glczx"] Dec 04 22:34:37.275991 master-0 kubenswrapper[33572]: I1204 22:34:37.275887 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7-config-glczx" event={"ID":"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3","Type":"ContainerStarted","Data":"5adde3abc3e3f18bf0d31605d95049d1715c87900158c6a85b651c0730015dca"} Dec 04 22:34:37.275991 master-0 kubenswrapper[33572]: I1204 22:34:37.275992 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7-config-glczx" event={"ID":"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3","Type":"ContainerStarted","Data":"0e0cd54f048c65f13dd95b1eafa6d9517b6ead9a95cf088fe82a77fa3e22fcb6"} Dec 04 22:34:37.312561 master-0 kubenswrapper[33572]: I1204 22:34:37.312389 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rk7z7-config-glczx" podStartSLOduration=2.312361784 podStartE2EDuration="2.312361784s" podCreationTimestamp="2025-12-04 22:34:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:37.305008331 +0000 UTC m=+941.032533990" watchObservedRunningTime="2025-12-04 22:34:37.312361784 +0000 UTC m=+941.039887433" Dec 04 22:34:38.154843 master-0 kubenswrapper[33572]: I1204 22:34:38.154764 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 22:34:38.292526 master-0 kubenswrapper[33572]: I1204 22:34:38.292453 33572 generic.go:334] "Generic (PLEG): container finished" podID="7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" containerID="5adde3abc3e3f18bf0d31605d95049d1715c87900158c6a85b651c0730015dca" exitCode=0 Dec 04 22:34:38.292731 master-0 kubenswrapper[33572]: I1204 22:34:38.292534 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7-config-glczx" event={"ID":"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3","Type":"ContainerDied","Data":"5adde3abc3e3f18bf0d31605d95049d1715c87900158c6a85b651c0730015dca"} Dec 04 22:34:39.741963 master-0 kubenswrapper[33572]: I1204 22:34:39.741868 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:39.756757 master-0 kubenswrapper[33572]: I1204 22:34:39.756678 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 22:34:39.938255 master-0 kubenswrapper[33572]: I1204 22:34:39.937898 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-scripts\") pod \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " Dec 04 22:34:39.938255 master-0 kubenswrapper[33572]: I1204 22:34:39.937974 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run-ovn\") pod \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " Dec 04 22:34:39.938628 master-0 kubenswrapper[33572]: I1204 22:34:39.938324 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg2m4\" (UniqueName: \"kubernetes.io/projected/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-kube-api-access-jg2m4\") pod \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " Dec 04 22:34:39.938628 master-0 kubenswrapper[33572]: I1204 22:34:39.938384 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-additional-scripts\") pod \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " Dec 04 22:34:39.938628 master-0 kubenswrapper[33572]: I1204 22:34:39.938441 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run\") pod \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " Dec 04 22:34:39.938628 master-0 kubenswrapper[33572]: I1204 22:34:39.938572 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-log-ovn\") pod 
\"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\" (UID: \"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3\") " Dec 04 22:34:39.939177 master-0 kubenswrapper[33572]: I1204 22:34:39.939085 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" (UID: "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:34:39.939177 master-0 kubenswrapper[33572]: I1204 22:34:39.939100 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" (UID: "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:34:39.939177 master-0 kubenswrapper[33572]: I1204 22:34:39.939114 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run" (OuterVolumeSpecName: "var-run") pod "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" (UID: "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:34:39.939578 master-0 kubenswrapper[33572]: I1204 22:34:39.939237 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-scripts" (OuterVolumeSpecName: "scripts") pod "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" (UID: "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:39.939843 master-0 kubenswrapper[33572]: I1204 22:34:39.939767 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" (UID: "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:39.945652 master-0 kubenswrapper[33572]: I1204 22:34:39.941011 33572 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:39.945652 master-0 kubenswrapper[33572]: I1204 22:34:39.941059 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:39.945652 master-0 kubenswrapper[33572]: I1204 22:34:39.941080 33572 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:39.945652 master-0 kubenswrapper[33572]: I1204 22:34:39.941096 33572 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-additional-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:39.945652 master-0 kubenswrapper[33572]: I1204 22:34:39.941114 33572 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-var-run\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:39.951907 master-0 kubenswrapper[33572]: I1204 22:34:39.951870 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-kube-api-access-jg2m4" (OuterVolumeSpecName: "kube-api-access-jg2m4") pod "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" (UID: "7e1b569e-ab10-4b0c-9bba-34f005d5a2d3"). InnerVolumeSpecName "kube-api-access-jg2m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:40.047339 master-0 kubenswrapper[33572]: I1204 22:34:40.047295 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg2m4\" (UniqueName: \"kubernetes.io/projected/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3-kube-api-access-jg2m4\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:40.319970 master-0 kubenswrapper[33572]: I1204 22:34:40.319207 33572 generic.go:334] "Generic (PLEG): container finished" podID="13c75f01-7399-4114-96b7-9435a6ba089b" containerID="0d636103e4ae963e2e654a6eae6dddd7b9985787031941a9ef697cd6da609156" exitCode=0 Dec 04 22:34:40.319970 master-0 kubenswrapper[33572]: I1204 22:34:40.319329 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qbxvp" event={"ID":"13c75f01-7399-4114-96b7-9435a6ba089b","Type":"ContainerDied","Data":"0d636103e4ae963e2e654a6eae6dddd7b9985787031941a9ef697cd6da609156"} Dec 04 22:34:40.321856 master-0 kubenswrapper[33572]: I1204 22:34:40.321809 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rk7z7-config-glczx" event={"ID":"7e1b569e-ab10-4b0c-9bba-34f005d5a2d3","Type":"ContainerDied","Data":"0e0cd54f048c65f13dd95b1eafa6d9517b6ead9a95cf088fe82a77fa3e22fcb6"} Dec 04 22:34:40.321856 master-0 kubenswrapper[33572]: I1204 22:34:40.321853 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e0cd54f048c65f13dd95b1eafa6d9517b6ead9a95cf088fe82a77fa3e22fcb6" Dec 04 22:34:40.322087 master-0 kubenswrapper[33572]: I1204 22:34:40.321900 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rk7z7-config-glczx" Dec 04 22:34:40.858362 master-0 kubenswrapper[33572]: I1204 22:34:40.858279 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rk7z7-config-glczx"] Dec 04 22:34:40.871599 master-0 kubenswrapper[33572]: I1204 22:34:40.871529 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rk7z7-config-glczx"] Dec 04 22:34:41.726669 master-0 kubenswrapper[33572]: I1204 22:34:41.726533 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kjf8b"] Dec 04 22:34:41.727341 master-0 kubenswrapper[33572]: E1204 22:34:41.727314 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" containerName="ovn-config" Dec 04 22:34:41.727430 master-0 kubenswrapper[33572]: I1204 22:34:41.727343 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" containerName="ovn-config" Dec 04 22:34:41.727652 master-0 kubenswrapper[33572]: I1204 22:34:41.727628 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" containerName="ovn-config" Dec 04 22:34:41.728705 master-0 kubenswrapper[33572]: I1204 22:34:41.728669 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:41.743569 master-0 kubenswrapper[33572]: I1204 22:34:41.738157 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kjf8b"] Dec 04 22:34:41.824686 master-0 kubenswrapper[33572]: I1204 22:34:41.822464 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-d6jvr"] Dec 04 22:34:41.824686 master-0 kubenswrapper[33572]: I1204 22:34:41.823934 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:41.851670 master-0 kubenswrapper[33572]: I1204 22:34:41.850596 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-88ec-account-create-update-g22k5"] Dec 04 22:34:41.852741 master-0 kubenswrapper[33572]: I1204 22:34:41.852046 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:41.855393 master-0 kubenswrapper[33572]: I1204 22:34:41.854884 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 22:34:41.877722 master-0 kubenswrapper[33572]: I1204 22:34:41.877340 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d6jvr"] Dec 04 22:34:41.887665 master-0 kubenswrapper[33572]: I1204 22:34:41.887585 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z8sb\" (UniqueName: \"kubernetes.io/projected/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-kube-api-access-9z8sb\") pod \"cinder-db-create-kjf8b\" (UID: \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\") " pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:41.887874 master-0 kubenswrapper[33572]: I1204 22:34:41.887753 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-operator-scripts\") pod \"cinder-db-create-kjf8b\" (UID: \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\") " pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:41.888771 master-0 kubenswrapper[33572]: I1204 22:34:41.888727 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-88ec-account-create-update-g22k5"] Dec 04 22:34:41.951808 master-0 kubenswrapper[33572]: I1204 22:34:41.951753 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:41.990093 master-0 kubenswrapper[33572]: I1204 22:34:41.989092 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8sb\" (UniqueName: \"kubernetes.io/projected/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-kube-api-access-9z8sb\") pod \"cinder-db-create-kjf8b\" (UID: \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\") " pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:41.990093 master-0 kubenswrapper[33572]: I1204 22:34:41.989181 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4vjx\" (UniqueName: \"kubernetes.io/projected/9c0fabb4-e989-4e6b-b5f1-27f23997389b-kube-api-access-x4vjx\") pod \"cinder-88ec-account-create-update-g22k5\" (UID: \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\") " pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:41.990093 master-0 kubenswrapper[33572]: I1204 22:34:41.989241 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-operator-scripts\") pod \"cinder-db-create-kjf8b\" (UID: \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\") " pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:41.990093 master-0 kubenswrapper[33572]: I1204 22:34:41.989303 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2a41736-e271-4aef-a67f-77937bc3e446-operator-scripts\") pod \"neutron-db-create-d6jvr\" (UID: \"d2a41736-e271-4aef-a67f-77937bc3e446\") " pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:41.990093 master-0 kubenswrapper[33572]: I1204 22:34:41.989381 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qjmp\" (UniqueName: 
\"kubernetes.io/projected/d2a41736-e271-4aef-a67f-77937bc3e446-kube-api-access-7qjmp\") pod \"neutron-db-create-d6jvr\" (UID: \"d2a41736-e271-4aef-a67f-77937bc3e446\") " pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:41.990093 master-0 kubenswrapper[33572]: I1204 22:34:41.989414 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c0fabb4-e989-4e6b-b5f1-27f23997389b-operator-scripts\") pod \"cinder-88ec-account-create-update-g22k5\" (UID: \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\") " pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:41.996358 master-0 kubenswrapper[33572]: I1204 22:34:41.995223 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-operator-scripts\") pod \"cinder-db-create-kjf8b\" (UID: \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\") " pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:42.016597 master-0 kubenswrapper[33572]: I1204 22:34:42.016553 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8sb\" (UniqueName: \"kubernetes.io/projected/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-kube-api-access-9z8sb\") pod \"cinder-db-create-kjf8b\" (UID: \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\") " pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:42.025861 master-0 kubenswrapper[33572]: I1204 22:34:42.025294 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2ef1-account-create-update-nzfgp"] Dec 04 22:34:42.025861 master-0 kubenswrapper[33572]: E1204 22:34:42.025833 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c75f01-7399-4114-96b7-9435a6ba089b" containerName="glance-db-sync" Dec 04 22:34:42.025861 master-0 kubenswrapper[33572]: I1204 22:34:42.025847 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c75f01-7399-4114-96b7-9435a6ba089b" containerName="glance-db-sync" Dec 04 22:34:42.026145 master-0 kubenswrapper[33572]: I1204 22:34:42.026088 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c75f01-7399-4114-96b7-9435a6ba089b" containerName="glance-db-sync" Dec 04 22:34:42.026792 master-0 kubenswrapper[33572]: I1204 22:34:42.026771 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:42.028691 master-0 kubenswrapper[33572]: I1204 22:34:42.028658 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 04 22:34:42.036324 master-0 kubenswrapper[33572]: I1204 22:34:42.036262 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2ef1-account-create-update-nzfgp"] Dec 04 22:34:42.045858 master-0 kubenswrapper[33572]: I1204 22:34:42.045793 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:42.092114 master-0 kubenswrapper[33572]: I1204 22:34:42.092068 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-config-data\") pod \"13c75f01-7399-4114-96b7-9435a6ba089b\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " Dec 04 22:34:42.092260 master-0 kubenswrapper[33572]: I1204 22:34:42.092129 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grzj2\" (UniqueName: \"kubernetes.io/projected/13c75f01-7399-4114-96b7-9435a6ba089b-kube-api-access-grzj2\") pod \"13c75f01-7399-4114-96b7-9435a6ba089b\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " Dec 04 22:34:42.092260 master-0 kubenswrapper[33572]: I1204 22:34:42.092204 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-db-sync-config-data\") pod \"13c75f01-7399-4114-96b7-9435a6ba089b\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " Dec 04 22:34:42.092260 master-0 kubenswrapper[33572]: I1204 22:34:42.092227 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-combined-ca-bundle\") pod \"13c75f01-7399-4114-96b7-9435a6ba089b\" (UID: \"13c75f01-7399-4114-96b7-9435a6ba089b\") " Dec 04 22:34:42.092634 master-0 kubenswrapper[33572]: I1204 22:34:42.092601 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4vjx\" (UniqueName: \"kubernetes.io/projected/9c0fabb4-e989-4e6b-b5f1-27f23997389b-kube-api-access-x4vjx\") pod \"cinder-88ec-account-create-update-g22k5\" (UID: \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\") " pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:42.092796 master-0 kubenswrapper[33572]: I1204 22:34:42.092769 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2a41736-e271-4aef-a67f-77937bc3e446-operator-scripts\") pod \"neutron-db-create-d6jvr\" (UID: \"d2a41736-e271-4aef-a67f-77937bc3e446\") " pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:42.092869 master-0 kubenswrapper[33572]: I1204 22:34:42.092848 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qjmp\" (UniqueName: \"kubernetes.io/projected/d2a41736-e271-4aef-a67f-77937bc3e446-kube-api-access-7qjmp\") pod \"neutron-db-create-d6jvr\" (UID: \"d2a41736-e271-4aef-a67f-77937bc3e446\") " pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:42.092906 master-0 kubenswrapper[33572]: I1204 22:34:42.092894 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c0fabb4-e989-4e6b-b5f1-27f23997389b-operator-scripts\") pod \"cinder-88ec-account-create-update-g22k5\" (UID: \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\") " pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:42.093794 master-0 kubenswrapper[33572]: I1204 22:34:42.093762 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c0fabb4-e989-4e6b-b5f1-27f23997389b-operator-scripts\") pod \"cinder-88ec-account-create-update-g22k5\" (UID: 
\"9c0fabb4-e989-4e6b-b5f1-27f23997389b\") " pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:42.096422 master-0 kubenswrapper[33572]: I1204 22:34:42.096277 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c75f01-7399-4114-96b7-9435a6ba089b-kube-api-access-grzj2" (OuterVolumeSpecName: "kube-api-access-grzj2") pod "13c75f01-7399-4114-96b7-9435a6ba089b" (UID: "13c75f01-7399-4114-96b7-9435a6ba089b"). InnerVolumeSpecName "kube-api-access-grzj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:42.096955 master-0 kubenswrapper[33572]: I1204 22:34:42.096929 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2a41736-e271-4aef-a67f-77937bc3e446-operator-scripts\") pod \"neutron-db-create-d6jvr\" (UID: \"d2a41736-e271-4aef-a67f-77937bc3e446\") " pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:42.097221 master-0 kubenswrapper[33572]: I1204 22:34:42.097200 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-rln28"] Dec 04 22:34:42.097379 master-0 kubenswrapper[33572]: I1204 22:34:42.097329 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "13c75f01-7399-4114-96b7-9435a6ba089b" (UID: "13c75f01-7399-4114-96b7-9435a6ba089b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:42.098785 master-0 kubenswrapper[33572]: I1204 22:34:42.098715 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.102102 master-0 kubenswrapper[33572]: I1204 22:34:42.102054 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 22:34:42.102192 master-0 kubenswrapper[33572]: I1204 22:34:42.102134 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 22:34:42.102345 master-0 kubenswrapper[33572]: I1204 22:34:42.102321 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 22:34:42.114244 master-0 kubenswrapper[33572]: I1204 22:34:42.114158 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rln28"] Dec 04 22:34:42.119966 master-0 kubenswrapper[33572]: I1204 22:34:42.119829 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qjmp\" (UniqueName: \"kubernetes.io/projected/d2a41736-e271-4aef-a67f-77937bc3e446-kube-api-access-7qjmp\") pod \"neutron-db-create-d6jvr\" (UID: \"d2a41736-e271-4aef-a67f-77937bc3e446\") " pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:42.132557 master-0 kubenswrapper[33572]: I1204 22:34:42.132483 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4vjx\" (UniqueName: \"kubernetes.io/projected/9c0fabb4-e989-4e6b-b5f1-27f23997389b-kube-api-access-x4vjx\") pod \"cinder-88ec-account-create-update-g22k5\" (UID: \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\") " pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:42.136575 master-0 kubenswrapper[33572]: I1204 22:34:42.136526 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "13c75f01-7399-4114-96b7-9435a6ba089b" (UID: "13c75f01-7399-4114-96b7-9435a6ba089b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:42.157284 master-0 kubenswrapper[33572]: I1204 22:34:42.157215 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-config-data" (OuterVolumeSpecName: "config-data") pod "13c75f01-7399-4114-96b7-9435a6ba089b" (UID: "13c75f01-7399-4114-96b7-9435a6ba089b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:42.195127 master-0 kubenswrapper[33572]: I1204 22:34:42.195078 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxr5t\" (UniqueName: \"kubernetes.io/projected/2ff98c13-5ff9-4ffd-a779-e3121642011e-kube-api-access-bxr5t\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.195331 master-0 kubenswrapper[33572]: I1204 22:34:42.195147 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wllp\" (UniqueName: \"kubernetes.io/projected/c4f6a92e-1188-4a7d-94d8-0989924196a7-kube-api-access-6wllp\") pod \"neutron-2ef1-account-create-update-nzfgp\" (UID: \"c4f6a92e-1188-4a7d-94d8-0989924196a7\") " pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:42.195606 master-0 kubenswrapper[33572]: I1204 22:34:42.195539 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4f6a92e-1188-4a7d-94d8-0989924196a7-operator-scripts\") pod \"neutron-2ef1-account-create-update-nzfgp\" (UID: \"c4f6a92e-1188-4a7d-94d8-0989924196a7\") " pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:42.195777 master-0 kubenswrapper[33572]: I1204 22:34:42.195755 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-config-data\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.195923 master-0 kubenswrapper[33572]: I1204 22:34:42.195892 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-combined-ca-bundle\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.196087 master-0 kubenswrapper[33572]: I1204 22:34:42.196066 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:42.196134 master-0 kubenswrapper[33572]: I1204 22:34:42.196092 33572 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:42.196134 master-0 kubenswrapper[33572]: I1204 22:34:42.196110 33572 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/13c75f01-7399-4114-96b7-9435a6ba089b-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:42.196134 master-0 kubenswrapper[33572]: I1204 22:34:42.196123 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grzj2\" (UniqueName: \"kubernetes.io/projected/13c75f01-7399-4114-96b7-9435a6ba089b-kube-api-access-grzj2\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:42.252604 master-0 kubenswrapper[33572]: I1204 22:34:42.252332 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:42.265366 master-0 kubenswrapper[33572]: I1204 22:34:42.265338 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:42.297445 master-0 kubenswrapper[33572]: I1204 22:34:42.297346 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4f6a92e-1188-4a7d-94d8-0989924196a7-operator-scripts\") pod \"neutron-2ef1-account-create-update-nzfgp\" (UID: \"c4f6a92e-1188-4a7d-94d8-0989924196a7\") " pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:42.297445 master-0 kubenswrapper[33572]: I1204 22:34:42.297452 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-config-data\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.297728 master-0 kubenswrapper[33572]: I1204 22:34:42.297518 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-combined-ca-bundle\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.297728 master-0 kubenswrapper[33572]: I1204 22:34:42.297576 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxr5t\" (UniqueName: \"kubernetes.io/projected/2ff98c13-5ff9-4ffd-a779-e3121642011e-kube-api-access-bxr5t\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.297728 master-0 kubenswrapper[33572]: I1204 22:34:42.297598 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wllp\" (UniqueName: \"kubernetes.io/projected/c4f6a92e-1188-4a7d-94d8-0989924196a7-kube-api-access-6wllp\") pod \"neutron-2ef1-account-create-update-nzfgp\" (UID: \"c4f6a92e-1188-4a7d-94d8-0989924196a7\") " pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:42.299279 master-0 kubenswrapper[33572]: I1204 22:34:42.299226 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4f6a92e-1188-4a7d-94d8-0989924196a7-operator-scripts\") pod \"neutron-2ef1-account-create-update-nzfgp\" (UID: \"c4f6a92e-1188-4a7d-94d8-0989924196a7\") " pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:42.305906 master-0 kubenswrapper[33572]: I1204 22:34:42.304616 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-combined-ca-bundle\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.306993 master-0 kubenswrapper[33572]: I1204 22:34:42.306961 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-config-data\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.316547 master-0 kubenswrapper[33572]: I1204 22:34:42.316490 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wllp\" (UniqueName: \"kubernetes.io/projected/c4f6a92e-1188-4a7d-94d8-0989924196a7-kube-api-access-6wllp\") pod \"neutron-2ef1-account-create-update-nzfgp\" (UID: \"c4f6a92e-1188-4a7d-94d8-0989924196a7\") " pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:42.317168 master-0 kubenswrapper[33572]: I1204 22:34:42.317095 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxr5t\" (UniqueName: \"kubernetes.io/projected/2ff98c13-5ff9-4ffd-a779-e3121642011e-kube-api-access-bxr5t\") pod \"keystone-db-sync-rln28\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.364621 master-0 kubenswrapper[33572]: I1204 22:34:42.362728 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qbxvp" event={"ID":"13c75f01-7399-4114-96b7-9435a6ba089b","Type":"ContainerDied","Data":"023868215ee1c48075a12fc29c17dd2c0751522b9c071b6d823ef01813330a2f"} Dec 04 22:34:42.364621 master-0 kubenswrapper[33572]: I1204 22:34:42.362809 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="023868215ee1c48075a12fc29c17dd2c0751522b9c071b6d823ef01813330a2f" Dec 04 22:34:42.364621 master-0 kubenswrapper[33572]: I1204 22:34:42.362949 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-qbxvp" Dec 04 22:34:42.488819 master-0 kubenswrapper[33572]: I1204 22:34:42.483202 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:42.515715 master-0 kubenswrapper[33572]: I1204 22:34:42.512585 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:42.560119 master-0 kubenswrapper[33572]: W1204 22:34:42.557720 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29e3bb4b_ddd7_4df1_8ad6_cf8fbded2f49.slice/crio-0601ddbb431836a04e26e7b6200a53cea6715c46520119938b3e03803eaabb2c WatchSource:0}: Error finding container 0601ddbb431836a04e26e7b6200a53cea6715c46520119938b3e03803eaabb2c: Status 404 returned error can't find the container with id 0601ddbb431836a04e26e7b6200a53cea6715c46520119938b3e03803eaabb2c Dec 04 22:34:42.567193 master-0 kubenswrapper[33572]: I1204 22:34:42.567070 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1b569e-ab10-4b0c-9bba-34f005d5a2d3" path="/var/lib/kubelet/pods/7e1b569e-ab10-4b0c-9bba-34f005d5a2d3/volumes" Dec 04 22:34:42.568325 master-0 kubenswrapper[33572]: I1204 22:34:42.568291 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kjf8b"] Dec 04 22:34:42.765699 master-0 kubenswrapper[33572]: I1204 22:34:42.765644 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c6f8c9899-t2px9"] Dec 04 22:34:42.765988 master-0 kubenswrapper[33572]: I1204 22:34:42.765901 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" podUID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" containerName="dnsmasq-dns" containerID="cri-o://9170bbe0cf510948e1a4873299e247decf36ca3049c92c66997524dafe4d60a0" gracePeriod=10 Dec 04 22:34:42.787212 master-0 kubenswrapper[33572]: I1204 22:34:42.787130 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:42.815378 master-0 kubenswrapper[33572]: I1204 22:34:42.815308 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d6jvr"] Dec 04 22:34:42.832392 master-0 kubenswrapper[33572]: I1204 22:34:42.831745 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp"] Dec 04 22:34:42.892619 master-0 kubenswrapper[33572]: I1204 22:34:42.892149 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:42.991764 master-0 kubenswrapper[33572]: I1204 22:34:42.991238 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp"] Dec 04 22:34:43.014193 master-0 kubenswrapper[33572]: I1204 22:34:43.014127 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-config\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.014624 master-0 kubenswrapper[33572]: I1204 22:34:43.014594 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.014774 master-0 kubenswrapper[33572]: I1204 22:34:43.014757 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdtz\" (UniqueName: \"kubernetes.io/projected/7fd26913-95c6-4bbe-a234-67d8b5544c80-kube-api-access-xkdtz\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.014877 master-0 kubenswrapper[33572]: I1204 22:34:43.014855 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.016079 master-0 kubenswrapper[33572]: I1204 22:34:43.016017 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-svc\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.016180 master-0 kubenswrapper[33572]: I1204 22:34:43.016167 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-swift-storage-0\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.117716 master-0 kubenswrapper[33572]: I1204 22:34:43.117666 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-88ec-account-create-update-g22k5"] Dec 04 22:34:43.120677 master-0 kubenswrapper[33572]: I1204 22:34:43.120634 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.120746 master-0 kubenswrapper[33572]: I1204 22:34:43.120700 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdtz\" 
(UniqueName: \"kubernetes.io/projected/7fd26913-95c6-4bbe-a234-67d8b5544c80-kube-api-access-xkdtz\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.120746 master-0 kubenswrapper[33572]: I1204 22:34:43.120734 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.120878 master-0 kubenswrapper[33572]: I1204 22:34:43.120859 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-svc\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.120924 master-0 kubenswrapper[33572]: I1204 22:34:43.120886 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-swift-storage-0\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.121026 master-0 kubenswrapper[33572]: I1204 22:34:43.121001 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-config\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.122134 master-0 kubenswrapper[33572]: I1204 22:34:43.121987 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-config\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.128663 master-0 kubenswrapper[33572]: I1204 22:34:43.128615 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.152546 master-0 kubenswrapper[33572]: I1204 22:34:43.148351 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.152546 master-0 kubenswrapper[33572]: I1204 22:34:43.149498 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-svc\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.152546 master-0 kubenswrapper[33572]: I1204 22:34:43.149540 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-swift-storage-0\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.170397 master-0 kubenswrapper[33572]: I1204 22:34:43.162600 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdtz\" (UniqueName: \"kubernetes.io/projected/7fd26913-95c6-4bbe-a234-67d8b5544c80-kube-api-access-xkdtz\") pod \"dnsmasq-dns-5f5b6cc9d9-fl8vp\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.186529 master-0 kubenswrapper[33572]: I1204 22:34:43.171796 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:43.209534 master-0 kubenswrapper[33572]: I1204 22:34:43.178667 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-rln28"] Dec 04 22:34:43.209534 master-0 kubenswrapper[33572]: I1204 22:34:43.189311 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2ef1-account-create-update-nzfgp"] Dec 04 22:34:43.388239 master-0 kubenswrapper[33572]: I1204 22:34:43.387995 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kjf8b" event={"ID":"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49","Type":"ContainerStarted","Data":"03618bfdca10cd63f0ba0d81558c82a816166ed5c737eaa60242022f8268fd49"} Dec 04 22:34:43.388239 master-0 kubenswrapper[33572]: I1204 22:34:43.388043 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kjf8b" event={"ID":"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49","Type":"ContainerStarted","Data":"0601ddbb431836a04e26e7b6200a53cea6715c46520119938b3e03803eaabb2c"} Dec 04 22:34:43.399159 master-0 kubenswrapper[33572]: I1204 22:34:43.399111 33572 generic.go:334] "Generic (PLEG): container finished" podID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" containerID="9170bbe0cf510948e1a4873299e247decf36ca3049c92c66997524dafe4d60a0" exitCode=0 Dec 04 22:34:43.399299 master-0 kubenswrapper[33572]: I1204 22:34:43.399221 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" event={"ID":"9d6a1d9e-6849-435b-aee9-caf23bce31bb","Type":"ContainerDied","Data":"9170bbe0cf510948e1a4873299e247decf36ca3049c92c66997524dafe4d60a0"} Dec 04 22:34:43.416731 master-0 kubenswrapper[33572]: I1204 22:34:43.411771 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-kjf8b" podStartSLOduration=2.411755063 podStartE2EDuration="2.411755063s" podCreationTimestamp="2025-12-04 22:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:43.404896523 +0000 UTC m=+947.132422172" watchObservedRunningTime="2025-12-04 22:34:43.411755063 +0000 UTC m=+947.139280702" Dec 04 22:34:43.436975 master-0 kubenswrapper[33572]: I1204 22:34:43.436924 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d6jvr" event={"ID":"d2a41736-e271-4aef-a67f-77937bc3e446","Type":"ContainerStarted","Data":"cc783703bd4e8c644d00a85a57d13d3212a5ef6892e4370ab80917aee2ff6673"} Dec 04 22:34:43.439644 master-0 kubenswrapper[33572]: I1204 22:34:43.439416 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2ef1-account-create-update-nzfgp" 
event={"ID":"c4f6a92e-1188-4a7d-94d8-0989924196a7","Type":"ContainerStarted","Data":"61e686563e1fb9cea9251ee0daca2b28d540278b2e686472351b9923018643d8"} Dec 04 22:34:43.440866 master-0 kubenswrapper[33572]: I1204 22:34:43.440747 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-88ec-account-create-update-g22k5" event={"ID":"9c0fabb4-e989-4e6b-b5f1-27f23997389b","Type":"ContainerStarted","Data":"ae0f57b4c6a3f110777e7793e5e5f43a37de57f3c22065c5900e0aebe4466e42"} Dec 04 22:34:43.443387 master-0 kubenswrapper[33572]: I1204 22:34:43.443354 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rln28" event={"ID":"2ff98c13-5ff9-4ffd-a779-e3121642011e","Type":"ContainerStarted","Data":"ee9d78222c86f8644d8bcd07da12fad5dff5416feb415bf67cfcf0a7819f8244"} Dec 04 22:34:43.506464 master-0 kubenswrapper[33572]: I1204 22:34:43.506141 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:43.629577 master-0 kubenswrapper[33572]: I1204 22:34:43.629518 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-config\") pod \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " Dec 04 22:34:43.629821 master-0 kubenswrapper[33572]: I1204 22:34:43.629770 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-svc\") pod \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " Dec 04 22:34:43.629865 master-0 kubenswrapper[33572]: I1204 22:34:43.629839 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-nb\") pod \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " Dec 04 22:34:43.629923 master-0 kubenswrapper[33572]: I1204 22:34:43.629893 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wn6z\" (UniqueName: \"kubernetes.io/projected/9d6a1d9e-6849-435b-aee9-caf23bce31bb-kube-api-access-7wn6z\") pod \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " Dec 04 22:34:43.629996 master-0 kubenswrapper[33572]: I1204 22:34:43.629978 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-sb\") pod \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " Dec 04 22:34:43.630073 master-0 kubenswrapper[33572]: I1204 22:34:43.630057 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-swift-storage-0\") pod \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\" (UID: \"9d6a1d9e-6849-435b-aee9-caf23bce31bb\") " Dec 04 22:34:43.660534 master-0 kubenswrapper[33572]: I1204 22:34:43.637532 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6a1d9e-6849-435b-aee9-caf23bce31bb-kube-api-access-7wn6z" (OuterVolumeSpecName: "kube-api-access-7wn6z") pod "9d6a1d9e-6849-435b-aee9-caf23bce31bb" 
(UID: "9d6a1d9e-6849-435b-aee9-caf23bce31bb"). InnerVolumeSpecName "kube-api-access-7wn6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:43.709027 master-0 kubenswrapper[33572]: I1204 22:34:43.708962 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d6a1d9e-6849-435b-aee9-caf23bce31bb" (UID: "9d6a1d9e-6849-435b-aee9-caf23bce31bb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:43.716940 master-0 kubenswrapper[33572]: I1204 22:34:43.716410 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d6a1d9e-6849-435b-aee9-caf23bce31bb" (UID: "9d6a1d9e-6849-435b-aee9-caf23bce31bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:43.731874 master-0 kubenswrapper[33572]: I1204 22:34:43.731827 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d6a1d9e-6849-435b-aee9-caf23bce31bb" (UID: "9d6a1d9e-6849-435b-aee9-caf23bce31bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:43.732537 master-0 kubenswrapper[33572]: I1204 22:34:43.732262 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:43.732537 master-0 kubenswrapper[33572]: I1204 22:34:43.732298 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:43.732537 master-0 kubenswrapper[33572]: I1204 22:34:43.732311 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wn6z\" (UniqueName: \"kubernetes.io/projected/9d6a1d9e-6849-435b-aee9-caf23bce31bb-kube-api-access-7wn6z\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:43.732537 master-0 kubenswrapper[33572]: I1204 22:34:43.732320 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:43.755163 master-0 kubenswrapper[33572]: I1204 22:34:43.754889 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d6a1d9e-6849-435b-aee9-caf23bce31bb" (UID: "9d6a1d9e-6849-435b-aee9-caf23bce31bb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:43.771537 master-0 kubenswrapper[33572]: I1204 22:34:43.771468 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-config" (OuterVolumeSpecName: "config") pod "9d6a1d9e-6849-435b-aee9-caf23bce31bb" (UID: "9d6a1d9e-6849-435b-aee9-caf23bce31bb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:43.836602 master-0 kubenswrapper[33572]: I1204 22:34:43.835044 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:43.836602 master-0 kubenswrapper[33572]: I1204 22:34:43.835098 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d6a1d9e-6849-435b-aee9-caf23bce31bb-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:43.843903 master-0 kubenswrapper[33572]: I1204 22:34:43.838293 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp"] Dec 04 22:34:43.870407 master-0 kubenswrapper[33572]: W1204 22:34:43.870338 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fd26913_95c6_4bbe_a234_67d8b5544c80.slice/crio-f5225067757a90b91bb3e324518f389bf52deef06deb67e65ca168e0282042e7 WatchSource:0}: Error finding container f5225067757a90b91bb3e324518f389bf52deef06deb67e65ca168e0282042e7: Status 404 returned error can't find the container with id f5225067757a90b91bb3e324518f389bf52deef06deb67e65ca168e0282042e7 Dec 04 22:34:44.459218 master-0 kubenswrapper[33572]: I1204 22:34:44.459029 33572 generic.go:334] "Generic (PLEG): container finished" podID="d2a41736-e271-4aef-a67f-77937bc3e446" containerID="f6abd516c5fea9b5bd341b0a7d1e2139fc949d8d6089d7ff3cdb7a518d0c2899" exitCode=0 Dec 04 22:34:44.459218 master-0 kubenswrapper[33572]: I1204 22:34:44.459122 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d6jvr" event={"ID":"d2a41736-e271-4aef-a67f-77937bc3e446","Type":"ContainerDied","Data":"f6abd516c5fea9b5bd341b0a7d1e2139fc949d8d6089d7ff3cdb7a518d0c2899"} Dec 04 22:34:44.472850 master-0 kubenswrapper[33572]: I1204 22:34:44.472796 33572 generic.go:334] "Generic (PLEG): container finished" podID="c4f6a92e-1188-4a7d-94d8-0989924196a7" containerID="b3650a3e3b1126254c5989162d4092a91d06a2dd1e654579dae08edb3440de9c" exitCode=0 Dec 04 22:34:44.472958 master-0 kubenswrapper[33572]: I1204 22:34:44.472878 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2ef1-account-create-update-nzfgp" event={"ID":"c4f6a92e-1188-4a7d-94d8-0989924196a7","Type":"ContainerDied","Data":"b3650a3e3b1126254c5989162d4092a91d06a2dd1e654579dae08edb3440de9c"} Dec 04 22:34:44.474405 master-0 kubenswrapper[33572]: I1204 22:34:44.474381 33572 generic.go:334] "Generic (PLEG): container finished" podID="9c0fabb4-e989-4e6b-b5f1-27f23997389b" containerID="262ca22dfdb092ef67da696554009a9d233510bdc1ab8f72d9b6f18e697b954e" exitCode=0 Dec 04 22:34:44.474496 master-0 kubenswrapper[33572]: I1204 22:34:44.474433 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-88ec-account-create-update-g22k5" event={"ID":"9c0fabb4-e989-4e6b-b5f1-27f23997389b","Type":"ContainerDied","Data":"262ca22dfdb092ef67da696554009a9d233510bdc1ab8f72d9b6f18e697b954e"} Dec 04 22:34:44.475681 master-0 kubenswrapper[33572]: I1204 22:34:44.475657 33572 generic.go:334] "Generic (PLEG): container finished" podID="29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49" containerID="03618bfdca10cd63f0ba0d81558c82a816166ed5c737eaa60242022f8268fd49" exitCode=0 Dec 04 22:34:44.475756 master-0 kubenswrapper[33572]: I1204 22:34:44.475687 33572 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-db-create-kjf8b" event={"ID":"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49","Type":"ContainerDied","Data":"03618bfdca10cd63f0ba0d81558c82a816166ed5c737eaa60242022f8268fd49"} Dec 04 22:34:44.477323 master-0 kubenswrapper[33572]: I1204 22:34:44.477256 33572 generic.go:334] "Generic (PLEG): container finished" podID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerID="1866e45bcac4dc82b281fc6aca968c7db2fc493bc176cdc7e9b222b0dc038ad3" exitCode=0 Dec 04 22:34:44.477386 master-0 kubenswrapper[33572]: I1204 22:34:44.477360 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" event={"ID":"7fd26913-95c6-4bbe-a234-67d8b5544c80","Type":"ContainerDied","Data":"1866e45bcac4dc82b281fc6aca968c7db2fc493bc176cdc7e9b222b0dc038ad3"} Dec 04 22:34:44.477440 master-0 kubenswrapper[33572]: I1204 22:34:44.477397 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" event={"ID":"7fd26913-95c6-4bbe-a234-67d8b5544c80","Type":"ContainerStarted","Data":"f5225067757a90b91bb3e324518f389bf52deef06deb67e65ca168e0282042e7"} Dec 04 22:34:44.480741 master-0 kubenswrapper[33572]: I1204 22:34:44.480698 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" event={"ID":"9d6a1d9e-6849-435b-aee9-caf23bce31bb","Type":"ContainerDied","Data":"ff6d4e507f9376dfc3c933e8ab1fec8384f97096a9a39993d72629b0acc30c05"} Dec 04 22:34:44.480803 master-0 kubenswrapper[33572]: I1204 22:34:44.480747 33572 scope.go:117] "RemoveContainer" containerID="9170bbe0cf510948e1a4873299e247decf36ca3049c92c66997524dafe4d60a0" Dec 04 22:34:44.480874 master-0 kubenswrapper[33572]: I1204 22:34:44.480844 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c6f8c9899-t2px9" Dec 04 22:34:44.598034 master-0 kubenswrapper[33572]: I1204 22:34:44.597808 33572 scope.go:117] "RemoveContainer" containerID="6fbf6f3ece7242bb6a6b324c838350c3db0ead36498a57a5fccc127dc570d169" Dec 04 22:34:44.639237 master-0 kubenswrapper[33572]: I1204 22:34:44.639179 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c6f8c9899-t2px9"] Dec 04 22:34:44.651809 master-0 kubenswrapper[33572]: I1204 22:34:44.651752 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c6f8c9899-t2px9"] Dec 04 22:34:45.511923 master-0 kubenswrapper[33572]: I1204 22:34:45.511836 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" event={"ID":"7fd26913-95c6-4bbe-a234-67d8b5544c80","Type":"ContainerStarted","Data":"4114a176b9e0073b647cd0952384b11b473ec0f42e83a423a441a422d5f0900b"} Dec 04 22:34:45.512867 master-0 kubenswrapper[33572]: I1204 22:34:45.511984 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:45.538409 master-0 kubenswrapper[33572]: I1204 22:34:45.537691 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" podStartSLOduration=3.5376714959999997 podStartE2EDuration="3.537671496s" podCreationTimestamp="2025-12-04 22:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:45.536582226 +0000 UTC m=+949.264107865" watchObservedRunningTime="2025-12-04 22:34:45.537671496 +0000 UTC m=+949.265197145" Dec 04 22:34:46.545833 master-0 kubenswrapper[33572]: I1204 22:34:46.545786 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" path="/var/lib/kubelet/pods/9d6a1d9e-6849-435b-aee9-caf23bce31bb/volumes" Dec 04 22:34:48.271260 master-0 kubenswrapper[33572]: I1204 22:34:48.271218 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:48.282354 master-0 kubenswrapper[33572]: I1204 22:34:48.282304 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:48.304007 master-0 kubenswrapper[33572]: I1204 22:34:48.303945 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:48.380878 master-0 kubenswrapper[33572]: I1204 22:34:48.380481 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wllp\" (UniqueName: \"kubernetes.io/projected/c4f6a92e-1188-4a7d-94d8-0989924196a7-kube-api-access-6wllp\") pod \"c4f6a92e-1188-4a7d-94d8-0989924196a7\" (UID: \"c4f6a92e-1188-4a7d-94d8-0989924196a7\") " Dec 04 22:34:48.380878 master-0 kubenswrapper[33572]: I1204 22:34:48.380688 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4f6a92e-1188-4a7d-94d8-0989924196a7-operator-scripts\") pod \"c4f6a92e-1188-4a7d-94d8-0989924196a7\" (UID: \"c4f6a92e-1188-4a7d-94d8-0989924196a7\") " Dec 04 22:34:48.382352 master-0 kubenswrapper[33572]: I1204 22:34:48.382259 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f6a92e-1188-4a7d-94d8-0989924196a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4f6a92e-1188-4a7d-94d8-0989924196a7" (UID: "c4f6a92e-1188-4a7d-94d8-0989924196a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:48.383335 master-0 kubenswrapper[33572]: I1204 22:34:48.383117 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4f6a92e-1188-4a7d-94d8-0989924196a7-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:48.386491 master-0 kubenswrapper[33572]: I1204 22:34:48.386450 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f6a92e-1188-4a7d-94d8-0989924196a7-kube-api-access-6wllp" (OuterVolumeSpecName: "kube-api-access-6wllp") pod "c4f6a92e-1188-4a7d-94d8-0989924196a7" (UID: "c4f6a92e-1188-4a7d-94d8-0989924196a7"). InnerVolumeSpecName "kube-api-access-6wllp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:48.389017 master-0 kubenswrapper[33572]: I1204 22:34:48.388801 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:48.485239 master-0 kubenswrapper[33572]: I1204 22:34:48.485193 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2a41736-e271-4aef-a67f-77937bc3e446-operator-scripts\") pod \"d2a41736-e271-4aef-a67f-77937bc3e446\" (UID: \"d2a41736-e271-4aef-a67f-77937bc3e446\") " Dec 04 22:34:48.485476 master-0 kubenswrapper[33572]: I1204 22:34:48.485348 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qjmp\" (UniqueName: \"kubernetes.io/projected/d2a41736-e271-4aef-a67f-77937bc3e446-kube-api-access-7qjmp\") pod \"d2a41736-e271-4aef-a67f-77937bc3e446\" (UID: \"d2a41736-e271-4aef-a67f-77937bc3e446\") " Dec 04 22:34:48.485476 master-0 kubenswrapper[33572]: I1204 22:34:48.485378 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z8sb\" (UniqueName: \"kubernetes.io/projected/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-kube-api-access-9z8sb\") pod \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\" (UID: \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\") " Dec 04 22:34:48.485476 master-0 kubenswrapper[33572]: I1204 22:34:48.485419 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-operator-scripts\") pod \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\" (UID: \"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49\") " Dec 04 22:34:48.485844 master-0 kubenswrapper[33572]: I1204 22:34:48.485814 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a41736-e271-4aef-a67f-77937bc3e446-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2a41736-e271-4aef-a67f-77937bc3e446" (UID: "d2a41736-e271-4aef-a67f-77937bc3e446"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:48.486224 master-0 kubenswrapper[33572]: I1204 22:34:48.485942 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wllp\" (UniqueName: \"kubernetes.io/projected/c4f6a92e-1188-4a7d-94d8-0989924196a7-kube-api-access-6wllp\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:48.486224 master-0 kubenswrapper[33572]: I1204 22:34:48.486177 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49" (UID: "29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:48.491642 master-0 kubenswrapper[33572]: I1204 22:34:48.491588 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-kube-api-access-9z8sb" (OuterVolumeSpecName: "kube-api-access-9z8sb") pod "29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49" (UID: "29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49"). InnerVolumeSpecName "kube-api-access-9z8sb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:48.495097 master-0 kubenswrapper[33572]: I1204 22:34:48.495044 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a41736-e271-4aef-a67f-77937bc3e446-kube-api-access-7qjmp" (OuterVolumeSpecName: "kube-api-access-7qjmp") pod "d2a41736-e271-4aef-a67f-77937bc3e446" (UID: "d2a41736-e271-4aef-a67f-77937bc3e446"). InnerVolumeSpecName "kube-api-access-7qjmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:48.553399 master-0 kubenswrapper[33572]: I1204 22:34:48.553277 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d6jvr" event={"ID":"d2a41736-e271-4aef-a67f-77937bc3e446","Type":"ContainerDied","Data":"cc783703bd4e8c644d00a85a57d13d3212a5ef6892e4370ab80917aee2ff6673"} Dec 04 22:34:48.553399 master-0 kubenswrapper[33572]: I1204 22:34:48.553311 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d6jvr" Dec 04 22:34:48.553611 master-0 kubenswrapper[33572]: I1204 22:34:48.553324 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc783703bd4e8c644d00a85a57d13d3212a5ef6892e4370ab80917aee2ff6673" Dec 04 22:34:48.555024 master-0 kubenswrapper[33572]: I1204 22:34:48.554974 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2ef1-account-create-update-nzfgp" Dec 04 22:34:48.555117 master-0 kubenswrapper[33572]: I1204 22:34:48.554969 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2ef1-account-create-update-nzfgp" event={"ID":"c4f6a92e-1188-4a7d-94d8-0989924196a7","Type":"ContainerDied","Data":"61e686563e1fb9cea9251ee0daca2b28d540278b2e686472351b9923018643d8"} Dec 04 22:34:48.555219 master-0 kubenswrapper[33572]: I1204 22:34:48.555170 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61e686563e1fb9cea9251ee0daca2b28d540278b2e686472351b9923018643d8" Dec 04 22:34:48.556513 master-0 kubenswrapper[33572]: I1204 22:34:48.556465 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-88ec-account-create-update-g22k5" event={"ID":"9c0fabb4-e989-4e6b-b5f1-27f23997389b","Type":"ContainerDied","Data":"ae0f57b4c6a3f110777e7793e5e5f43a37de57f3c22065c5900e0aebe4466e42"} Dec 04 22:34:48.556612 master-0 kubenswrapper[33572]: I1204 22:34:48.556597 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae0f57b4c6a3f110777e7793e5e5f43a37de57f3c22065c5900e0aebe4466e42" Dec 04 22:34:48.556726 master-0 kubenswrapper[33572]: I1204 22:34:48.556536 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-88ec-account-create-update-g22k5" Dec 04 22:34:48.557908 master-0 kubenswrapper[33572]: I1204 22:34:48.557850 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rln28" event={"ID":"2ff98c13-5ff9-4ffd-a779-e3121642011e","Type":"ContainerStarted","Data":"4431ddd9fe3466ed91a5948ba60b94aaa2fa445da95a9fcfbb56879b9423eb60"} Dec 04 22:34:48.559321 master-0 kubenswrapper[33572]: I1204 22:34:48.559280 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kjf8b" event={"ID":"29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49","Type":"ContainerDied","Data":"0601ddbb431836a04e26e7b6200a53cea6715c46520119938b3e03803eaabb2c"} Dec 04 22:34:48.559384 master-0 kubenswrapper[33572]: I1204 22:34:48.559299 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kjf8b" Dec 04 22:34:48.559384 master-0 kubenswrapper[33572]: I1204 22:34:48.559324 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0601ddbb431836a04e26e7b6200a53cea6715c46520119938b3e03803eaabb2c" Dec 04 22:34:48.592767 master-0 kubenswrapper[33572]: I1204 22:34:48.583627 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-rln28" podStartSLOduration=1.7007755709999999 podStartE2EDuration="6.583605742s" podCreationTimestamp="2025-12-04 22:34:42 +0000 UTC" firstStartedPulling="2025-12-04 22:34:43.166391626 +0000 UTC m=+946.893917275" lastFinishedPulling="2025-12-04 22:34:48.049221787 +0000 UTC m=+951.776747446" observedRunningTime="2025-12-04 22:34:48.582788729 +0000 UTC m=+952.310314418" watchObservedRunningTime="2025-12-04 22:34:48.583605742 +0000 UTC m=+952.311131391" Dec 04 22:34:48.592767 master-0 kubenswrapper[33572]: I1204 22:34:48.587230 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4vjx\" (UniqueName: \"kubernetes.io/projected/9c0fabb4-e989-4e6b-b5f1-27f23997389b-kube-api-access-x4vjx\") pod \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\" (UID: \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\") " Dec 04 22:34:48.592767 master-0 kubenswrapper[33572]: I1204 22:34:48.587392 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c0fabb4-e989-4e6b-b5f1-27f23997389b-operator-scripts\") pod \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\" (UID: \"9c0fabb4-e989-4e6b-b5f1-27f23997389b\") " Dec 04 22:34:48.592767 master-0 kubenswrapper[33572]: I1204 22:34:48.588247 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z8sb\" (UniqueName: \"kubernetes.io/projected/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-kube-api-access-9z8sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:48.592767 master-0 kubenswrapper[33572]: I1204 22:34:48.588277 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:48.592767 master-0 kubenswrapper[33572]: I1204 22:34:48.588298 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2a41736-e271-4aef-a67f-77937bc3e446-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:48.592767 master-0 kubenswrapper[33572]: I1204 22:34:48.588317 33572 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-7qjmp\" (UniqueName: \"kubernetes.io/projected/d2a41736-e271-4aef-a67f-77937bc3e446-kube-api-access-7qjmp\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:48.592767 master-0 kubenswrapper[33572]: I1204 22:34:48.589828 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c0fabb4-e989-4e6b-b5f1-27f23997389b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c0fabb4-e989-4e6b-b5f1-27f23997389b" (UID: "9c0fabb4-e989-4e6b-b5f1-27f23997389b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:48.595028 master-0 kubenswrapper[33572]: I1204 22:34:48.593047 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c0fabb4-e989-4e6b-b5f1-27f23997389b-kube-api-access-x4vjx" (OuterVolumeSpecName: "kube-api-access-x4vjx") pod "9c0fabb4-e989-4e6b-b5f1-27f23997389b" (UID: "9c0fabb4-e989-4e6b-b5f1-27f23997389b"). InnerVolumeSpecName "kube-api-access-x4vjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:48.695196 master-0 kubenswrapper[33572]: I1204 22:34:48.695107 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c0fabb4-e989-4e6b-b5f1-27f23997389b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:48.695196 master-0 kubenswrapper[33572]: I1204 22:34:48.695183 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4vjx\" (UniqueName: \"kubernetes.io/projected/9c0fabb4-e989-4e6b-b5f1-27f23997389b-kube-api-access-x4vjx\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:52.620693 master-0 kubenswrapper[33572]: I1204 22:34:52.620574 33572 generic.go:334] "Generic (PLEG): container finished" podID="2ff98c13-5ff9-4ffd-a779-e3121642011e" containerID="4431ddd9fe3466ed91a5948ba60b94aaa2fa445da95a9fcfbb56879b9423eb60" exitCode=0 Dec 04 22:34:52.621752 master-0 kubenswrapper[33572]: I1204 22:34:52.620691 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rln28" event={"ID":"2ff98c13-5ff9-4ffd-a779-e3121642011e","Type":"ContainerDied","Data":"4431ddd9fe3466ed91a5948ba60b94aaa2fa445da95a9fcfbb56879b9423eb60"} Dec 04 22:34:53.173853 master-0 kubenswrapper[33572]: I1204 22:34:53.173776 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:34:53.275582 master-0 kubenswrapper[33572]: I1204 22:34:53.275051 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5789dc4cf-2rbv6"] Dec 04 22:34:53.275582 master-0 kubenswrapper[33572]: I1204 22:34:53.275337 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" podUID="0797bdd1-388c-4686-982d-ce11deae84c9" containerName="dnsmasq-dns" containerID="cri-o://a77e2f6b23a1dd5188388200f1d1c1d2bc318e534b37f612da2a01f0daa29b1b" gracePeriod=10 Dec 04 22:34:53.644077 master-0 kubenswrapper[33572]: I1204 22:34:53.644022 33572 generic.go:334] "Generic (PLEG): container finished" podID="0797bdd1-388c-4686-982d-ce11deae84c9" containerID="a77e2f6b23a1dd5188388200f1d1c1d2bc318e534b37f612da2a01f0daa29b1b" exitCode=0 Dec 04 22:34:53.644619 master-0 kubenswrapper[33572]: I1204 22:34:53.644242 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" 
event={"ID":"0797bdd1-388c-4686-982d-ce11deae84c9","Type":"ContainerDied","Data":"a77e2f6b23a1dd5188388200f1d1c1d2bc318e534b37f612da2a01f0daa29b1b"} Dec 04 22:34:53.894368 master-0 kubenswrapper[33572]: I1204 22:34:53.894319 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:34:54.032956 master-0 kubenswrapper[33572]: I1204 22:34:54.032903 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kww6g\" (UniqueName: \"kubernetes.io/projected/0797bdd1-388c-4686-982d-ce11deae84c9-kube-api-access-kww6g\") pod \"0797bdd1-388c-4686-982d-ce11deae84c9\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " Dec 04 22:34:54.033082 master-0 kubenswrapper[33572]: I1204 22:34:54.032973 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-nb\") pod \"0797bdd1-388c-4686-982d-ce11deae84c9\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " Dec 04 22:34:54.033082 master-0 kubenswrapper[33572]: I1204 22:34:54.033063 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-sb\") pod \"0797bdd1-388c-4686-982d-ce11deae84c9\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " Dec 04 22:34:54.033159 master-0 kubenswrapper[33572]: I1204 22:34:54.033098 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-dns-svc\") pod \"0797bdd1-388c-4686-982d-ce11deae84c9\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " Dec 04 22:34:54.033575 master-0 kubenswrapper[33572]: I1204 22:34:54.033485 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-config\") pod \"0797bdd1-388c-4686-982d-ce11deae84c9\" (UID: \"0797bdd1-388c-4686-982d-ce11deae84c9\") " Dec 04 22:34:54.036038 master-0 kubenswrapper[33572]: I1204 22:34:54.035994 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0797bdd1-388c-4686-982d-ce11deae84c9-kube-api-access-kww6g" (OuterVolumeSpecName: "kube-api-access-kww6g") pod "0797bdd1-388c-4686-982d-ce11deae84c9" (UID: "0797bdd1-388c-4686-982d-ce11deae84c9"). InnerVolumeSpecName "kube-api-access-kww6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:54.080189 master-0 kubenswrapper[33572]: I1204 22:34:54.080116 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0797bdd1-388c-4686-982d-ce11deae84c9" (UID: "0797bdd1-388c-4686-982d-ce11deae84c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:54.086236 master-0 kubenswrapper[33572]: I1204 22:34:54.086146 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0797bdd1-388c-4686-982d-ce11deae84c9" (UID: "0797bdd1-388c-4686-982d-ce11deae84c9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:54.111913 master-0 kubenswrapper[33572]: I1204 22:34:54.111853 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0797bdd1-388c-4686-982d-ce11deae84c9" (UID: "0797bdd1-388c-4686-982d-ce11deae84c9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:54.112451 master-0 kubenswrapper[33572]: I1204 22:34:54.112413 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:54.138703 master-0 kubenswrapper[33572]: I1204 22:34:54.138633 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-combined-ca-bundle\") pod \"2ff98c13-5ff9-4ffd-a779-e3121642011e\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " Dec 04 22:34:54.138957 master-0 kubenswrapper[33572]: I1204 22:34:54.138832 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxr5t\" (UniqueName: \"kubernetes.io/projected/2ff98c13-5ff9-4ffd-a779-e3121642011e-kube-api-access-bxr5t\") pod \"2ff98c13-5ff9-4ffd-a779-e3121642011e\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " Dec 04 22:34:54.141006 master-0 kubenswrapper[33572]: I1204 22:34:54.140952 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kww6g\" (UniqueName: \"kubernetes.io/projected/0797bdd1-388c-4686-982d-ce11deae84c9-kube-api-access-kww6g\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:54.141092 master-0 kubenswrapper[33572]: I1204 22:34:54.141003 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:54.141092 master-0 kubenswrapper[33572]: I1204 22:34:54.141031 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:54.141092 master-0 kubenswrapper[33572]: I1204 22:34:54.141055 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:54.144250 master-0 kubenswrapper[33572]: I1204 22:34:54.144180 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff98c13-5ff9-4ffd-a779-e3121642011e-kube-api-access-bxr5t" (OuterVolumeSpecName: "kube-api-access-bxr5t") pod "2ff98c13-5ff9-4ffd-a779-e3121642011e" (UID: "2ff98c13-5ff9-4ffd-a779-e3121642011e"). InnerVolumeSpecName "kube-api-access-bxr5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:54.147055 master-0 kubenswrapper[33572]: I1204 22:34:54.146995 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-config" (OuterVolumeSpecName: "config") pod "0797bdd1-388c-4686-982d-ce11deae84c9" (UID: "0797bdd1-388c-4686-982d-ce11deae84c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:54.173831 master-0 kubenswrapper[33572]: I1204 22:34:54.173619 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff98c13-5ff9-4ffd-a779-e3121642011e" (UID: "2ff98c13-5ff9-4ffd-a779-e3121642011e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:54.242042 master-0 kubenswrapper[33572]: I1204 22:34:54.241930 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-config-data\") pod \"2ff98c13-5ff9-4ffd-a779-e3121642011e\" (UID: \"2ff98c13-5ff9-4ffd-a779-e3121642011e\") " Dec 04 22:34:54.242863 master-0 kubenswrapper[33572]: I1204 22:34:54.242812 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0797bdd1-388c-4686-982d-ce11deae84c9-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:54.242863 master-0 kubenswrapper[33572]: I1204 22:34:54.242851 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:54.243076 master-0 kubenswrapper[33572]: I1204 22:34:54.242876 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxr5t\" (UniqueName: \"kubernetes.io/projected/2ff98c13-5ff9-4ffd-a779-e3121642011e-kube-api-access-bxr5t\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:54.317544 master-0 kubenswrapper[33572]: I1204 22:34:54.317404 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-config-data" (OuterVolumeSpecName: "config-data") pod "2ff98c13-5ff9-4ffd-a779-e3121642011e" (UID: "2ff98c13-5ff9-4ffd-a779-e3121642011e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:54.344193 master-0 kubenswrapper[33572]: I1204 22:34:54.344136 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff98c13-5ff9-4ffd-a779-e3121642011e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:54.682681 master-0 kubenswrapper[33572]: I1204 22:34:54.682585 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" event={"ID":"0797bdd1-388c-4686-982d-ce11deae84c9","Type":"ContainerDied","Data":"438ca4995d6d21e65fd08cf2b07037f869568400e627146f43547dc9656630ce"} Dec 04 22:34:54.683746 master-0 kubenswrapper[33572]: I1204 22:34:54.682995 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5789dc4cf-2rbv6" Dec 04 22:34:54.684854 master-0 kubenswrapper[33572]: I1204 22:34:54.683823 33572 scope.go:117] "RemoveContainer" containerID="a77e2f6b23a1dd5188388200f1d1c1d2bc318e534b37f612da2a01f0daa29b1b" Dec 04 22:34:54.688224 master-0 kubenswrapper[33572]: I1204 22:34:54.688173 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-rln28" event={"ID":"2ff98c13-5ff9-4ffd-a779-e3121642011e","Type":"ContainerDied","Data":"ee9d78222c86f8644d8bcd07da12fad5dff5416feb415bf67cfcf0a7819f8244"} Dec 04 22:34:54.688447 master-0 kubenswrapper[33572]: I1204 22:34:54.688416 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee9d78222c86f8644d8bcd07da12fad5dff5416feb415bf67cfcf0a7819f8244" Dec 04 22:34:54.690468 master-0 kubenswrapper[33572]: I1204 22:34:54.688234 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-rln28" Dec 04 22:34:54.744226 master-0 kubenswrapper[33572]: I1204 22:34:54.744188 33572 scope.go:117] "RemoveContainer" containerID="ee07aae4310cb02d674cb636221f915e8a92620e0383a42c6e782c8c34272891" Dec 04 22:34:54.751712 master-0 kubenswrapper[33572]: I1204 22:34:54.751638 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5789dc4cf-2rbv6"] Dec 04 22:34:54.760962 master-0 kubenswrapper[33572]: I1204 22:34:54.760905 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5789dc4cf-2rbv6"] Dec 04 22:34:55.422251 master-0 kubenswrapper[33572]: I1204 22:34:55.422186 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b7zzl"] Dec 04 22:34:55.423026 master-0 kubenswrapper[33572]: E1204 22:34:55.423011 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" containerName="init" Dec 04 22:34:55.425839 master-0 kubenswrapper[33572]: I1204 22:34:55.425757 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" containerName="init" Dec 04 22:34:55.426012 master-0 kubenswrapper[33572]: E1204 22:34:55.425992 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49" containerName="mariadb-database-create" Dec 04 22:34:55.426162 master-0 kubenswrapper[33572]: I1204 22:34:55.426150 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49" containerName="mariadb-database-create" Dec 04 22:34:55.426282 master-0 kubenswrapper[33572]: E1204 22:34:55.426259 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a41736-e271-4aef-a67f-77937bc3e446" containerName="mariadb-database-create" Dec 04 22:34:55.426404 master-0 kubenswrapper[33572]: I1204 22:34:55.426386 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a41736-e271-4aef-a67f-77937bc3e446" containerName="mariadb-database-create" Dec 04 22:34:55.426591 master-0 kubenswrapper[33572]: E1204 22:34:55.426578 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff98c13-5ff9-4ffd-a779-e3121642011e" containerName="keystone-db-sync" Dec 04 22:34:55.437230 master-0 kubenswrapper[33572]: I1204 22:34:55.437188 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff98c13-5ff9-4ffd-a779-e3121642011e" containerName="keystone-db-sync" Dec 04 22:34:55.437485 master-0 kubenswrapper[33572]: E1204 22:34:55.437474 33572 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0797bdd1-388c-4686-982d-ce11deae84c9" containerName="dnsmasq-dns" Dec 04 22:34:55.437581 master-0 kubenswrapper[33572]: I1204 22:34:55.437571 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0797bdd1-388c-4686-982d-ce11deae84c9" containerName="dnsmasq-dns" Dec 04 22:34:55.437691 master-0 kubenswrapper[33572]: E1204 22:34:55.437680 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" containerName="dnsmasq-dns" Dec 04 22:34:55.437791 master-0 kubenswrapper[33572]: I1204 22:34:55.437738 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" containerName="dnsmasq-dns" Dec 04 22:34:55.437865 master-0 kubenswrapper[33572]: E1204 22:34:55.437855 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0797bdd1-388c-4686-982d-ce11deae84c9" containerName="init" Dec 04 22:34:55.437924 master-0 kubenswrapper[33572]: I1204 22:34:55.437915 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0797bdd1-388c-4686-982d-ce11deae84c9" containerName="init" Dec 04 22:34:55.438037 master-0 kubenswrapper[33572]: E1204 22:34:55.438026 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c0fabb4-e989-4e6b-b5f1-27f23997389b" containerName="mariadb-account-create-update" Dec 04 22:34:55.438119 master-0 kubenswrapper[33572]: I1204 22:34:55.438108 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0fabb4-e989-4e6b-b5f1-27f23997389b" containerName="mariadb-account-create-update" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: E1204 22:34:55.438451 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f6a92e-1188-4a7d-94d8-0989924196a7" containerName="mariadb-account-create-update" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: I1204 22:34:55.438509 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f6a92e-1188-4a7d-94d8-0989924196a7" containerName="mariadb-account-create-update" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: I1204 22:34:55.439052 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff98c13-5ff9-4ffd-a779-e3121642011e" containerName="keystone-db-sync" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: I1204 22:34:55.439076 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d6a1d9e-6849-435b-aee9-caf23bce31bb" containerName="dnsmasq-dns" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: I1204 22:34:55.439099 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c0fabb4-e989-4e6b-b5f1-27f23997389b" containerName="mariadb-account-create-update" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: I1204 22:34:55.439107 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4f6a92e-1188-4a7d-94d8-0989924196a7" containerName="mariadb-account-create-update" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: I1204 22:34:55.439144 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="0797bdd1-388c-4686-982d-ce11deae84c9" containerName="dnsmasq-dns" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: I1204 22:34:55.439157 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a41736-e271-4aef-a67f-77937bc3e446" containerName="mariadb-database-create" Dec 04 22:34:55.439211 master-0 kubenswrapper[33572]: I1204 22:34:55.439166 33572 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49" containerName="mariadb-database-create" Dec 04 22:34:55.448654 master-0 kubenswrapper[33572]: I1204 22:34:55.440148 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7db899889c-wtqrt"] Dec 04 22:34:55.448654 master-0 kubenswrapper[33572]: I1204 22:34:55.440314 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.448654 master-0 kubenswrapper[33572]: I1204 22:34:55.443610 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 22:34:55.448654 master-0 kubenswrapper[33572]: I1204 22:34:55.444032 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 22:34:55.448654 master-0 kubenswrapper[33572]: I1204 22:34:55.444243 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 22:34:55.448654 master-0 kubenswrapper[33572]: I1204 22:34:55.445650 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 22:34:55.455601 master-0 kubenswrapper[33572]: I1204 22:34:55.455320 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b7zzl"] Dec 04 22:34:55.455601 master-0 kubenswrapper[33572]: I1204 22:34:55.455428 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.462417 master-0 kubenswrapper[33572]: I1204 22:34:55.461673 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7db899889c-wtqrt"] Dec 04 22:34:55.563898 master-0 kubenswrapper[33572]: I1204 22:34:55.558997 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-xd84n"] Dec 04 22:34:55.578262 master-0 kubenswrapper[33572]: I1204 22:34:55.577293 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:55.606162 master-0 kubenswrapper[33572]: I1204 22:34:55.605262 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-db-sync-d9l4w"] Dec 04 22:34:55.609657 master-0 kubenswrapper[33572]: I1204 22:34:55.606703 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.609657 master-0 kubenswrapper[33572]: I1204 22:34:55.608943 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-config-data" Dec 04 22:34:55.609657 master-0 kubenswrapper[33572]: I1204 22:34:55.609138 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-scripts" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.638738 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-combined-ca-bundle\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.638800 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-nb\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.638828 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-combined-ca-bundle\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.638871 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-swift-storage-0\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.638895 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdw4\" (UniqueName: \"kubernetes.io/projected/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-kube-api-access-mfdw4\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.638918 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-db-sync-config-data\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.638978 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-config-data\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.638998 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-vwtc7\" (UniqueName: \"kubernetes.io/projected/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-kube-api-access-vwtc7\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.639029 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsm67\" (UniqueName: \"kubernetes.io/projected/d61c73b1-2107-4afb-ba37-c185b699095b-kube-api-access-dsm67\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.639085 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-credential-keys\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.639104 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-etc-machine-id\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.639124 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-fernet-keys\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.639142 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8733e1a3-7b35-40cb-b129-536fe10961a8-operator-scripts\") pod \"ironic-db-create-xd84n\" (UID: \"8733e1a3-7b35-40cb-b129-536fe10961a8\") " pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.639160 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-scripts\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.639160 master-0 kubenswrapper[33572]: I1204 22:34:55.639176 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-svc\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.640074 master-0 kubenswrapper[33572]: I1204 22:34:55.639198 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-sb\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " 
pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.641733 master-0 kubenswrapper[33572]: I1204 22:34:55.640851 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lx2q\" (UniqueName: \"kubernetes.io/projected/8733e1a3-7b35-40cb-b129-536fe10961a8-kube-api-access-9lx2q\") pod \"ironic-db-create-xd84n\" (UID: \"8733e1a3-7b35-40cb-b129-536fe10961a8\") " pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:55.641733 master-0 kubenswrapper[33572]: I1204 22:34:55.641062 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-scripts\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.641733 master-0 kubenswrapper[33572]: I1204 22:34:55.641126 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-config\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.641733 master-0 kubenswrapper[33572]: I1204 22:34:55.641164 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-config-data\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.669029 master-0 kubenswrapper[33572]: I1204 22:34:55.667611 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-xd84n"] Dec 04 22:34:55.673755 master-0 kubenswrapper[33572]: I1204 22:34:55.673714 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-db-sync-d9l4w"] Dec 04 22:34:55.685326 master-0 kubenswrapper[33572]: I1204 22:34:55.685253 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-lxrlk"] Dec 04 22:34:55.687129 master-0 kubenswrapper[33572]: I1204 22:34:55.687093 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.692525 master-0 kubenswrapper[33572]: I1204 22:34:55.689359 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 22:34:55.692872 master-0 kubenswrapper[33572]: I1204 22:34:55.692838 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745620 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-db-sync-config-data\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745714 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-config-data\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745752 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwtc7\" (UniqueName: \"kubernetes.io/projected/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-kube-api-access-vwtc7\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745784 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsm67\" (UniqueName: \"kubernetes.io/projected/d61c73b1-2107-4afb-ba37-c185b699095b-kube-api-access-dsm67\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745855 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-credential-keys\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745884 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-etc-machine-id\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745906 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-fernet-keys\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745928 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8733e1a3-7b35-40cb-b129-536fe10961a8-operator-scripts\") pod \"ironic-db-create-xd84n\" (UID: 
\"8733e1a3-7b35-40cb-b129-536fe10961a8\") " pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745949 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-scripts\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745967 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-svc\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.745988 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-sb\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746010 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m92n\" (UniqueName: \"kubernetes.io/projected/2609a439-7ac0-4253-a74e-e4c90a023832-kube-api-access-9m92n\") pod \"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746086 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lx2q\" (UniqueName: \"kubernetes.io/projected/8733e1a3-7b35-40cb-b129-536fe10961a8-kube-api-access-9lx2q\") pod \"ironic-db-create-xd84n\" (UID: \"8733e1a3-7b35-40cb-b129-536fe10961a8\") " pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746111 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-scripts\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746140 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-config\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746155 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-config-data\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746202 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-combined-ca-bundle\") pod 
\"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746226 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-config\") pod \"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746244 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-combined-ca-bundle\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746259 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-nb\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746280 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-combined-ca-bundle\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746325 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-swift-storage-0\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.750698 master-0 kubenswrapper[33572]: I1204 22:34:55.746351 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdw4\" (UniqueName: \"kubernetes.io/projected/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-kube-api-access-mfdw4\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.751963 master-0 kubenswrapper[33572]: I1204 22:34:55.751887 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-etc-machine-id\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.755017 master-0 kubenswrapper[33572]: I1204 22:34:55.754829 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8733e1a3-7b35-40cb-b129-536fe10961a8-operator-scripts\") pod \"ironic-db-create-xd84n\" (UID: \"8733e1a3-7b35-40cb-b129-536fe10961a8\") " pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:55.755017 master-0 kubenswrapper[33572]: I1204 22:34:55.754845 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-credential-keys\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.755490 master-0 kubenswrapper[33572]: I1204 22:34:55.755391 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-config-data\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.755490 master-0 kubenswrapper[33572]: I1204 22:34:55.755425 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-nb\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.755628 master-0 kubenswrapper[33572]: I1204 22:34:55.755597 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-swift-storage-0\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.758770 master-0 kubenswrapper[33572]: I1204 22:34:55.756274 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-sb\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.758770 master-0 kubenswrapper[33572]: I1204 22:34:55.756448 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-svc\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.758770 master-0 kubenswrapper[33572]: I1204 22:34:55.756751 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-scripts\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.758770 master-0 kubenswrapper[33572]: I1204 22:34:55.757016 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-config-data\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.760209 master-0 kubenswrapper[33572]: I1204 22:34:55.760071 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-config\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.760517 master-0 kubenswrapper[33572]: I1204 22:34:55.760473 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-lxrlk"] Dec 04 22:34:55.761173 master-0 kubenswrapper[33572]: I1204 22:34:55.761120 33572 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-combined-ca-bundle\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.767433 master-0 kubenswrapper[33572]: I1204 22:34:55.767396 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-scripts\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.767953 master-0 kubenswrapper[33572]: I1204 22:34:55.767908 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-combined-ca-bundle\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.770295 master-0 kubenswrapper[33572]: I1204 22:34:55.770269 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdw4\" (UniqueName: \"kubernetes.io/projected/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-kube-api-access-mfdw4\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.780074 master-0 kubenswrapper[33572]: I1204 22:34:55.774603 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-ab6c-account-create-update-vs69m"] Dec 04 22:34:55.780074 master-0 kubenswrapper[33572]: I1204 22:34:55.778032 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-db-sync-config-data\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.780074 master-0 kubenswrapper[33572]: I1204 22:34:55.778268 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:55.786896 master-0 kubenswrapper[33572]: I1204 22:34:55.786825 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-fernet-keys\") pod \"keystone-bootstrap-b7zzl\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:55.788158 master-0 kubenswrapper[33572]: I1204 22:34:55.788121 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-ab6c-account-create-update-vs69m"] Dec 04 22:34:55.814703 master-0 kubenswrapper[33572]: I1204 22:34:55.809245 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsm67\" (UniqueName: \"kubernetes.io/projected/d61c73b1-2107-4afb-ba37-c185b699095b-kube-api-access-dsm67\") pod \"dnsmasq-dns-7db899889c-wtqrt\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.814703 master-0 kubenswrapper[33572]: I1204 22:34:55.809287 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Dec 04 22:34:55.826280 master-0 kubenswrapper[33572]: I1204 22:34:55.822015 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwtc7\" (UniqueName: \"kubernetes.io/projected/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-kube-api-access-vwtc7\") pod \"cinder-7675d-db-sync-d9l4w\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.853538 master-0 kubenswrapper[33572]: I1204 22:34:55.828134 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lx2q\" (UniqueName: \"kubernetes.io/projected/8733e1a3-7b35-40cb-b129-536fe10961a8-kube-api-access-9lx2q\") pod \"ironic-db-create-xd84n\" (UID: \"8733e1a3-7b35-40cb-b129-536fe10961a8\") " pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:55.853538 master-0 kubenswrapper[33572]: I1204 22:34:55.847382 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m92n\" (UniqueName: \"kubernetes.io/projected/2609a439-7ac0-4253-a74e-e4c90a023832-kube-api-access-9m92n\") pod \"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.853538 master-0 kubenswrapper[33572]: I1204 22:34:55.847447 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q24zl\" (UniqueName: \"kubernetes.io/projected/2a267444-303d-4eaf-98f5-01b11c7efe8f-kube-api-access-q24zl\") pod \"ironic-ab6c-account-create-update-vs69m\" (UID: \"2a267444-303d-4eaf-98f5-01b11c7efe8f\") " pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:55.853538 master-0 kubenswrapper[33572]: I1204 22:34:55.847485 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a267444-303d-4eaf-98f5-01b11c7efe8f-operator-scripts\") pod \"ironic-ab6c-account-create-update-vs69m\" (UID: \"2a267444-303d-4eaf-98f5-01b11c7efe8f\") " pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:55.853538 master-0 kubenswrapper[33572]: I1204 22:34:55.847547 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-combined-ca-bundle\") pod \"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.853538 master-0 kubenswrapper[33572]: I1204 22:34:55.847567 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-config\") pod \"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.866609 master-0 kubenswrapper[33572]: I1204 22:34:55.857009 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:55.873369 master-0 kubenswrapper[33572]: I1204 22:34:55.873314 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-combined-ca-bundle\") pod \"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.900533 master-0 kubenswrapper[33572]: I1204 22:34:55.876337 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m92n\" (UniqueName: \"kubernetes.io/projected/2609a439-7ac0-4253-a74e-e4c90a023832-kube-api-access-9m92n\") pod \"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.900533 master-0 kubenswrapper[33572]: I1204 22:34:55.891361 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-config\") pod \"neutron-db-sync-lxrlk\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:55.942768 master-0 kubenswrapper[33572]: I1204 22:34:55.939299 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:55.949689 master-0 kubenswrapper[33572]: I1204 22:34:55.948458 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q24zl\" (UniqueName: \"kubernetes.io/projected/2a267444-303d-4eaf-98f5-01b11c7efe8f-kube-api-access-q24zl\") pod \"ironic-ab6c-account-create-update-vs69m\" (UID: \"2a267444-303d-4eaf-98f5-01b11c7efe8f\") " pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:55.949689 master-0 kubenswrapper[33572]: I1204 22:34:55.948534 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a267444-303d-4eaf-98f5-01b11c7efe8f-operator-scripts\") pod \"ironic-ab6c-account-create-update-vs69m\" (UID: \"2a267444-303d-4eaf-98f5-01b11c7efe8f\") " pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:55.949689 master-0 kubenswrapper[33572]: I1204 22:34:55.949435 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a267444-303d-4eaf-98f5-01b11c7efe8f-operator-scripts\") pod \"ironic-ab6c-account-create-update-vs69m\" (UID: \"2a267444-303d-4eaf-98f5-01b11c7efe8f\") " pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:55.966299 master-0 kubenswrapper[33572]: I1204 22:34:55.966209 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q24zl\" (UniqueName: \"kubernetes.io/projected/2a267444-303d-4eaf-98f5-01b11c7efe8f-kube-api-access-q24zl\") pod \"ironic-ab6c-account-create-update-vs69m\" (UID: \"2a267444-303d-4eaf-98f5-01b11c7efe8f\") " pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:55.969179 master-0 kubenswrapper[33572]: I1204 22:34:55.969146 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:34:55.986896 master-0 kubenswrapper[33572]: I1204 22:34:55.981573 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:55.997475 master-0 kubenswrapper[33572]: I1204 22:34:55.995565 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7db899889c-wtqrt"] Dec 04 22:34:56.024090 master-0 kubenswrapper[33572]: I1204 22:34:56.024037 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:34:56.030127 master-0 kubenswrapper[33572]: I1204 22:34:56.030051 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ms5vx"] Dec 04 22:34:56.032792 master-0 kubenswrapper[33572]: I1204 22:34:56.032402 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.035307 master-0 kubenswrapper[33572]: I1204 22:34:56.035275 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 22:34:56.035561 master-0 kubenswrapper[33572]: I1204 22:34:56.035542 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 22:34:56.043525 master-0 kubenswrapper[33572]: I1204 22:34:56.043476 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ms5vx"] Dec 04 22:34:56.054366 master-0 kubenswrapper[33572]: I1204 22:34:56.051556 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-scripts\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.054366 master-0 kubenswrapper[33572]: I1204 22:34:56.051610 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-combined-ca-bundle\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.054366 master-0 kubenswrapper[33572]: I1204 22:34:56.051633 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rs6\" (UniqueName: \"kubernetes.io/projected/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-kube-api-access-94rs6\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.054366 master-0 kubenswrapper[33572]: I1204 22:34:56.051796 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-config-data\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.054366 master-0 kubenswrapper[33572]: I1204 22:34:56.051820 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-logs\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.081053 master-0 kubenswrapper[33572]: I1204 22:34:56.080834 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:34:56.087544 master-0 kubenswrapper[33572]: I1204 22:34:56.084595 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d7495797c-5fbwv"] Dec 04 22:34:56.087544 master-0 kubenswrapper[33572]: I1204 22:34:56.086338 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.131132 master-0 kubenswrapper[33572]: I1204 22:34:56.130655 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d7495797c-5fbwv"] Dec 04 22:34:56.153181 master-0 kubenswrapper[33572]: I1204 22:34:56.153113 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-config-data\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.153181 master-0 kubenswrapper[33572]: I1204 22:34:56.153167 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-logs\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.153978 master-0 kubenswrapper[33572]: I1204 22:34:56.153647 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-scripts\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.153978 master-0 kubenswrapper[33572]: I1204 22:34:56.153722 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-combined-ca-bundle\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.154053 master-0 kubenswrapper[33572]: I1204 22:34:56.154026 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-logs\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.154421 master-0 kubenswrapper[33572]: I1204 22:34:56.154224 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rs6\" (UniqueName: \"kubernetes.io/projected/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-kube-api-access-94rs6\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.158932 master-0 kubenswrapper[33572]: I1204 22:34:56.158902 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-scripts\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.167474 master-0 kubenswrapper[33572]: I1204 22:34:56.166955 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-config-data\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.168935 master-0 kubenswrapper[33572]: I1204 22:34:56.168890 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rs6\" (UniqueName: 
\"kubernetes.io/projected/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-kube-api-access-94rs6\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.184937 master-0 kubenswrapper[33572]: I1204 22:34:56.184897 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-combined-ca-bundle\") pod \"placement-db-sync-ms5vx\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.256226 master-0 kubenswrapper[33572]: I1204 22:34:56.256179 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-swift-storage-0\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.256226 master-0 kubenswrapper[33572]: I1204 22:34:56.256237 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-config\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.256484 master-0 kubenswrapper[33572]: I1204 22:34:56.256261 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-nb\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.256484 master-0 kubenswrapper[33572]: I1204 22:34:56.256288 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-svc\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.256484 master-0 kubenswrapper[33572]: I1204 22:34:56.256312 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-sb\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.256484 master-0 kubenswrapper[33572]: I1204 22:34:56.256330 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5g4\" (UniqueName: \"kubernetes.io/projected/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-kube-api-access-xs5g4\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.360360 master-0 kubenswrapper[33572]: I1204 22:34:56.360312 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-config\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.360485 master-0 kubenswrapper[33572]: 
I1204 22:34:56.360354 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ms5vx" Dec 04 22:34:56.360485 master-0 kubenswrapper[33572]: I1204 22:34:56.360362 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-nb\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.361214 master-0 kubenswrapper[33572]: I1204 22:34:56.361183 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-nb\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.365010 master-0 kubenswrapper[33572]: I1204 22:34:56.364491 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-svc\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.365100 master-0 kubenswrapper[33572]: I1204 22:34:56.365057 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-sb\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.365100 master-0 kubenswrapper[33572]: I1204 22:34:56.365093 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5g4\" (UniqueName: \"kubernetes.io/projected/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-kube-api-access-xs5g4\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.365238 master-0 kubenswrapper[33572]: I1204 22:34:56.365207 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-config\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.365571 master-0 kubenswrapper[33572]: I1204 22:34:56.365474 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-swift-storage-0\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.367614 master-0 kubenswrapper[33572]: I1204 22:34:56.367309 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-swift-storage-0\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.372146 master-0 kubenswrapper[33572]: I1204 22:34:56.371386 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-svc\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.384924 master-0 kubenswrapper[33572]: I1204 22:34:56.383177 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-sb\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.404048 master-0 kubenswrapper[33572]: I1204 22:34:56.403997 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5g4\" (UniqueName: \"kubernetes.io/projected/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-kube-api-access-xs5g4\") pod \"dnsmasq-dns-d7495797c-5fbwv\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.425060 master-0 kubenswrapper[33572]: I1204 22:34:56.425008 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:56.506846 master-0 kubenswrapper[33572]: I1204 22:34:56.503388 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7db899889c-wtqrt"] Dec 04 22:34:56.539526 master-0 kubenswrapper[33572]: I1204 22:34:56.539465 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0797bdd1-388c-4686-982d-ce11deae84c9" path="/var/lib/kubelet/pods/0797bdd1-388c-4686-982d-ce11deae84c9/volumes" Dec 04 22:34:56.677518 master-0 kubenswrapper[33572]: I1204 22:34:56.677126 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-xd84n"] Dec 04 22:34:56.689572 master-0 kubenswrapper[33572]: I1204 22:34:56.687531 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-ab6c-account-create-update-vs69m"] Dec 04 22:34:56.703552 master-0 kubenswrapper[33572]: W1204 22:34:56.703471 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a267444_303d_4eaf_98f5_01b11c7efe8f.slice/crio-17156bab2c23d7de2097c515ea3e47d0a1a21869207a6a93fcfa55048b99b4c3 WatchSource:0}: Error finding container 17156bab2c23d7de2097c515ea3e47d0a1a21869207a6a93fcfa55048b99b4c3: Status 404 returned error can't find the container with id 17156bab2c23d7de2097c515ea3e47d0a1a21869207a6a93fcfa55048b99b4c3 Dec 04 22:34:56.713898 master-0 kubenswrapper[33572]: I1204 22:34:56.713850 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Dec 04 22:34:56.723493 master-0 kubenswrapper[33572]: W1204 22:34:56.722993 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8733e1a3_7b35_40cb_b129_536fe10961a8.slice/crio-635db6de4acea0d4e99aea3654170143562ee3361b6b7748fb83fa63669943bd WatchSource:0}: Error finding container 635db6de4acea0d4e99aea3654170143562ee3361b6b7748fb83fa63669943bd: Status 404 returned error can't find the container with id 635db6de4acea0d4e99aea3654170143562ee3361b6b7748fb83fa63669943bd Dec 04 22:34:56.787137 master-0 kubenswrapper[33572]: I1204 22:34:56.786998 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7db899889c-wtqrt" 
event={"ID":"d61c73b1-2107-4afb-ba37-c185b699095b","Type":"ContainerStarted","Data":"a5e072bbc201f295d561aa60312ec6c98a1bff59f116233f25df55666f91b824"} Dec 04 22:34:56.792611 master-0 kubenswrapper[33572]: I1204 22:34:56.792544 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-ab6c-account-create-update-vs69m" event={"ID":"2a267444-303d-4eaf-98f5-01b11c7efe8f","Type":"ContainerStarted","Data":"17156bab2c23d7de2097c515ea3e47d0a1a21869207a6a93fcfa55048b99b4c3"} Dec 04 22:34:56.799935 master-0 kubenswrapper[33572]: I1204 22:34:56.798597 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-xd84n" event={"ID":"8733e1a3-7b35-40cb-b129-536fe10961a8","Type":"ContainerStarted","Data":"635db6de4acea0d4e99aea3654170143562ee3361b6b7748fb83fa63669943bd"} Dec 04 22:34:57.174605 master-0 kubenswrapper[33572]: I1204 22:34:57.174240 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-db-sync-d9l4w"] Dec 04 22:34:57.181025 master-0 kubenswrapper[33572]: W1204 22:34:57.179819 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2609a439_7ac0_4253_a74e_e4c90a023832.slice/crio-6e082596aabe4ae5272edacf962d17659440664e3ccb24f2c8eacfe799775895 WatchSource:0}: Error finding container 6e082596aabe4ae5272edacf962d17659440664e3ccb24f2c8eacfe799775895: Status 404 returned error can't find the container with id 6e082596aabe4ae5272edacf962d17659440664e3ccb24f2c8eacfe799775895 Dec 04 22:34:57.198671 master-0 kubenswrapper[33572]: W1204 22:34:57.197358 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd66b1f8f_3a86_442f_bdec_422e5a6e03ee.slice/crio-be4c08f6200e18a631c7a9bfef072f6b1c1e4f4f14b95bb4b4ae17aebd07ea90 WatchSource:0}: Error finding container be4c08f6200e18a631c7a9bfef072f6b1c1e4f4f14b95bb4b4ae17aebd07ea90: Status 404 returned error can't find the container with id be4c08f6200e18a631c7a9bfef072f6b1c1e4f4f14b95bb4b4ae17aebd07ea90 Dec 04 22:34:57.199372 master-0 kubenswrapper[33572]: I1204 22:34:57.199316 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-lxrlk"] Dec 04 22:34:57.207429 master-0 kubenswrapper[33572]: I1204 22:34:57.207377 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 22:34:57.211615 master-0 kubenswrapper[33572]: I1204 22:34:57.211555 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b7zzl"] Dec 04 22:34:57.214933 master-0 kubenswrapper[33572]: W1204 22:34:57.214883 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cdb5e15_32e4_4257_be61_ec0ba1a6884e.slice/crio-2cf56dddc2abb084322e094fa18d088ee51a6200b47ed45fbd471da397b778d6 WatchSource:0}: Error finding container 2cf56dddc2abb084322e094fa18d088ee51a6200b47ed45fbd471da397b778d6: Status 404 returned error can't find the container with id 2cf56dddc2abb084322e094fa18d088ee51a6200b47ed45fbd471da397b778d6 Dec 04 22:34:57.228263 master-0 kubenswrapper[33572]: I1204 22:34:57.221720 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ms5vx"] Dec 04 22:34:57.228263 master-0 kubenswrapper[33572]: I1204 22:34:57.228043 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d7495797c-5fbwv"] Dec 04 22:34:57.508754 master-0 
kubenswrapper[33572]: E1204 22:34:57.508664 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8733e1a3_7b35_40cb_b129_536fe10961a8.slice/crio-abdd217b047e5958a981618bd0d5cac76014fb5b033a88a73333a5f1d1e58c17.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a267444_303d_4eaf_98f5_01b11c7efe8f.slice/crio-conmon-d689486f9be58a6935f936459fe33002b1b01666be0772cbeb95fabb72de0703.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:34:57.561959 master-0 kubenswrapper[33572]: I1204 22:34:57.561751 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:34:57.566371 master-0 kubenswrapper[33572]: I1204 22:34:57.566325 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.569548 master-0 kubenswrapper[33572]: I1204 22:34:57.569223 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 22:34:57.569613 master-0 kubenswrapper[33572]: I1204 22:34:57.569582 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-7675d-default-external-config-data" Dec 04 22:34:57.570647 master-0 kubenswrapper[33572]: I1204 22:34:57.570602 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 22:34:57.583977 master-0 kubenswrapper[33572]: I1204 22:34:57.583932 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:34:57.610117 master-0 kubenswrapper[33572]: I1204 22:34:57.610048 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x95k6\" (UniqueName: \"kubernetes.io/projected/e5d11eba-99b8-4bb2-9d30-652d86c553fa-kube-api-access-x95k6\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.610117 master-0 kubenswrapper[33572]: I1204 22:34:57.610116 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.610416 master-0 kubenswrapper[33572]: I1204 22:34:57.610366 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.610458 master-0 kubenswrapper[33572]: I1204 22:34:57.610414 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 
22:34:57.610487 master-0 kubenswrapper[33572]: I1204 22:34:57.610461 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.610542 master-0 kubenswrapper[33572]: I1204 22:34:57.610484 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-logs\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.610574 master-0 kubenswrapper[33572]: I1204 22:34:57.610549 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.610718 master-0 kubenswrapper[33572]: I1204 22:34:57.610662 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.712626 master-0 kubenswrapper[33572]: I1204 22:34:57.712567 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x95k6\" (UniqueName: \"kubernetes.io/projected/e5d11eba-99b8-4bb2-9d30-652d86c553fa-kube-api-access-x95k6\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.712626 master-0 kubenswrapper[33572]: I1204 22:34:57.712623 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.713176 master-0 kubenswrapper[33572]: I1204 22:34:57.712642 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.713176 master-0 kubenswrapper[33572]: I1204 22:34:57.712673 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.713176 master-0 kubenswrapper[33572]: I1204 22:34:57.712716 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.713176 master-0 kubenswrapper[33572]: I1204 22:34:57.712740 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-logs\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.713176 master-0 kubenswrapper[33572]: I1204 22:34:57.712770 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.713176 master-0 kubenswrapper[33572]: I1204 22:34:57.712794 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.714240 master-0 kubenswrapper[33572]: I1204 22:34:57.714050 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-logs\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.714240 master-0 kubenswrapper[33572]: I1204 22:34:57.714194 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.718072 master-0 kubenswrapper[33572]: I1204 22:34:57.717602 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 22:34:57.718072 master-0 kubenswrapper[33572]: I1204 22:34:57.717634 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c09096dd0f6c531150e055f5f0297026538faf280cff6c93023f9f573f827900/globalmount\"" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.718072 master-0 kubenswrapper[33572]: I1204 22:34:57.717790 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.718072 master-0 kubenswrapper[33572]: I1204 22:34:57.718016 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.718267 master-0 kubenswrapper[33572]: I1204 22:34:57.718195 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.720857 master-0 kubenswrapper[33572]: I1204 22:34:57.720801 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.735082 master-0 kubenswrapper[33572]: I1204 22:34:57.734972 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x95k6\" (UniqueName: \"kubernetes.io/projected/e5d11eba-99b8-4bb2-9d30-652d86c553fa-kube-api-access-x95k6\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:57.825924 master-0 kubenswrapper[33572]: I1204 22:34:57.825829 33572 generic.go:334] "Generic (PLEG): container finished" podID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" containerID="29b204cc4b45d355767e20b68482224f716a0bd8e04053ae0bbb254486085bd7" exitCode=0 Dec 04 22:34:57.828604 master-0 kubenswrapper[33572]: I1204 22:34:57.826027 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" event={"ID":"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50","Type":"ContainerDied","Data":"29b204cc4b45d355767e20b68482224f716a0bd8e04053ae0bbb254486085bd7"} Dec 04 22:34:57.828604 master-0 kubenswrapper[33572]: I1204 22:34:57.826935 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" event={"ID":"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50","Type":"ContainerStarted","Data":"7c5db0a0b96fc171be3f91d985e326537d3bdf4b9c65a00a4fbc319d9d7fc743"} Dec 04 
22:34:57.830982 master-0 kubenswrapper[33572]: I1204 22:34:57.830917 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lxrlk" event={"ID":"2609a439-7ac0-4253-a74e-e4c90a023832","Type":"ContainerStarted","Data":"f5642154d0543171958cf65f3dc2a6a8cd75c02eb1c3577bddc47c91c3995270"} Dec 04 22:34:57.830982 master-0 kubenswrapper[33572]: I1204 22:34:57.831002 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lxrlk" event={"ID":"2609a439-7ac0-4253-a74e-e4c90a023832","Type":"ContainerStarted","Data":"6e082596aabe4ae5272edacf962d17659440664e3ccb24f2c8eacfe799775895"} Dec 04 22:34:57.834148 master-0 kubenswrapper[33572]: I1204 22:34:57.834100 33572 generic.go:334] "Generic (PLEG): container finished" podID="d61c73b1-2107-4afb-ba37-c185b699095b" containerID="4de6574da1ff77f5b01699de527e595c9f14fcd59f29e1aa6140654a5a368d72" exitCode=0 Dec 04 22:34:57.834398 master-0 kubenswrapper[33572]: I1204 22:34:57.834139 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7db899889c-wtqrt" event={"ID":"d61c73b1-2107-4afb-ba37-c185b699095b","Type":"ContainerDied","Data":"4de6574da1ff77f5b01699de527e595c9f14fcd59f29e1aa6140654a5a368d72"} Dec 04 22:34:57.836185 master-0 kubenswrapper[33572]: I1204 22:34:57.836158 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ms5vx" event={"ID":"8cdb5e15-32e4-4257-be61-ec0ba1a6884e","Type":"ContainerStarted","Data":"2cf56dddc2abb084322e094fa18d088ee51a6200b47ed45fbd471da397b778d6"} Dec 04 22:34:57.842892 master-0 kubenswrapper[33572]: I1204 22:34:57.842804 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b7zzl" event={"ID":"d66b1f8f-3a86-442f-bdec-422e5a6e03ee","Type":"ContainerStarted","Data":"ea94b2e79bc02874fbf7839b0b47359fe9347e3d1012c0f8016aaeadd109d828"} Dec 04 22:34:57.842892 master-0 kubenswrapper[33572]: I1204 22:34:57.842874 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b7zzl" event={"ID":"d66b1f8f-3a86-442f-bdec-422e5a6e03ee","Type":"ContainerStarted","Data":"be4c08f6200e18a631c7a9bfef072f6b1c1e4f4f14b95bb4b4ae17aebd07ea90"} Dec 04 22:34:57.848736 master-0 kubenswrapper[33572]: I1204 22:34:57.848593 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-db-sync-d9l4w" event={"ID":"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2","Type":"ContainerStarted","Data":"fb74acd76d089e5817e410933fb939ec607f974ab84b468ed51676cbb1458098"} Dec 04 22:34:57.874822 master-0 kubenswrapper[33572]: I1204 22:34:57.874764 33572 generic.go:334] "Generic (PLEG): container finished" podID="2a267444-303d-4eaf-98f5-01b11c7efe8f" containerID="d689486f9be58a6935f936459fe33002b1b01666be0772cbeb95fabb72de0703" exitCode=0 Dec 04 22:34:57.875742 master-0 kubenswrapper[33572]: I1204 22:34:57.874938 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-ab6c-account-create-update-vs69m" event={"ID":"2a267444-303d-4eaf-98f5-01b11c7efe8f","Type":"ContainerDied","Data":"d689486f9be58a6935f936459fe33002b1b01666be0772cbeb95fabb72de0703"} Dec 04 22:34:57.888190 master-0 kubenswrapper[33572]: I1204 22:34:57.888131 33572 generic.go:334] "Generic (PLEG): container finished" podID="8733e1a3-7b35-40cb-b129-536fe10961a8" containerID="abdd217b047e5958a981618bd0d5cac76014fb5b033a88a73333a5f1d1e58c17" exitCode=0 Dec 04 22:34:57.888957 master-0 kubenswrapper[33572]: I1204 22:34:57.888476 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ironic-db-create-xd84n" event={"ID":"8733e1a3-7b35-40cb-b129-536fe10961a8","Type":"ContainerDied","Data":"abdd217b047e5958a981618bd0d5cac76014fb5b033a88a73333a5f1d1e58c17"} Dec 04 22:34:57.992593 master-0 kubenswrapper[33572]: I1204 22:34:57.980142 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-lxrlk" podStartSLOduration=2.980121941 podStartE2EDuration="2.980121941s" podCreationTimestamp="2025-12-04 22:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:57.969747043 +0000 UTC m=+961.697272712" watchObservedRunningTime="2025-12-04 22:34:57.980121941 +0000 UTC m=+961.707647590" Dec 04 22:34:58.002519 master-0 kubenswrapper[33572]: I1204 22:34:57.999992 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b7zzl" podStartSLOduration=2.999972052 podStartE2EDuration="2.999972052s" podCreationTimestamp="2025-12-04 22:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:57.997909224 +0000 UTC m=+961.725434873" watchObservedRunningTime="2025-12-04 22:34:57.999972052 +0000 UTC m=+961.727497701" Dec 04 22:34:58.331897 master-0 kubenswrapper[33572]: I1204 22:34:58.331819 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:58.452532 master-0 kubenswrapper[33572]: I1204 22:34:58.447157 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-nb\") pod \"d61c73b1-2107-4afb-ba37-c185b699095b\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " Dec 04 22:34:58.452532 master-0 kubenswrapper[33572]: I1204 22:34:58.447212 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-sb\") pod \"d61c73b1-2107-4afb-ba37-c185b699095b\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " Dec 04 22:34:58.452532 master-0 kubenswrapper[33572]: I1204 22:34:58.447234 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-swift-storage-0\") pod \"d61c73b1-2107-4afb-ba37-c185b699095b\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " Dec 04 22:34:58.452532 master-0 kubenswrapper[33572]: I1204 22:34:58.447261 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsm67\" (UniqueName: \"kubernetes.io/projected/d61c73b1-2107-4afb-ba37-c185b699095b-kube-api-access-dsm67\") pod \"d61c73b1-2107-4afb-ba37-c185b699095b\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " Dec 04 22:34:58.452532 master-0 kubenswrapper[33572]: I1204 22:34:58.447290 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-config\") pod \"d61c73b1-2107-4afb-ba37-c185b699095b\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " Dec 04 22:34:58.452532 master-0 kubenswrapper[33572]: I1204 22:34:58.447336 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-svc\") pod \"d61c73b1-2107-4afb-ba37-c185b699095b\" (UID: \"d61c73b1-2107-4afb-ba37-c185b699095b\") " Dec 04 22:34:58.458549 master-0 kubenswrapper[33572]: I1204 22:34:58.457748 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61c73b1-2107-4afb-ba37-c185b699095b-kube-api-access-dsm67" (OuterVolumeSpecName: "kube-api-access-dsm67") pod "d61c73b1-2107-4afb-ba37-c185b699095b" (UID: "d61c73b1-2107-4afb-ba37-c185b699095b"). InnerVolumeSpecName "kube-api-access-dsm67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:58.477712 master-0 kubenswrapper[33572]: I1204 22:34:58.477635 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d61c73b1-2107-4afb-ba37-c185b699095b" (UID: "d61c73b1-2107-4afb-ba37-c185b699095b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:58.512394 master-0 kubenswrapper[33572]: I1204 22:34:58.512268 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d61c73b1-2107-4afb-ba37-c185b699095b" (UID: "d61c73b1-2107-4afb-ba37-c185b699095b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:58.512781 master-0 kubenswrapper[33572]: I1204 22:34:58.512716 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d61c73b1-2107-4afb-ba37-c185b699095b" (UID: "d61c73b1-2107-4afb-ba37-c185b699095b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:58.513377 master-0 kubenswrapper[33572]: I1204 22:34:58.513307 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-config" (OuterVolumeSpecName: "config") pod "d61c73b1-2107-4afb-ba37-c185b699095b" (UID: "d61c73b1-2107-4afb-ba37-c185b699095b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:58.532325 master-0 kubenswrapper[33572]: I1204 22:34:58.532228 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d61c73b1-2107-4afb-ba37-c185b699095b" (UID: "d61c73b1-2107-4afb-ba37-c185b699095b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:58.549692 master-0 kubenswrapper[33572]: I1204 22:34:58.549630 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:58.549692 master-0 kubenswrapper[33572]: I1204 22:34:58.549681 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:58.549692 master-0 kubenswrapper[33572]: I1204 22:34:58.549695 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:58.549989 master-0 kubenswrapper[33572]: I1204 22:34:58.549709 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsm67\" (UniqueName: \"kubernetes.io/projected/d61c73b1-2107-4afb-ba37-c185b699095b-kube-api-access-dsm67\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:58.549989 master-0 kubenswrapper[33572]: I1204 22:34:58.549721 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:58.549989 master-0 kubenswrapper[33572]: I1204 22:34:58.549732 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d61c73b1-2107-4afb-ba37-c185b699095b-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:58.561584 master-0 kubenswrapper[33572]: I1204 22:34:58.561527 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:34:58.561954 master-0 kubenswrapper[33572]: E1204 22:34:58.561909 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61c73b1-2107-4afb-ba37-c185b699095b" containerName="init" Dec 04 22:34:58.561954 master-0 kubenswrapper[33572]: I1204 22:34:58.561925 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61c73b1-2107-4afb-ba37-c185b699095b" containerName="init" Dec 04 22:34:58.562151 master-0 kubenswrapper[33572]: I1204 22:34:58.562123 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61c73b1-2107-4afb-ba37-c185b699095b" containerName="init" Dec 04 22:34:58.563206 master-0 kubenswrapper[33572]: I1204 22:34:58.563176 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.565451 master-0 kubenswrapper[33572]: I1204 22:34:58.565417 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-7675d-default-internal-config-data" Dec 04 22:34:58.565623 master-0 kubenswrapper[33572]: I1204 22:34:58.565594 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 22:34:58.577675 master-0 kubenswrapper[33572]: I1204 22:34:58.577542 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:34:58.608663 master-0 kubenswrapper[33572]: I1204 22:34:58.608570 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:34:58.609400 master-0 kubenswrapper[33572]: E1204 22:34:58.609334 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-7675d-default-external-api-0" podUID="e5d11eba-99b8-4bb2-9d30-652d86c553fa" Dec 04 22:34:58.754868 master-0 kubenswrapper[33572]: I1204 22:34:58.754799 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-httpd-run\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.754868 master-0 kubenswrapper[33572]: I1204 22:34:58.754849 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-combined-ca-bundle\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.755464 master-0 kubenswrapper[33572]: I1204 22:34:58.754957 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-config-data\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.755464 master-0 kubenswrapper[33572]: I1204 22:34:58.755006 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-internal-tls-certs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.755464 master-0 kubenswrapper[33572]: I1204 22:34:58.755030 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r952\" (UniqueName: \"kubernetes.io/projected/4916f987-e677-4aee-b52c-88534ce7b28b-kube-api-access-4r952\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.755464 master-0 kubenswrapper[33572]: I1204 22:34:58.755166 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-logs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.755464 master-0 kubenswrapper[33572]: I1204 22:34:58.755225 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.755464 master-0 kubenswrapper[33572]: I1204 22:34:58.755465 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-scripts\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.858756 master-0 kubenswrapper[33572]: I1204 22:34:58.858698 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-config-data\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.858957 master-0 kubenswrapper[33572]: I1204 22:34:58.858777 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-internal-tls-certs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.858957 master-0 kubenswrapper[33572]: I1204 22:34:58.858807 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r952\" (UniqueName: \"kubernetes.io/projected/4916f987-e677-4aee-b52c-88534ce7b28b-kube-api-access-4r952\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.858957 master-0 kubenswrapper[33572]: I1204 22:34:58.858838 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-logs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.858957 master-0 kubenswrapper[33572]: I1204 22:34:58.858859 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.858957 master-0 kubenswrapper[33572]: I1204 22:34:58.858948 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-scripts\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " 
pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.859129 master-0 kubenswrapper[33572]: I1204 22:34:58.858974 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-httpd-run\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.859956 master-0 kubenswrapper[33572]: I1204 22:34:58.859235 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-combined-ca-bundle\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.862393 master-0 kubenswrapper[33572]: I1204 22:34:58.862190 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-logs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.868738 master-0 kubenswrapper[33572]: I1204 22:34:58.866894 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 22:34:58.868738 master-0 kubenswrapper[33572]: I1204 22:34:58.866978 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/19b4b44bb68e6cf797033e7d835d74dc088fa8fee5def4ff67ebc77f83d36479/globalmount\"" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.880968 master-0 kubenswrapper[33572]: I1204 22:34:58.880913 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-httpd-run\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.883612 master-0 kubenswrapper[33572]: I1204 22:34:58.883561 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-config-data\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.885797 master-0 kubenswrapper[33572]: I1204 22:34:58.884455 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-internal-tls-certs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.885797 master-0 kubenswrapper[33572]: I1204 22:34:58.884980 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-scripts\") pod 
\"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.885797 master-0 kubenswrapper[33572]: I1204 22:34:58.885579 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-combined-ca-bundle\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.890337 master-0 kubenswrapper[33572]: I1204 22:34:58.890260 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r952\" (UniqueName: \"kubernetes.io/projected/4916f987-e677-4aee-b52c-88534ce7b28b-kube-api-access-4r952\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:34:58.915446 master-0 kubenswrapper[33572]: I1204 22:34:58.915391 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" event={"ID":"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50","Type":"ContainerStarted","Data":"2147933aafa02c46bc0780d7ecb4bada18ebbbc4671ffe40cebdf847e8ec4077"} Dec 04 22:34:58.915644 master-0 kubenswrapper[33572]: I1204 22:34:58.915493 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:34:58.923873 master-0 kubenswrapper[33572]: I1204 22:34:58.923817 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7db899889c-wtqrt" Dec 04 22:34:58.924036 master-0 kubenswrapper[33572]: I1204 22:34:58.923899 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7db899889c-wtqrt" event={"ID":"d61c73b1-2107-4afb-ba37-c185b699095b","Type":"ContainerDied","Data":"a5e072bbc201f295d561aa60312ec6c98a1bff59f116233f25df55666f91b824"} Dec 04 22:34:58.924168 master-0 kubenswrapper[33572]: I1204 22:34:58.924056 33572 scope.go:117] "RemoveContainer" containerID="4de6574da1ff77f5b01699de527e595c9f14fcd59f29e1aa6140654a5a368d72" Dec 04 22:34:58.926135 master-0 kubenswrapper[33572]: I1204 22:34:58.926059 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:58.952218 master-0 kubenswrapper[33572]: I1204 22:34:58.952130 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" podStartSLOduration=3.952107974 podStartE2EDuration="3.952107974s" podCreationTimestamp="2025-12-04 22:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:34:58.940686038 +0000 UTC m=+962.668211687" watchObservedRunningTime="2025-12-04 22:34:58.952107974 +0000 UTC m=+962.679633623" Dec 04 22:34:59.023295 master-0 kubenswrapper[33572]: I1204 22:34:59.023254 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:59.094797 master-0 kubenswrapper[33572]: I1204 22:34:59.094561 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7db899889c-wtqrt"] Dec 04 22:34:59.105609 master-0 kubenswrapper[33572]: I1204 22:34:59.105493 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7db899889c-wtqrt"] Dec 04 22:34:59.168970 master-0 kubenswrapper[33572]: I1204 22:34:59.168923 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-scripts\") pod \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " Dec 04 22:34:59.168970 master-0 kubenswrapper[33572]: I1204 22:34:59.168977 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-config-data\") pod \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " Dec 04 22:34:59.169284 master-0 kubenswrapper[33572]: I1204 22:34:59.169014 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-public-tls-certs\") pod \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " Dec 04 22:34:59.169284 master-0 kubenswrapper[33572]: I1204 22:34:59.169086 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-httpd-run\") pod \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " Dec 04 22:34:59.169284 master-0 kubenswrapper[33572]: I1204 22:34:59.169233 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-logs\") pod \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " Dec 04 22:34:59.169284 master-0 kubenswrapper[33572]: I1204 22:34:59.169252 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-combined-ca-bundle\") pod \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " Dec 04 22:34:59.169284 master-0 kubenswrapper[33572]: I1204 22:34:59.169277 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x95k6\" (UniqueName: \"kubernetes.io/projected/e5d11eba-99b8-4bb2-9d30-652d86c553fa-kube-api-access-x95k6\") pod \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " Dec 04 22:34:59.169776 master-0 kubenswrapper[33572]: I1204 22:34:59.169745 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:34:59.170930 master-0 kubenswrapper[33572]: I1204 22:34:59.170876 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e5d11eba-99b8-4bb2-9d30-652d86c553fa" (UID: "e5d11eba-99b8-4bb2-9d30-652d86c553fa"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:34:59.171234 master-0 kubenswrapper[33572]: I1204 22:34:59.171208 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-logs" (OuterVolumeSpecName: "logs") pod "e5d11eba-99b8-4bb2-9d30-652d86c553fa" (UID: "e5d11eba-99b8-4bb2-9d30-652d86c553fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:34:59.179524 master-0 kubenswrapper[33572]: I1204 22:34:59.179207 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-scripts" (OuterVolumeSpecName: "scripts") pod "e5d11eba-99b8-4bb2-9d30-652d86c553fa" (UID: "e5d11eba-99b8-4bb2-9d30-652d86c553fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:59.179524 master-0 kubenswrapper[33572]: I1204 22:34:59.179252 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d11eba-99b8-4bb2-9d30-652d86c553fa-kube-api-access-x95k6" (OuterVolumeSpecName: "kube-api-access-x95k6") pod "e5d11eba-99b8-4bb2-9d30-652d86c553fa" (UID: "e5d11eba-99b8-4bb2-9d30-652d86c553fa"). InnerVolumeSpecName "kube-api-access-x95k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:59.180929 master-0 kubenswrapper[33572]: I1204 22:34:59.180891 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e5d11eba-99b8-4bb2-9d30-652d86c553fa" (UID: "e5d11eba-99b8-4bb2-9d30-652d86c553fa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:59.182437 master-0 kubenswrapper[33572]: I1204 22:34:59.182343 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-config-data" (OuterVolumeSpecName: "config-data") pod "e5d11eba-99b8-4bb2-9d30-652d86c553fa" (UID: "e5d11eba-99b8-4bb2-9d30-652d86c553fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:59.182975 master-0 kubenswrapper[33572]: I1204 22:34:59.182928 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5d11eba-99b8-4bb2-9d30-652d86c553fa" (UID: "e5d11eba-99b8-4bb2-9d30-652d86c553fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:34:59.273407 master-0 kubenswrapper[33572]: I1204 22:34:59.273254 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\" (UID: \"e5d11eba-99b8-4bb2-9d30-652d86c553fa\") " Dec 04 22:34:59.276579 master-0 kubenswrapper[33572]: I1204 22:34:59.276537 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.276579 master-0 kubenswrapper[33572]: I1204 22:34:59.276570 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.276579 master-0 kubenswrapper[33572]: I1204 22:34:59.276582 33572 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.277107 master-0 kubenswrapper[33572]: I1204 22:34:59.276591 33572 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.277107 master-0 kubenswrapper[33572]: I1204 22:34:59.276600 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5d11eba-99b8-4bb2-9d30-652d86c553fa-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.277107 master-0 kubenswrapper[33572]: I1204 22:34:59.276610 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d11eba-99b8-4bb2-9d30-652d86c553fa-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.277107 master-0 kubenswrapper[33572]: I1204 22:34:59.276619 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x95k6\" (UniqueName: \"kubernetes.io/projected/e5d11eba-99b8-4bb2-9d30-652d86c553fa-kube-api-access-x95k6\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.523723 master-0 kubenswrapper[33572]: I1204 22:34:59.523667 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:59.575307 master-0 kubenswrapper[33572]: I1204 22:34:59.574610 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:59.699649 master-0 kubenswrapper[33572]: I1204 22:34:59.699318 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q24zl\" (UniqueName: \"kubernetes.io/projected/2a267444-303d-4eaf-98f5-01b11c7efe8f-kube-api-access-q24zl\") pod \"2a267444-303d-4eaf-98f5-01b11c7efe8f\" (UID: \"2a267444-303d-4eaf-98f5-01b11c7efe8f\") " Dec 04 22:34:59.699649 master-0 kubenswrapper[33572]: I1204 22:34:59.699434 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8733e1a3-7b35-40cb-b129-536fe10961a8-operator-scripts\") pod \"8733e1a3-7b35-40cb-b129-536fe10961a8\" (UID: \"8733e1a3-7b35-40cb-b129-536fe10961a8\") " Dec 04 22:34:59.699649 master-0 kubenswrapper[33572]: I1204 22:34:59.699556 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lx2q\" (UniqueName: \"kubernetes.io/projected/8733e1a3-7b35-40cb-b129-536fe10961a8-kube-api-access-9lx2q\") pod \"8733e1a3-7b35-40cb-b129-536fe10961a8\" (UID: \"8733e1a3-7b35-40cb-b129-536fe10961a8\") " Dec 04 22:34:59.701396 master-0 kubenswrapper[33572]: I1204 22:34:59.700648 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a267444-303d-4eaf-98f5-01b11c7efe8f-operator-scripts\") pod \"2a267444-303d-4eaf-98f5-01b11c7efe8f\" (UID: \"2a267444-303d-4eaf-98f5-01b11c7efe8f\") " Dec 04 22:34:59.703360 master-0 kubenswrapper[33572]: I1204 22:34:59.702980 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a267444-303d-4eaf-98f5-01b11c7efe8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a267444-303d-4eaf-98f5-01b11c7efe8f" (UID: "2a267444-303d-4eaf-98f5-01b11c7efe8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:59.703801 master-0 kubenswrapper[33572]: I1204 22:34:59.703737 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8733e1a3-7b35-40cb-b129-536fe10961a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8733e1a3-7b35-40cb-b129-536fe10961a8" (UID: "8733e1a3-7b35-40cb-b129-536fe10961a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:34:59.706286 master-0 kubenswrapper[33572]: I1204 22:34:59.706222 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a267444-303d-4eaf-98f5-01b11c7efe8f-kube-api-access-q24zl" (OuterVolumeSpecName: "kube-api-access-q24zl") pod "2a267444-303d-4eaf-98f5-01b11c7efe8f" (UID: "2a267444-303d-4eaf-98f5-01b11c7efe8f"). InnerVolumeSpecName "kube-api-access-q24zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:59.710808 master-0 kubenswrapper[33572]: I1204 22:34:59.710746 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8733e1a3-7b35-40cb-b129-536fe10961a8-kube-api-access-9lx2q" (OuterVolumeSpecName: "kube-api-access-9lx2q") pod "8733e1a3-7b35-40cb-b129-536fe10961a8" (UID: "8733e1a3-7b35-40cb-b129-536fe10961a8"). InnerVolumeSpecName "kube-api-access-9lx2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:34:59.803997 master-0 kubenswrapper[33572]: I1204 22:34:59.803949 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a267444-303d-4eaf-98f5-01b11c7efe8f-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.804706 master-0 kubenswrapper[33572]: I1204 22:34:59.804658 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q24zl\" (UniqueName: \"kubernetes.io/projected/2a267444-303d-4eaf-98f5-01b11c7efe8f-kube-api-access-q24zl\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.804794 master-0 kubenswrapper[33572]: I1204 22:34:59.804711 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8733e1a3-7b35-40cb-b129-536fe10961a8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.804794 master-0 kubenswrapper[33572]: I1204 22:34:59.804723 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lx2q\" (UniqueName: \"kubernetes.io/projected/8733e1a3-7b35-40cb-b129-536fe10961a8-kube-api-access-9lx2q\") on node \"master-0\" DevicePath \"\"" Dec 04 22:34:59.945949 master-0 kubenswrapper[33572]: I1204 22:34:59.945866 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-ab6c-account-create-update-vs69m" event={"ID":"2a267444-303d-4eaf-98f5-01b11c7efe8f","Type":"ContainerDied","Data":"17156bab2c23d7de2097c515ea3e47d0a1a21869207a6a93fcfa55048b99b4c3"} Dec 04 22:34:59.945949 master-0 kubenswrapper[33572]: I1204 22:34:59.945918 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17156bab2c23d7de2097c515ea3e47d0a1a21869207a6a93fcfa55048b99b4c3" Dec 04 22:34:59.945949 master-0 kubenswrapper[33572]: I1204 22:34:59.945973 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-ab6c-account-create-update-vs69m" Dec 04 22:34:59.956566 master-0 kubenswrapper[33572]: I1204 22:34:59.956160 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-xd84n" event={"ID":"8733e1a3-7b35-40cb-b129-536fe10961a8","Type":"ContainerDied","Data":"635db6de4acea0d4e99aea3654170143562ee3361b6b7748fb83fa63669943bd"} Dec 04 22:34:59.956566 master-0 kubenswrapper[33572]: I1204 22:34:59.956211 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635db6de4acea0d4e99aea3654170143562ee3361b6b7748fb83fa63669943bd" Dec 04 22:34:59.956566 master-0 kubenswrapper[33572]: I1204 22:34:59.956222 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-xd84n" Dec 04 22:34:59.961363 master-0 kubenswrapper[33572]: I1204 22:34:59.959425 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:00.543750 master-0 kubenswrapper[33572]: I1204 22:35:00.543649 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61c73b1-2107-4afb-ba37-c185b699095b" path="/var/lib/kubelet/pods/d61c73b1-2107-4afb-ba37-c185b699095b/volumes" Dec 04 22:35:00.979251 master-0 kubenswrapper[33572]: I1204 22:35:00.978375 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358" (OuterVolumeSpecName: "glance") pod "e5d11eba-99b8-4bb2-9d30-652d86c553fa" (UID: "e5d11eba-99b8-4bb2-9d30-652d86c553fa"). InnerVolumeSpecName "pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 22:35:00.982152 master-0 kubenswrapper[33572]: I1204 22:35:00.982132 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"glance-7675d-default-internal-api-0\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:01.040317 master-0 kubenswrapper[33572]: I1204 22:35:01.040250 33572 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") on node \"master-0\" " Dec 04 22:35:01.071820 master-0 kubenswrapper[33572]: I1204 22:35:01.071757 33572 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 04 22:35:01.072014 master-0 kubenswrapper[33572]: I1204 22:35:01.071930 33572 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86" (UniqueName: "kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358") on node "master-0" Dec 04 22:35:01.144329 master-0 kubenswrapper[33572]: I1204 22:35:01.142711 33572 reconciler_common.go:293] "Volume detached for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:01.465957 master-0 kubenswrapper[33572]: I1204 22:35:01.465771 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-j89sq"] Dec 04 22:35:01.466362 master-0 kubenswrapper[33572]: E1204 22:35:01.466339 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8733e1a3-7b35-40cb-b129-536fe10961a8" containerName="mariadb-database-create" Dec 04 22:35:01.466362 master-0 kubenswrapper[33572]: I1204 22:35:01.466359 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8733e1a3-7b35-40cb-b129-536fe10961a8" containerName="mariadb-database-create" Dec 04 22:35:01.466651 master-0 kubenswrapper[33572]: E1204 22:35:01.466388 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a267444-303d-4eaf-98f5-01b11c7efe8f" containerName="mariadb-account-create-update" Dec 04 22:35:01.466651 master-0 kubenswrapper[33572]: I1204 22:35:01.466395 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a267444-303d-4eaf-98f5-01b11c7efe8f" containerName="mariadb-account-create-update" Dec 04 22:35:01.466651 master-0 kubenswrapper[33572]: I1204 22:35:01.466647 33572 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8733e1a3-7b35-40cb-b129-536fe10961a8" containerName="mariadb-database-create" Dec 04 22:35:01.466969 master-0 kubenswrapper[33572]: I1204 22:35:01.466675 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a267444-303d-4eaf-98f5-01b11c7efe8f" containerName="mariadb-account-create-update" Dec 04 22:35:01.468011 master-0 kubenswrapper[33572]: I1204 22:35:01.467959 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.472615 master-0 kubenswrapper[33572]: I1204 22:35:01.472470 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Dec 04 22:35:01.472615 master-0 kubenswrapper[33572]: I1204 22:35:01.472477 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Dec 04 22:35:01.559380 master-0 kubenswrapper[33572]: I1204 22:35:01.559288 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/320dd132-b10a-4d56-88a4-307ecb61196f-etc-podinfo\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.559380 master-0 kubenswrapper[33572]: I1204 22:35:01.559368 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/320dd132-b10a-4d56-88a4-307ecb61196f-config-data-merged\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.560306 master-0 kubenswrapper[33572]: I1204 22:35:01.559453 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkptk\" (UniqueName: \"kubernetes.io/projected/320dd132-b10a-4d56-88a4-307ecb61196f-kube-api-access-rkptk\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.560306 master-0 kubenswrapper[33572]: I1204 22:35:01.559533 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-scripts\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.560306 master-0 kubenswrapper[33572]: I1204 22:35:01.559679 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-config-data\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.560306 master-0 kubenswrapper[33572]: I1204 22:35:01.559760 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-combined-ca-bundle\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.593942 master-0 kubenswrapper[33572]: I1204 22:35:01.593881 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:01.662080 master-0 kubenswrapper[33572]: I1204 22:35:01.661994 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-config-data\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.662080 master-0 kubenswrapper[33572]: I1204 22:35:01.662082 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-combined-ca-bundle\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.662524 master-0 kubenswrapper[33572]: I1204 22:35:01.662393 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/320dd132-b10a-4d56-88a4-307ecb61196f-etc-podinfo\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.663237 master-0 kubenswrapper[33572]: I1204 22:35:01.662833 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/320dd132-b10a-4d56-88a4-307ecb61196f-config-data-merged\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.663237 master-0 kubenswrapper[33572]: I1204 22:35:01.662879 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkptk\" (UniqueName: \"kubernetes.io/projected/320dd132-b10a-4d56-88a4-307ecb61196f-kube-api-access-rkptk\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.663237 master-0 kubenswrapper[33572]: I1204 22:35:01.662931 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-scripts\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.663416 master-0 kubenswrapper[33572]: I1204 22:35:01.663289 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/320dd132-b10a-4d56-88a4-307ecb61196f-config-data-merged\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.691533 master-0 kubenswrapper[33572]: I1204 22:35:01.678188 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-combined-ca-bundle\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.691533 master-0 kubenswrapper[33572]: I1204 22:35:01.688384 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-scripts\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.692715 
master-0 kubenswrapper[33572]: I1204 22:35:01.692637 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-config-data\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.706103 master-0 kubenswrapper[33572]: I1204 22:35:01.698189 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/320dd132-b10a-4d56-88a4-307ecb61196f-etc-podinfo\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.706103 master-0 kubenswrapper[33572]: I1204 22:35:01.699381 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-j89sq"] Dec 04 22:35:01.729627 master-0 kubenswrapper[33572]: I1204 22:35:01.729452 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkptk\" (UniqueName: \"kubernetes.io/projected/320dd132-b10a-4d56-88a4-307ecb61196f-kube-api-access-rkptk\") pod \"ironic-db-sync-j89sq\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.759711 master-0 kubenswrapper[33572]: I1204 22:35:01.757487 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:35:01.774383 master-0 kubenswrapper[33572]: I1204 22:35:01.763777 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:35:01.799578 master-0 kubenswrapper[33572]: I1204 22:35:01.799543 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:01.873424 master-0 kubenswrapper[33572]: I1204 22:35:01.873365 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:35:01.875516 master-0 kubenswrapper[33572]: I1204 22:35:01.875322 33572 util.go:30] "No sandbox for pod can be found. 
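Note the UID change on glance-7675d-default-external-api-0: the pod torn down above ran as UID e5d11eba-99b8-4bb2-9d30-652d86c553fa, and after the "SyncLoop DELETE" / "SyncLoop REMOVE" pair the same pod name is re-added at 22:35:01.873 as UID bbe730dd-7824-4438-981d-bf5429987895. The pod was deleted and recreated, and its persistent volume (pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86, topolvm handle 257968eb-98c5-47fc-bdf8-815d8d23b358) is verified, attached and mounted again under the new UID in the entries that follow, so the unmount/detach churn earlier in the log is the expected half of a pod replacement rather than a storage fault.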
Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.878935 master-0 kubenswrapper[33572]: I1204 22:35:01.878888 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 22:35:01.879263 master-0 kubenswrapper[33572]: I1204 22:35:01.879236 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-7675d-default-external-config-data" Dec 04 22:35:01.919217 master-0 kubenswrapper[33572]: I1204 22:35:01.919114 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:35:01.972296 master-0 kubenswrapper[33572]: I1204 22:35:01.971819 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.972296 master-0 kubenswrapper[33572]: I1204 22:35:01.972034 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxgmh\" (UniqueName: \"kubernetes.io/projected/bbe730dd-7824-4438-981d-bf5429987895-kube-api-access-pxgmh\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.972296 master-0 kubenswrapper[33572]: I1204 22:35:01.972099 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.972479 master-0 kubenswrapper[33572]: I1204 22:35:01.972345 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.972479 master-0 kubenswrapper[33572]: I1204 22:35:01.972372 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.972479 master-0 kubenswrapper[33572]: I1204 22:35:01.972413 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.972479 master-0 kubenswrapper[33572]: I1204 22:35:01.972466 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-logs\") 
pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.973998 master-0 kubenswrapper[33572]: I1204 22:35:01.972704 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:01.992280 master-0 kubenswrapper[33572]: I1204 22:35:01.991641 33572 generic.go:334] "Generic (PLEG): container finished" podID="d66b1f8f-3a86-442f-bdec-422e5a6e03ee" containerID="ea94b2e79bc02874fbf7839b0b47359fe9347e3d1012c0f8016aaeadd109d828" exitCode=0 Dec 04 22:35:01.992280 master-0 kubenswrapper[33572]: I1204 22:35:01.991687 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b7zzl" event={"ID":"d66b1f8f-3a86-442f-bdec-422e5a6e03ee","Type":"ContainerDied","Data":"ea94b2e79bc02874fbf7839b0b47359fe9347e3d1012c0f8016aaeadd109d828"} Dec 04 22:35:02.075583 master-0 kubenswrapper[33572]: I1204 22:35:02.075526 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.075725 master-0 kubenswrapper[33572]: I1204 22:35:02.075593 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.075725 master-0 kubenswrapper[33572]: I1204 22:35:02.075640 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.075725 master-0 kubenswrapper[33572]: I1204 22:35:02.075668 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-logs\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.075843 master-0 kubenswrapper[33572]: I1204 22:35:02.075771 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.075843 master-0 kubenswrapper[33572]: I1204 22:35:02.075805 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: 
\"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.075916 master-0 kubenswrapper[33572]: I1204 22:35:02.075862 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxgmh\" (UniqueName: \"kubernetes.io/projected/bbe730dd-7824-4438-981d-bf5429987895-kube-api-access-pxgmh\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.075916 master-0 kubenswrapper[33572]: I1204 22:35:02.075893 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.077007 master-0 kubenswrapper[33572]: I1204 22:35:02.076962 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-logs\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.077519 master-0 kubenswrapper[33572]: I1204 22:35:02.077471 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.080468 master-0 kubenswrapper[33572]: I1204 22:35:02.080436 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.082943 master-0 kubenswrapper[33572]: I1204 22:35:02.082914 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 22:35:02.083012 master-0 kubenswrapper[33572]: I1204 22:35:02.082948 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c09096dd0f6c531150e055f5f0297026538faf280cff6c93023f9f573f827900/globalmount\"" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.085356 master-0 kubenswrapper[33572]: I1204 22:35:02.085316 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.093611 master-0 kubenswrapper[33572]: I1204 22:35:02.093525 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.097777 master-0 kubenswrapper[33572]: I1204 22:35:02.096711 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.113440 master-0 kubenswrapper[33572]: I1204 22:35:02.112862 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxgmh\" (UniqueName: \"kubernetes.io/projected/bbe730dd-7824-4438-981d-bf5429987895-kube-api-access-pxgmh\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:02.307545 master-0 kubenswrapper[33572]: W1204 22:35:02.307303 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4916f987_e677_4aee_b52c_88534ce7b28b.slice/crio-5e33ffa1d70bdec93611e0538c3d57febf953d81f629014db6972c2187030c9f WatchSource:0}: Error finding container 5e33ffa1d70bdec93611e0538c3d57febf953d81f629014db6972c2187030c9f: Status 404 returned error can't find the container with id 5e33ffa1d70bdec93611e0538c3d57febf953d81f629014db6972c2187030c9f Dec 04 22:35:02.322784 master-0 kubenswrapper[33572]: I1204 22:35:02.322712 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:35:02.380046 master-0 kubenswrapper[33572]: I1204 22:35:02.379962 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-j89sq"] Dec 04 22:35:02.550659 master-0 kubenswrapper[33572]: I1204 22:35:02.550540 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d11eba-99b8-4bb2-9d30-652d86c553fa" path="/var/lib/kubelet/pods/e5d11eba-99b8-4bb2-9d30-652d86c553fa/volumes" Dec 04 22:35:03.013876 master-0 kubenswrapper[33572]: I1204 22:35:03.013829 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"4916f987-e677-4aee-b52c-88534ce7b28b","Type":"ContainerStarted","Data":"39c7519c41cb432277faa3e4c3b52c354404031a6488e3cb4f0175ea369923a3"} Dec 04 22:35:03.014272 master-0 kubenswrapper[33572]: I1204 22:35:03.013882 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"4916f987-e677-4aee-b52c-88534ce7b28b","Type":"ContainerStarted","Data":"5e33ffa1d70bdec93611e0538c3d57febf953d81f629014db6972c2187030c9f"} Dec 04 22:35:03.019285 master-0 kubenswrapper[33572]: I1204 22:35:03.019254 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ms5vx" event={"ID":"8cdb5e15-32e4-4257-be61-ec0ba1a6884e","Type":"ContainerStarted","Data":"930a9c223992f2d6a73d76abfb2e99eb730dd3718df76150ecd3d064b4b416c7"} Dec 04 22:35:03.033491 master-0 kubenswrapper[33572]: I1204 22:35:03.033409 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j89sq" event={"ID":"320dd132-b10a-4d56-88a4-307ecb61196f","Type":"ContainerStarted","Data":"acc14b8a76e105174a5a9026f416a9ef37d4daac3e621df587edb759ea99c558"} Dec 04 22:35:03.051693 master-0 kubenswrapper[33572]: I1204 22:35:03.051618 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ms5vx" podStartSLOduration=3.488103192 podStartE2EDuration="8.051599055s" podCreationTimestamp="2025-12-04 22:34:55 +0000 UTC" firstStartedPulling="2025-12-04 22:34:57.221304081 +0000 UTC m=+960.948829730" lastFinishedPulling="2025-12-04 22:35:01.784799944 +0000 UTC m=+965.512325593" observedRunningTime="2025-12-04 22:35:03.048007785 +0000 UTC m=+966.775533474" watchObservedRunningTime="2025-12-04 22:35:03.051599055 +0000 UTC m=+966.779124704" Dec 04 22:35:03.543040 master-0 kubenswrapper[33572]: I1204 22:35:03.542975 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:35:03.562016 master-0 kubenswrapper[33572]: I1204 22:35:03.560635 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:03.636579 master-0 kubenswrapper[33572]: I1204 22:35:03.635502 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-combined-ca-bundle\") pod \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " Dec 04 22:35:03.636579 master-0 kubenswrapper[33572]: I1204 22:35:03.635634 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-scripts\") pod \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " Dec 04 22:35:03.636579 master-0 kubenswrapper[33572]: I1204 22:35:03.635685 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-credential-keys\") pod \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " Dec 04 22:35:03.636579 master-0 kubenswrapper[33572]: I1204 22:35:03.635714 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-fernet-keys\") pod \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " Dec 04 22:35:03.636579 master-0 kubenswrapper[33572]: I1204 22:35:03.635781 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfdw4\" (UniqueName: \"kubernetes.io/projected/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-kube-api-access-mfdw4\") pod \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " Dec 04 22:35:03.636579 master-0 kubenswrapper[33572]: I1204 22:35:03.635830 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-config-data\") pod \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\" (UID: \"d66b1f8f-3a86-442f-bdec-422e5a6e03ee\") " Dec 04 22:35:03.668004 master-0 kubenswrapper[33572]: I1204 22:35:03.667862 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-kube-api-access-mfdw4" (OuterVolumeSpecName: "kube-api-access-mfdw4") pod "d66b1f8f-3a86-442f-bdec-422e5a6e03ee" (UID: "d66b1f8f-3a86-442f-bdec-422e5a6e03ee"). InnerVolumeSpecName "kube-api-access-mfdw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:03.695777 master-0 kubenswrapper[33572]: I1204 22:35:03.687050 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d66b1f8f-3a86-442f-bdec-422e5a6e03ee" (UID: "d66b1f8f-3a86-442f-bdec-422e5a6e03ee"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:03.695777 master-0 kubenswrapper[33572]: I1204 22:35:03.694671 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d66b1f8f-3a86-442f-bdec-422e5a6e03ee" (UID: "d66b1f8f-3a86-442f-bdec-422e5a6e03ee"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:03.696035 master-0 kubenswrapper[33572]: I1204 22:35:03.695911 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-scripts" (OuterVolumeSpecName: "scripts") pod "d66b1f8f-3a86-442f-bdec-422e5a6e03ee" (UID: "d66b1f8f-3a86-442f-bdec-422e5a6e03ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:03.697870 master-0 kubenswrapper[33572]: I1204 22:35:03.696541 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:03.698211 master-0 kubenswrapper[33572]: I1204 22:35:03.698145 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-config-data" (OuterVolumeSpecName: "config-data") pod "d66b1f8f-3a86-442f-bdec-422e5a6e03ee" (UID: "d66b1f8f-3a86-442f-bdec-422e5a6e03ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:03.700575 master-0 kubenswrapper[33572]: I1204 22:35:03.699720 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d66b1f8f-3a86-442f-bdec-422e5a6e03ee" (UID: "d66b1f8f-3a86-442f-bdec-422e5a6e03ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:03.740380 master-0 kubenswrapper[33572]: I1204 22:35:03.740324 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:03.740380 master-0 kubenswrapper[33572]: I1204 22:35:03.740365 33572 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-credential-keys\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:03.740380 master-0 kubenswrapper[33572]: I1204 22:35:03.740376 33572 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-fernet-keys\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:03.740380 master-0 kubenswrapper[33572]: I1204 22:35:03.740387 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfdw4\" (UniqueName: \"kubernetes.io/projected/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-kube-api-access-mfdw4\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:03.740380 master-0 kubenswrapper[33572]: I1204 22:35:03.740399 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:03.740776 master-0 kubenswrapper[33572]: I1204 22:35:03.740407 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d66b1f8f-3a86-442f-bdec-422e5a6e03ee-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:04.069250 master-0 kubenswrapper[33572]: I1204 22:35:04.069201 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b7zzl" event={"ID":"d66b1f8f-3a86-442f-bdec-422e5a6e03ee","Type":"ContainerDied","Data":"be4c08f6200e18a631c7a9bfef072f6b1c1e4f4f14b95bb4b4ae17aebd07ea90"} Dec 04 22:35:04.069250 master-0 kubenswrapper[33572]: I1204 22:35:04.069251 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be4c08f6200e18a631c7a9bfef072f6b1c1e4f4f14b95bb4b4ae17aebd07ea90" Dec 04 22:35:04.071706 master-0 kubenswrapper[33572]: I1204 22:35:04.071685 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b7zzl" Dec 04 22:35:04.084601 master-0 kubenswrapper[33572]: I1204 22:35:04.077524 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"4916f987-e677-4aee-b52c-88534ce7b28b","Type":"ContainerStarted","Data":"7c587e73131d9e3b574fc84f4d7a73ef757331f5e2f3fd6e2e7459e88734982d"} Dec 04 22:35:04.119806 master-0 kubenswrapper[33572]: I1204 22:35:04.110120 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-7675d-default-internal-api-0" podStartSLOduration=6.110101438 podStartE2EDuration="6.110101438s" podCreationTimestamp="2025-12-04 22:34:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:04.103803204 +0000 UTC m=+967.831328863" watchObservedRunningTime="2025-12-04 22:35:04.110101438 +0000 UTC m=+967.837627087" Dec 04 22:35:04.136166 master-0 kubenswrapper[33572]: I1204 22:35:04.135883 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b7zzl"] Dec 04 22:35:04.163434 master-0 kubenswrapper[33572]: I1204 22:35:04.163364 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b7zzl"] Dec 04 22:35:04.195985 master-0 kubenswrapper[33572]: I1204 22:35:04.195928 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-l59gq"] Dec 04 22:35:04.196485 master-0 kubenswrapper[33572]: E1204 22:35:04.196453 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d66b1f8f-3a86-442f-bdec-422e5a6e03ee" containerName="keystone-bootstrap" Dec 04 22:35:04.196485 master-0 kubenswrapper[33572]: I1204 22:35:04.196474 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d66b1f8f-3a86-442f-bdec-422e5a6e03ee" containerName="keystone-bootstrap" Dec 04 22:35:04.196765 master-0 kubenswrapper[33572]: I1204 22:35:04.196730 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="d66b1f8f-3a86-442f-bdec-422e5a6e03ee" containerName="keystone-bootstrap" Dec 04 22:35:04.197452 master-0 kubenswrapper[33572]: I1204 22:35:04.197415 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.201879 master-0 kubenswrapper[33572]: I1204 22:35:04.200903 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 22:35:04.201879 master-0 kubenswrapper[33572]: I1204 22:35:04.200972 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 22:35:04.201879 master-0 kubenswrapper[33572]: I1204 22:35:04.201135 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 22:35:04.212527 master-0 kubenswrapper[33572]: I1204 22:35:04.212452 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l59gq"] Dec 04 22:35:04.261152 master-0 kubenswrapper[33572]: I1204 22:35:04.260826 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-credential-keys\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.261152 master-0 kubenswrapper[33572]: I1204 22:35:04.260885 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-config-data\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.261152 master-0 kubenswrapper[33572]: I1204 22:35:04.260912 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-fernet-keys\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.261152 master-0 kubenswrapper[33572]: I1204 22:35:04.260960 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-scripts\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.261152 master-0 kubenswrapper[33572]: I1204 22:35:04.260993 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-combined-ca-bundle\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.261152 master-0 kubenswrapper[33572]: I1204 22:35:04.261011 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x8fq\" (UniqueName: \"kubernetes.io/projected/58d5cab4-61b9-4503-8e64-09f107844457-kube-api-access-6x8fq\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.309744 master-0 kubenswrapper[33572]: I1204 22:35:04.309681 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:35:04.317653 master-0 kubenswrapper[33572]: W1204 22:35:04.315871 33572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbe730dd_7824_4438_981d_bf5429987895.slice/crio-475d9fedb2eaaf45ff8375b7b0f4cf62b367259bcc2edaff25997421da71fc44 WatchSource:0}: Error finding container 475d9fedb2eaaf45ff8375b7b0f4cf62b367259bcc2edaff25997421da71fc44: Status 404 returned error can't find the container with id 475d9fedb2eaaf45ff8375b7b0f4cf62b367259bcc2edaff25997421da71fc44 Dec 04 22:35:04.363935 master-0 kubenswrapper[33572]: I1204 22:35:04.363888 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-credential-keys\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.364191 master-0 kubenswrapper[33572]: I1204 22:35:04.364176 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-config-data\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.364399 master-0 kubenswrapper[33572]: I1204 22:35:04.364366 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-fernet-keys\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.364595 master-0 kubenswrapper[33572]: I1204 22:35:04.364575 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-scripts\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.364774 master-0 kubenswrapper[33572]: I1204 22:35:04.364758 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-combined-ca-bundle\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.364864 master-0 kubenswrapper[33572]: I1204 22:35:04.364851 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x8fq\" (UniqueName: \"kubernetes.io/projected/58d5cab4-61b9-4503-8e64-09f107844457-kube-api-access-6x8fq\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.368934 master-0 kubenswrapper[33572]: I1204 22:35:04.368863 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-combined-ca-bundle\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.369434 master-0 kubenswrapper[33572]: I1204 22:35:04.369386 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-config-data\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 
22:35:04.370001 master-0 kubenswrapper[33572]: I1204 22:35:04.369961 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-credential-keys\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.370386 master-0 kubenswrapper[33572]: I1204 22:35:04.370344 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-fernet-keys\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.371411 master-0 kubenswrapper[33572]: I1204 22:35:04.371385 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-scripts\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.383441 master-0 kubenswrapper[33572]: I1204 22:35:04.383377 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x8fq\" (UniqueName: \"kubernetes.io/projected/58d5cab4-61b9-4503-8e64-09f107844457-kube-api-access-6x8fq\") pod \"keystone-bootstrap-l59gq\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.529527 master-0 kubenswrapper[33572]: I1204 22:35:04.529461 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:04.548806 master-0 kubenswrapper[33572]: I1204 22:35:04.548706 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d66b1f8f-3a86-442f-bdec-422e5a6e03ee" path="/var/lib/kubelet/pods/d66b1f8f-3a86-442f-bdec-422e5a6e03ee/volumes" Dec 04 22:35:05.088977 master-0 kubenswrapper[33572]: I1204 22:35:05.088833 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" event={"ID":"bbe730dd-7824-4438-981d-bf5429987895","Type":"ContainerStarted","Data":"25115e1833d783845b249aff27c8ae862e9f9351ccb20cdef6a73ad6983c1391"} Dec 04 22:35:05.088977 master-0 kubenswrapper[33572]: I1204 22:35:05.088905 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" event={"ID":"bbe730dd-7824-4438-981d-bf5429987895","Type":"ContainerStarted","Data":"475d9fedb2eaaf45ff8375b7b0f4cf62b367259bcc2edaff25997421da71fc44"} Dec 04 22:35:05.440402 master-0 kubenswrapper[33572]: I1204 22:35:05.440326 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l59gq"] Dec 04 22:35:06.109886 master-0 kubenswrapper[33572]: I1204 22:35:06.109682 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" event={"ID":"bbe730dd-7824-4438-981d-bf5429987895","Type":"ContainerStarted","Data":"1c8639fd70edb8891c15b7cf485434414e45dca658c9c6f2ba6d87161ec6ec70"} Dec 04 22:35:06.112625 master-0 kubenswrapper[33572]: I1204 22:35:06.112559 33572 generic.go:334] "Generic (PLEG): container finished" podID="8cdb5e15-32e4-4257-be61-ec0ba1a6884e" containerID="930a9c223992f2d6a73d76abfb2e99eb730dd3718df76150ecd3d064b4b416c7" exitCode=0 Dec 04 22:35:06.112625 master-0 kubenswrapper[33572]: I1204 22:35:06.112612 33572 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-db-sync-ms5vx" event={"ID":"8cdb5e15-32e4-4257-be61-ec0ba1a6884e","Type":"ContainerDied","Data":"930a9c223992f2d6a73d76abfb2e99eb730dd3718df76150ecd3d064b4b416c7"} Dec 04 22:35:06.148652 master-0 kubenswrapper[33572]: I1204 22:35:06.147251 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-7675d-default-external-api-0" podStartSLOduration=5.147234538 podStartE2EDuration="5.147234538s" podCreationTimestamp="2025-12-04 22:35:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:06.145255864 +0000 UTC m=+969.872781563" watchObservedRunningTime="2025-12-04 22:35:06.147234538 +0000 UTC m=+969.874760187" Dec 04 22:35:06.427931 master-0 kubenswrapper[33572]: I1204 22:35:06.427795 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:35:06.520242 master-0 kubenswrapper[33572]: I1204 22:35:06.520198 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp"] Dec 04 22:35:06.520471 master-0 kubenswrapper[33572]: I1204 22:35:06.520445 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerName="dnsmasq-dns" containerID="cri-o://4114a176b9e0073b647cd0952384b11b473ec0f42e83a423a441a422d5f0900b" gracePeriod=10 Dec 04 22:35:07.133149 master-0 kubenswrapper[33572]: I1204 22:35:07.132605 33572 generic.go:334] "Generic (PLEG): container finished" podID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerID="4114a176b9e0073b647cd0952384b11b473ec0f42e83a423a441a422d5f0900b" exitCode=0 Dec 04 22:35:07.133149 master-0 kubenswrapper[33572]: I1204 22:35:07.132721 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" event={"ID":"7fd26913-95c6-4bbe-a234-67d8b5544c80","Type":"ContainerDied","Data":"4114a176b9e0073b647cd0952384b11b473ec0f42e83a423a441a422d5f0900b"} Dec 04 22:35:08.172316 master-0 kubenswrapper[33572]: I1204 22:35:08.172245 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.204:5353: connect: connection refused" Dec 04 22:35:11.596958 master-0 kubenswrapper[33572]: I1204 22:35:11.595790 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:11.596958 master-0 kubenswrapper[33572]: I1204 22:35:11.595850 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:11.630819 master-0 kubenswrapper[33572]: I1204 22:35:11.630463 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:11.658456 master-0 kubenswrapper[33572]: I1204 22:35:11.658406 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:12.196661 master-0 kubenswrapper[33572]: I1204 22:35:12.196590 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:12.196661 master-0 kubenswrapper[33572]: I1204 
22:35:12.196656 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:13.179566 master-0 kubenswrapper[33572]: I1204 22:35:13.179450 33572 scope.go:117] "RemoveContainer" containerID="7812d950dc2a5238df5e790ede80d928a2f3777e9a6203d7fd80ddc09d5bafae" Dec 04 22:35:13.697313 master-0 kubenswrapper[33572]: I1204 22:35:13.697260 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:13.697313 master-0 kubenswrapper[33572]: I1204 22:35:13.697310 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:13.736211 master-0 kubenswrapper[33572]: I1204 22:35:13.736167 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:13.744584 master-0 kubenswrapper[33572]: I1204 22:35:13.744017 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:14.201096 master-0 kubenswrapper[33572]: I1204 22:35:14.201028 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:14.205865 master-0 kubenswrapper[33572]: I1204 22:35:14.205781 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:35:14.218908 master-0 kubenswrapper[33572]: I1204 22:35:14.218855 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:14.218908 master-0 kubenswrapper[33572]: I1204 22:35:14.218900 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:16.258895 master-0 kubenswrapper[33572]: I1204 22:35:16.256879 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:16.258895 master-0 kubenswrapper[33572]: I1204 22:35:16.256998 33572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:35:16.265633 master-0 kubenswrapper[33572]: I1204 22:35:16.265588 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:35:17.157004 master-0 kubenswrapper[33572]: W1204 22:35:17.156916 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58d5cab4_61b9_4503_8e64_09f107844457.slice/crio-9c237d5e207de9c7776d152ef132d00bd840a5f47b135443f1fb18a8a6ce012b WatchSource:0}: Error finding container 9c237d5e207de9c7776d152ef132d00bd840a5f47b135443f1fb18a8a6ce012b: Status 404 returned error can't find the container with id 9c237d5e207de9c7776d152ef132d00bd840a5f47b135443f1fb18a8a6ce012b Dec 04 22:35:17.277767 master-0 kubenswrapper[33572]: I1204 22:35:17.277676 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" event={"ID":"7fd26913-95c6-4bbe-a234-67d8b5544c80","Type":"ContainerDied","Data":"f5225067757a90b91bb3e324518f389bf52deef06deb67e65ca168e0282042e7"} Dec 04 22:35:17.277767 master-0 kubenswrapper[33572]: I1204 22:35:17.277758 33572 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f5225067757a90b91bb3e324518f389bf52deef06deb67e65ca168e0282042e7" Dec 04 22:35:17.285715 master-0 kubenswrapper[33572]: I1204 22:35:17.285670 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ms5vx" event={"ID":"8cdb5e15-32e4-4257-be61-ec0ba1a6884e","Type":"ContainerDied","Data":"2cf56dddc2abb084322e094fa18d088ee51a6200b47ed45fbd471da397b778d6"} Dec 04 22:35:17.285715 master-0 kubenswrapper[33572]: I1204 22:35:17.285711 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cf56dddc2abb084322e094fa18d088ee51a6200b47ed45fbd471da397b778d6" Dec 04 22:35:17.287749 master-0 kubenswrapper[33572]: I1204 22:35:17.287719 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l59gq" event={"ID":"58d5cab4-61b9-4503-8e64-09f107844457","Type":"ContainerStarted","Data":"9c237d5e207de9c7776d152ef132d00bd840a5f47b135443f1fb18a8a6ce012b"} Dec 04 22:35:17.302057 master-0 kubenswrapper[33572]: I1204 22:35:17.301996 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ms5vx" Dec 04 22:35:17.312248 master-0 kubenswrapper[33572]: I1204 22:35:17.312200 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:35:17.481610 master-0 kubenswrapper[33572]: I1204 22:35:17.481449 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-config-data\") pod \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " Dec 04 22:35:17.481827 master-0 kubenswrapper[33572]: I1204 22:35:17.481622 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-logs\") pod \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " Dec 04 22:35:17.481827 master-0 kubenswrapper[33572]: I1204 22:35:17.481655 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-config\") pod \"7fd26913-95c6-4bbe-a234-67d8b5544c80\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " Dec 04 22:35:17.481827 master-0 kubenswrapper[33572]: I1204 22:35:17.481691 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94rs6\" (UniqueName: \"kubernetes.io/projected/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-kube-api-access-94rs6\") pod \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " Dec 04 22:35:17.481827 master-0 kubenswrapper[33572]: I1204 22:35:17.481742 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-svc\") pod \"7fd26913-95c6-4bbe-a234-67d8b5544c80\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " Dec 04 22:35:17.481996 master-0 kubenswrapper[33572]: I1204 22:35:17.481871 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkdtz\" (UniqueName: \"kubernetes.io/projected/7fd26913-95c6-4bbe-a234-67d8b5544c80-kube-api-access-xkdtz\") pod \"7fd26913-95c6-4bbe-a234-67d8b5544c80\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " Dec 04 
22:35:17.481996 master-0 kubenswrapper[33572]: I1204 22:35:17.481914 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-sb\") pod \"7fd26913-95c6-4bbe-a234-67d8b5544c80\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " Dec 04 22:35:17.481996 master-0 kubenswrapper[33572]: I1204 22:35:17.481939 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-swift-storage-0\") pod \"7fd26913-95c6-4bbe-a234-67d8b5544c80\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " Dec 04 22:35:17.481996 master-0 kubenswrapper[33572]: I1204 22:35:17.481974 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-combined-ca-bundle\") pod \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " Dec 04 22:35:17.482141 master-0 kubenswrapper[33572]: I1204 22:35:17.481999 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-nb\") pod \"7fd26913-95c6-4bbe-a234-67d8b5544c80\" (UID: \"7fd26913-95c6-4bbe-a234-67d8b5544c80\") " Dec 04 22:35:17.482141 master-0 kubenswrapper[33572]: I1204 22:35:17.482075 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-logs" (OuterVolumeSpecName: "logs") pod "8cdb5e15-32e4-4257-be61-ec0ba1a6884e" (UID: "8cdb5e15-32e4-4257-be61-ec0ba1a6884e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:35:17.483387 master-0 kubenswrapper[33572]: I1204 22:35:17.482768 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-scripts\") pod \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\" (UID: \"8cdb5e15-32e4-4257-be61-ec0ba1a6884e\") " Dec 04 22:35:17.483934 master-0 kubenswrapper[33572]: I1204 22:35:17.483839 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.486485 master-0 kubenswrapper[33572]: I1204 22:35:17.486408 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-scripts" (OuterVolumeSpecName: "scripts") pod "8cdb5e15-32e4-4257-be61-ec0ba1a6884e" (UID: "8cdb5e15-32e4-4257-be61-ec0ba1a6884e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:17.487200 master-0 kubenswrapper[33572]: I1204 22:35:17.487148 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd26913-95c6-4bbe-a234-67d8b5544c80-kube-api-access-xkdtz" (OuterVolumeSpecName: "kube-api-access-xkdtz") pod "7fd26913-95c6-4bbe-a234-67d8b5544c80" (UID: "7fd26913-95c6-4bbe-a234-67d8b5544c80"). InnerVolumeSpecName "kube-api-access-xkdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:17.492895 master-0 kubenswrapper[33572]: I1204 22:35:17.492834 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-kube-api-access-94rs6" (OuterVolumeSpecName: "kube-api-access-94rs6") pod "8cdb5e15-32e4-4257-be61-ec0ba1a6884e" (UID: "8cdb5e15-32e4-4257-be61-ec0ba1a6884e"). InnerVolumeSpecName "kube-api-access-94rs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:17.518336 master-0 kubenswrapper[33572]: I1204 22:35:17.517817 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-config-data" (OuterVolumeSpecName: "config-data") pod "8cdb5e15-32e4-4257-be61-ec0ba1a6884e" (UID: "8cdb5e15-32e4-4257-be61-ec0ba1a6884e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:17.543804 master-0 kubenswrapper[33572]: I1204 22:35:17.543757 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cdb5e15-32e4-4257-be61-ec0ba1a6884e" (UID: "8cdb5e15-32e4-4257-be61-ec0ba1a6884e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:17.552134 master-0 kubenswrapper[33572]: I1204 22:35:17.548679 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7fd26913-95c6-4bbe-a234-67d8b5544c80" (UID: "7fd26913-95c6-4bbe-a234-67d8b5544c80"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:17.552134 master-0 kubenswrapper[33572]: I1204 22:35:17.551708 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fd26913-95c6-4bbe-a234-67d8b5544c80" (UID: "7fd26913-95c6-4bbe-a234-67d8b5544c80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:17.565596 master-0 kubenswrapper[33572]: I1204 22:35:17.565290 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7fd26913-95c6-4bbe-a234-67d8b5544c80" (UID: "7fd26913-95c6-4bbe-a234-67d8b5544c80"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:17.567137 master-0 kubenswrapper[33572]: I1204 22:35:17.567087 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-config" (OuterVolumeSpecName: "config") pod "7fd26913-95c6-4bbe-a234-67d8b5544c80" (UID: "7fd26913-95c6-4bbe-a234-67d8b5544c80"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:17.572311 master-0 kubenswrapper[33572]: I1204 22:35:17.572253 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7fd26913-95c6-4bbe-a234-67d8b5544c80" (UID: "7fd26913-95c6-4bbe-a234-67d8b5544c80"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:17.589092 master-0 kubenswrapper[33572]: I1204 22:35:17.589016 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589092 master-0 kubenswrapper[33572]: I1204 22:35:17.589083 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589092 master-0 kubenswrapper[33572]: I1204 22:35:17.589102 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589352 master-0 kubenswrapper[33572]: I1204 22:35:17.589114 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589352 master-0 kubenswrapper[33572]: I1204 22:35:17.589128 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589352 master-0 kubenswrapper[33572]: I1204 22:35:17.589139 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589352 master-0 kubenswrapper[33572]: I1204 22:35:17.589151 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589352 master-0 kubenswrapper[33572]: I1204 22:35:17.589285 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94rs6\" (UniqueName: \"kubernetes.io/projected/8cdb5e15-32e4-4257-be61-ec0ba1a6884e-kube-api-access-94rs6\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589352 master-0 kubenswrapper[33572]: I1204 22:35:17.589352 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fd26913-95c6-4bbe-a234-67d8b5544c80-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:17.589606 master-0 kubenswrapper[33572]: I1204 22:35:17.589365 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkdtz\" (UniqueName: \"kubernetes.io/projected/7fd26913-95c6-4bbe-a234-67d8b5544c80-kube-api-access-xkdtz\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:18.172625 master-0 kubenswrapper[33572]: I1204 22:35:18.172565 33572 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.204:5353: i/o timeout" Dec 04 22:35:18.317475 master-0 kubenswrapper[33572]: I1204 22:35:18.317405 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ms5vx" Dec 04 22:35:18.318034 master-0 kubenswrapper[33572]: I1204 22:35:18.317421 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp" Dec 04 22:35:18.378532 master-0 kubenswrapper[33572]: I1204 22:35:18.378478 33572 scope.go:117] "RemoveContainer" containerID="7bce740548513a35b1fba24a420883b2a7523f5a54ed583acb5349a57f7217f4" Dec 04 22:35:18.382879 master-0 kubenswrapper[33572]: I1204 22:35:18.382841 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp"] Dec 04 22:35:18.395681 master-0 kubenswrapper[33572]: I1204 22:35:18.395609 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f5b6cc9d9-fl8vp"] Dec 04 22:35:18.569070 master-0 kubenswrapper[33572]: I1204 22:35:18.569028 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" path="/var/lib/kubelet/pods/7fd26913-95c6-4bbe-a234-67d8b5544c80/volumes" Dec 04 22:35:18.570446 master-0 kubenswrapper[33572]: I1204 22:35:18.570405 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-84794cb9bb-xw9hq"] Dec 04 22:35:18.570800 master-0 kubenswrapper[33572]: E1204 22:35:18.570783 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerName="init" Dec 04 22:35:18.570864 master-0 kubenswrapper[33572]: I1204 22:35:18.570801 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerName="init" Dec 04 22:35:18.570864 master-0 kubenswrapper[33572]: E1204 22:35:18.570833 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerName="dnsmasq-dns" Dec 04 22:35:18.570864 master-0 kubenswrapper[33572]: I1204 22:35:18.570840 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerName="dnsmasq-dns" Dec 04 22:35:18.570864 master-0 kubenswrapper[33572]: E1204 22:35:18.570860 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cdb5e15-32e4-4257-be61-ec0ba1a6884e" containerName="placement-db-sync" Dec 04 22:35:18.570864 master-0 kubenswrapper[33572]: I1204 22:35:18.570867 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cdb5e15-32e4-4257-be61-ec0ba1a6884e" containerName="placement-db-sync" Dec 04 22:35:18.571100 master-0 kubenswrapper[33572]: I1204 22:35:18.571081 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cdb5e15-32e4-4257-be61-ec0ba1a6884e" containerName="placement-db-sync" Dec 04 22:35:18.571160 master-0 kubenswrapper[33572]: I1204 22:35:18.571126 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd26913-95c6-4bbe-a234-67d8b5544c80" containerName="dnsmasq-dns" Dec 04 22:35:18.585803 master-0 kubenswrapper[33572]: I1204 22:35:18.585743 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84794cb9bb-xw9hq"] Dec 04 22:35:18.590362 master-0 kubenswrapper[33572]: I1204 22:35:18.590303 33572 pod_container_manager_linux.go:210] 
"Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd2a41736-e271-4aef-a67f-77937bc3e446"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd2a41736-e271-4aef-a67f-77937bc3e446] : Timed out while waiting for systemd to remove kubepods-besteffort-podd2a41736_e271_4aef_a67f_77937bc3e446.slice" Dec 04 22:35:18.592880 master-0 kubenswrapper[33572]: I1204 22:35:18.592837 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.595072 master-0 kubenswrapper[33572]: I1204 22:35:18.595035 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 04 22:35:18.595241 master-0 kubenswrapper[33572]: I1204 22:35:18.595218 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 04 22:35:18.601815 master-0 kubenswrapper[33572]: I1204 22:35:18.601791 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 22:35:18.602563 master-0 kubenswrapper[33572]: I1204 22:35:18.602079 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 22:35:18.614459 master-0 kubenswrapper[33572]: I1204 22:35:18.614431 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-public-tls-certs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.616480 master-0 kubenswrapper[33572]: I1204 22:35:18.616459 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjvsc\" (UniqueName: \"kubernetes.io/projected/ba11d04b-2da6-4359-b722-74134acd704e-kube-api-access-bjvsc\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.616839 master-0 kubenswrapper[33572]: I1204 22:35:18.616822 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-combined-ca-bundle\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.617004 master-0 kubenswrapper[33572]: I1204 22:35:18.616987 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-scripts\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.617256 master-0 kubenswrapper[33572]: I1204 22:35:18.617238 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-internal-tls-certs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.617710 master-0 kubenswrapper[33572]: I1204 22:35:18.617694 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ba11d04b-2da6-4359-b722-74134acd704e-logs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.618031 master-0 kubenswrapper[33572]: I1204 22:35:18.618017 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-config-data\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.721909 master-0 kubenswrapper[33572]: I1204 22:35:18.721786 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-internal-tls-certs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.721909 master-0 kubenswrapper[33572]: I1204 22:35:18.721881 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba11d04b-2da6-4359-b722-74134acd704e-logs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.721909 master-0 kubenswrapper[33572]: I1204 22:35:18.721905 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-config-data\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.722178 master-0 kubenswrapper[33572]: I1204 22:35:18.721976 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-public-tls-certs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.722178 master-0 kubenswrapper[33572]: I1204 22:35:18.722004 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvsc\" (UniqueName: \"kubernetes.io/projected/ba11d04b-2da6-4359-b722-74134acd704e-kube-api-access-bjvsc\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.722178 master-0 kubenswrapper[33572]: I1204 22:35:18.722024 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-combined-ca-bundle\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.722178 master-0 kubenswrapper[33572]: I1204 22:35:18.722042 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-scripts\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.722965 master-0 kubenswrapper[33572]: I1204 22:35:18.722939 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ba11d04b-2da6-4359-b722-74134acd704e-logs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.726527 master-0 kubenswrapper[33572]: I1204 22:35:18.726485 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-public-tls-certs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.726866 master-0 kubenswrapper[33572]: I1204 22:35:18.726847 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-internal-tls-certs\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.727437 master-0 kubenswrapper[33572]: I1204 22:35:18.727289 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-scripts\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.727541 master-0 kubenswrapper[33572]: I1204 22:35:18.727393 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-combined-ca-bundle\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.731317 master-0 kubenswrapper[33572]: I1204 22:35:18.731283 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba11d04b-2da6-4359-b722-74134acd704e-config-data\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:18.741164 master-0 kubenswrapper[33572]: I1204 22:35:18.741119 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjvsc\" (UniqueName: \"kubernetes.io/projected/ba11d04b-2da6-4359-b722-74134acd704e-kube-api-access-bjvsc\") pod \"placement-84794cb9bb-xw9hq\" (UID: \"ba11d04b-2da6-4359-b722-74134acd704e\") " pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:19.038164 master-0 kubenswrapper[33572]: I1204 22:35:19.038052 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:19.331883 master-0 kubenswrapper[33572]: I1204 22:35:19.330950 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-db-sync-d9l4w" event={"ID":"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2","Type":"ContainerStarted","Data":"4027c31e26a3a9c5fe6bcba8d9a071357e11963e4c240501d1c05241a7ed24bd"} Dec 04 22:35:19.333427 master-0 kubenswrapper[33572]: I1204 22:35:19.333316 33572 generic.go:334] "Generic (PLEG): container finished" podID="320dd132-b10a-4d56-88a4-307ecb61196f" containerID="f348291203168a853e748fd118c52a1a8854a4b78ba57f8f0ea43f0ebaae2bbc" exitCode=0 Dec 04 22:35:19.333427 master-0 kubenswrapper[33572]: I1204 22:35:19.333386 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j89sq" event={"ID":"320dd132-b10a-4d56-88a4-307ecb61196f","Type":"ContainerDied","Data":"f348291203168a853e748fd118c52a1a8854a4b78ba57f8f0ea43f0ebaae2bbc"} Dec 04 22:35:19.335860 master-0 kubenswrapper[33572]: I1204 22:35:19.335808 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l59gq" event={"ID":"58d5cab4-61b9-4503-8e64-09f107844457","Type":"ContainerStarted","Data":"84422d6e37213b635208a37b386a19d499b708db49e9be10ed48145dcfa136bd"} Dec 04 22:35:19.360453 master-0 kubenswrapper[33572]: I1204 22:35:19.360380 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-db-sync-d9l4w" podStartSLOduration=2.934451175 podStartE2EDuration="24.360353744s" podCreationTimestamp="2025-12-04 22:34:55 +0000 UTC" firstStartedPulling="2025-12-04 22:34:57.178848214 +0000 UTC m=+960.906373863" lastFinishedPulling="2025-12-04 22:35:18.604750783 +0000 UTC m=+982.332276432" observedRunningTime="2025-12-04 22:35:19.354234104 +0000 UTC m=+983.081759753" watchObservedRunningTime="2025-12-04 22:35:19.360353744 +0000 UTC m=+983.087879403" Dec 04 22:35:19.421104 master-0 kubenswrapper[33572]: I1204 22:35:19.421015 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-l59gq" podStartSLOduration=15.420990746 podStartE2EDuration="15.420990746s" podCreationTimestamp="2025-12-04 22:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:19.384699479 +0000 UTC m=+983.112225169" watchObservedRunningTime="2025-12-04 22:35:19.420990746 +0000 UTC m=+983.148516405" Dec 04 22:35:19.502327 master-0 kubenswrapper[33572]: I1204 22:35:19.500901 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84794cb9bb-xw9hq"] Dec 04 22:35:20.356545 master-0 kubenswrapper[33572]: I1204 22:35:20.356456 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j89sq" event={"ID":"320dd132-b10a-4d56-88a4-307ecb61196f","Type":"ContainerStarted","Data":"5ee780f89386259b20927daa58537ef5615e819e02fafc58790a9eea8e5f1738"} Dec 04 22:35:20.359831 master-0 kubenswrapper[33572]: I1204 22:35:20.358950 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84794cb9bb-xw9hq" event={"ID":"ba11d04b-2da6-4359-b722-74134acd704e","Type":"ContainerStarted","Data":"333a1a47d29d0499f2be822ffb47baf2e55aaed1bb0186b01065905b561ceb00"} Dec 04 22:35:20.359831 master-0 kubenswrapper[33572]: I1204 22:35:20.358998 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84794cb9bb-xw9hq" 
event={"ID":"ba11d04b-2da6-4359-b722-74134acd704e","Type":"ContainerStarted","Data":"87aece9fec6d5b0e721fd3ae1dc20f128aedae53f218b66cb489c41032213068"} Dec 04 22:35:20.359831 master-0 kubenswrapper[33572]: I1204 22:35:20.359008 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84794cb9bb-xw9hq" event={"ID":"ba11d04b-2da6-4359-b722-74134acd704e","Type":"ContainerStarted","Data":"8edc03766bfed785553feda199166b79208dbea0102629fb896ff4b0d5b1eec2"} Dec 04 22:35:20.397567 master-0 kubenswrapper[33572]: I1204 22:35:20.397427 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-j89sq" podStartSLOduration=4.348759949 podStartE2EDuration="20.397407132s" podCreationTimestamp="2025-12-04 22:35:00 +0000 UTC" firstStartedPulling="2025-12-04 22:35:02.384948652 +0000 UTC m=+966.112474311" lastFinishedPulling="2025-12-04 22:35:18.433595835 +0000 UTC m=+982.161121494" observedRunningTime="2025-12-04 22:35:20.392157996 +0000 UTC m=+984.119683635" watchObservedRunningTime="2025-12-04 22:35:20.397407132 +0000 UTC m=+984.124932801" Dec 04 22:35:20.435822 master-0 kubenswrapper[33572]: I1204 22:35:20.435741 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-84794cb9bb-xw9hq" podStartSLOduration=2.435718725 podStartE2EDuration="2.435718725s" podCreationTimestamp="2025-12-04 22:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:20.426299114 +0000 UTC m=+984.153824773" watchObservedRunningTime="2025-12-04 22:35:20.435718725 +0000 UTC m=+984.163244384" Dec 04 22:35:21.371343 master-0 kubenswrapper[33572]: I1204 22:35:21.371219 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:21.371343 master-0 kubenswrapper[33572]: I1204 22:35:21.371305 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:22.384834 master-0 kubenswrapper[33572]: I1204 22:35:22.384770 33572 generic.go:334] "Generic (PLEG): container finished" podID="58d5cab4-61b9-4503-8e64-09f107844457" containerID="84422d6e37213b635208a37b386a19d499b708db49e9be10ed48145dcfa136bd" exitCode=0 Dec 04 22:35:22.386206 master-0 kubenswrapper[33572]: I1204 22:35:22.384832 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l59gq" event={"ID":"58d5cab4-61b9-4503-8e64-09f107844457","Type":"ContainerDied","Data":"84422d6e37213b635208a37b386a19d499b708db49e9be10ed48145dcfa136bd"} Dec 04 22:35:23.970394 master-0 kubenswrapper[33572]: I1204 22:35:23.970322 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:24.051959 master-0 kubenswrapper[33572]: I1204 22:35:24.051855 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-config-data\") pod \"58d5cab4-61b9-4503-8e64-09f107844457\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " Dec 04 22:35:24.052206 master-0 kubenswrapper[33572]: I1204 22:35:24.052164 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-credential-keys\") pod \"58d5cab4-61b9-4503-8e64-09f107844457\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " Dec 04 22:35:24.052375 master-0 kubenswrapper[33572]: I1204 22:35:24.052324 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-fernet-keys\") pod \"58d5cab4-61b9-4503-8e64-09f107844457\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " Dec 04 22:35:24.052466 master-0 kubenswrapper[33572]: I1204 22:35:24.052413 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-scripts\") pod \"58d5cab4-61b9-4503-8e64-09f107844457\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " Dec 04 22:35:24.053575 master-0 kubenswrapper[33572]: I1204 22:35:24.052653 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-combined-ca-bundle\") pod \"58d5cab4-61b9-4503-8e64-09f107844457\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " Dec 04 22:35:24.053575 master-0 kubenswrapper[33572]: I1204 22:35:24.052843 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x8fq\" (UniqueName: \"kubernetes.io/projected/58d5cab4-61b9-4503-8e64-09f107844457-kube-api-access-6x8fq\") pod \"58d5cab4-61b9-4503-8e64-09f107844457\" (UID: \"58d5cab4-61b9-4503-8e64-09f107844457\") " Dec 04 22:35:24.055675 master-0 kubenswrapper[33572]: I1204 22:35:24.055597 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-scripts" (OuterVolumeSpecName: "scripts") pod "58d5cab4-61b9-4503-8e64-09f107844457" (UID: "58d5cab4-61b9-4503-8e64-09f107844457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:24.055894 master-0 kubenswrapper[33572]: I1204 22:35:24.055720 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "58d5cab4-61b9-4503-8e64-09f107844457" (UID: "58d5cab4-61b9-4503-8e64-09f107844457"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:24.057058 master-0 kubenswrapper[33572]: I1204 22:35:24.056962 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d5cab4-61b9-4503-8e64-09f107844457-kube-api-access-6x8fq" (OuterVolumeSpecName: "kube-api-access-6x8fq") pod "58d5cab4-61b9-4503-8e64-09f107844457" (UID: "58d5cab4-61b9-4503-8e64-09f107844457"). 
InnerVolumeSpecName "kube-api-access-6x8fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:24.057793 master-0 kubenswrapper[33572]: I1204 22:35:24.057687 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "58d5cab4-61b9-4503-8e64-09f107844457" (UID: "58d5cab4-61b9-4503-8e64-09f107844457"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:24.108852 master-0 kubenswrapper[33572]: I1204 22:35:24.108676 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-config-data" (OuterVolumeSpecName: "config-data") pod "58d5cab4-61b9-4503-8e64-09f107844457" (UID: "58d5cab4-61b9-4503-8e64-09f107844457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:24.116838 master-0 kubenswrapper[33572]: I1204 22:35:24.116757 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58d5cab4-61b9-4503-8e64-09f107844457" (UID: "58d5cab4-61b9-4503-8e64-09f107844457"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:24.155867 master-0 kubenswrapper[33572]: I1204 22:35:24.155807 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:24.155867 master-0 kubenswrapper[33572]: I1204 22:35:24.155867 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x8fq\" (UniqueName: \"kubernetes.io/projected/58d5cab4-61b9-4503-8e64-09f107844457-kube-api-access-6x8fq\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:24.156272 master-0 kubenswrapper[33572]: I1204 22:35:24.155890 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:24.156272 master-0 kubenswrapper[33572]: I1204 22:35:24.155913 33572 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-credential-keys\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:24.156272 master-0 kubenswrapper[33572]: I1204 22:35:24.155931 33572 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-fernet-keys\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:24.156272 master-0 kubenswrapper[33572]: I1204 22:35:24.155948 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58d5cab4-61b9-4503-8e64-09f107844457-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:24.414721 master-0 kubenswrapper[33572]: I1204 22:35:24.414572 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l59gq" event={"ID":"58d5cab4-61b9-4503-8e64-09f107844457","Type":"ContainerDied","Data":"9c237d5e207de9c7776d152ef132d00bd840a5f47b135443f1fb18a8a6ce012b"} Dec 04 22:35:24.414721 master-0 kubenswrapper[33572]: I1204 22:35:24.414636 33572 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c237d5e207de9c7776d152ef132d00bd840a5f47b135443f1fb18a8a6ce012b" Dec 04 22:35:24.414721 master-0 kubenswrapper[33572]: I1204 22:35:24.414676 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l59gq" Dec 04 22:35:24.547106 master-0 kubenswrapper[33572]: I1204 22:35:24.547039 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fd96d6774-6kg5t"] Dec 04 22:35:24.547687 master-0 kubenswrapper[33572]: E1204 22:35:24.547657 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d5cab4-61b9-4503-8e64-09f107844457" containerName="keystone-bootstrap" Dec 04 22:35:24.547687 master-0 kubenswrapper[33572]: I1204 22:35:24.547678 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d5cab4-61b9-4503-8e64-09f107844457" containerName="keystone-bootstrap" Dec 04 22:35:24.547984 master-0 kubenswrapper[33572]: I1204 22:35:24.547938 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d5cab4-61b9-4503-8e64-09f107844457" containerName="keystone-bootstrap" Dec 04 22:35:24.548668 master-0 kubenswrapper[33572]: I1204 22:35:24.548640 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.556025 master-0 kubenswrapper[33572]: I1204 22:35:24.555950 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 22:35:24.556292 master-0 kubenswrapper[33572]: I1204 22:35:24.556264 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 22:35:24.556665 master-0 kubenswrapper[33572]: I1204 22:35:24.556571 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 04 22:35:24.556748 master-0 kubenswrapper[33572]: I1204 22:35:24.556704 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 22:35:24.563250 master-0 kubenswrapper[33572]: I1204 22:35:24.563175 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 04 22:35:24.573481 master-0 kubenswrapper[33572]: I1204 22:35:24.571275 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-config-data\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.573481 master-0 kubenswrapper[33572]: I1204 22:35:24.571359 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-credential-keys\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.573481 master-0 kubenswrapper[33572]: I1204 22:35:24.571431 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-internal-tls-certs\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.573481 master-0 kubenswrapper[33572]: I1204 
22:35:24.571496 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-fernet-keys\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.573481 master-0 kubenswrapper[33572]: I1204 22:35:24.571548 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-combined-ca-bundle\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.573481 master-0 kubenswrapper[33572]: I1204 22:35:24.571584 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl4j2\" (UniqueName: \"kubernetes.io/projected/883c574b-0140-4a08-a542-1a2444ff7512-kube-api-access-sl4j2\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.573481 master-0 kubenswrapper[33572]: I1204 22:35:24.571628 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-scripts\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.573481 master-0 kubenswrapper[33572]: I1204 22:35:24.571693 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-public-tls-certs\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.612547 master-0 kubenswrapper[33572]: I1204 22:35:24.609134 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fd96d6774-6kg5t"] Dec 04 22:35:24.675616 master-0 kubenswrapper[33572]: I1204 22:35:24.674546 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-fernet-keys\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.675616 master-0 kubenswrapper[33572]: I1204 22:35:24.674598 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-combined-ca-bundle\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.675616 master-0 kubenswrapper[33572]: I1204 22:35:24.674624 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl4j2\" (UniqueName: \"kubernetes.io/projected/883c574b-0140-4a08-a542-1a2444ff7512-kube-api-access-sl4j2\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.675616 master-0 kubenswrapper[33572]: I1204 22:35:24.674657 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-scripts\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.675616 master-0 kubenswrapper[33572]: I1204 22:35:24.674699 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-public-tls-certs\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.675616 master-0 kubenswrapper[33572]: I1204 22:35:24.674768 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-config-data\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.675616 master-0 kubenswrapper[33572]: I1204 22:35:24.674795 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-credential-keys\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.675616 master-0 kubenswrapper[33572]: I1204 22:35:24.674825 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-internal-tls-certs\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.680787 master-0 kubenswrapper[33572]: I1204 22:35:24.678992 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-internal-tls-certs\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.690524 master-0 kubenswrapper[33572]: I1204 22:35:24.685320 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-config-data\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.690524 master-0 kubenswrapper[33572]: I1204 22:35:24.687529 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-combined-ca-bundle\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.694527 master-0 kubenswrapper[33572]: I1204 22:35:24.693111 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-fernet-keys\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.698520 master-0 kubenswrapper[33572]: I1204 22:35:24.696947 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-scripts\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.698520 master-0 kubenswrapper[33572]: I1204 22:35:24.697554 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-public-tls-certs\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.698520 master-0 kubenswrapper[33572]: I1204 22:35:24.697877 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883c574b-0140-4a08-a542-1a2444ff7512-credential-keys\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.707334 master-0 kubenswrapper[33572]: I1204 22:35:24.707286 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl4j2\" (UniqueName: \"kubernetes.io/projected/883c574b-0140-4a08-a542-1a2444ff7512-kube-api-access-sl4j2\") pod \"keystone-fd96d6774-6kg5t\" (UID: \"883c574b-0140-4a08-a542-1a2444ff7512\") " pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:24.874622 master-0 kubenswrapper[33572]: I1204 22:35:24.874525 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:25.210000 master-0 kubenswrapper[33572]: I1204 22:35:25.209915 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fd96d6774-6kg5t"] Dec 04 22:35:25.430466 master-0 kubenswrapper[33572]: I1204 22:35:25.430408 33572 generic.go:334] "Generic (PLEG): container finished" podID="2609a439-7ac0-4253-a74e-e4c90a023832" containerID="f5642154d0543171958cf65f3dc2a6a8cd75c02eb1c3577bddc47c91c3995270" exitCode=0 Dec 04 22:35:25.430859 master-0 kubenswrapper[33572]: I1204 22:35:25.430452 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lxrlk" event={"ID":"2609a439-7ac0-4253-a74e-e4c90a023832","Type":"ContainerDied","Data":"f5642154d0543171958cf65f3dc2a6a8cd75c02eb1c3577bddc47c91c3995270"} Dec 04 22:35:25.433098 master-0 kubenswrapper[33572]: I1204 22:35:25.433054 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fd96d6774-6kg5t" event={"ID":"883c574b-0140-4a08-a542-1a2444ff7512","Type":"ContainerStarted","Data":"a7e98e9c59beeff5727fef57851697c980212d18e76f22456d4ae9e8b8f93801"} Dec 04 22:35:26.458626 master-0 kubenswrapper[33572]: I1204 22:35:26.458456 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fd96d6774-6kg5t" event={"ID":"883c574b-0140-4a08-a542-1a2444ff7512","Type":"ContainerStarted","Data":"72fdddaff4e30c0e759f3d98ed75053c070b5adab000155e6a863aeafebaf4a1"} Dec 04 22:35:26.521804 master-0 kubenswrapper[33572]: I1204 22:35:26.521679 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-fd96d6774-6kg5t" podStartSLOduration=2.521639859 podStartE2EDuration="2.521639859s" podCreationTimestamp="2025-12-04 22:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:26.501539881 +0000 UTC m=+990.229065570" watchObservedRunningTime="2025-12-04 22:35:26.521639859 +0000 UTC m=+990.249165598" 
Dec 04 22:35:26.936190 master-0 kubenswrapper[33572]: I1204 22:35:26.936128 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:35:26.980524 master-0 kubenswrapper[33572]: I1204 22:35:26.980398 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-config\") pod \"2609a439-7ac0-4253-a74e-e4c90a023832\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " Dec 04 22:35:26.980851 master-0 kubenswrapper[33572]: I1204 22:35:26.980649 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m92n\" (UniqueName: \"kubernetes.io/projected/2609a439-7ac0-4253-a74e-e4c90a023832-kube-api-access-9m92n\") pod \"2609a439-7ac0-4253-a74e-e4c90a023832\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " Dec 04 22:35:26.981051 master-0 kubenswrapper[33572]: I1204 22:35:26.980998 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-combined-ca-bundle\") pod \"2609a439-7ac0-4253-a74e-e4c90a023832\" (UID: \"2609a439-7ac0-4253-a74e-e4c90a023832\") " Dec 04 22:35:26.995248 master-0 kubenswrapper[33572]: I1204 22:35:26.994874 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2609a439-7ac0-4253-a74e-e4c90a023832-kube-api-access-9m92n" (OuterVolumeSpecName: "kube-api-access-9m92n") pod "2609a439-7ac0-4253-a74e-e4c90a023832" (UID: "2609a439-7ac0-4253-a74e-e4c90a023832"). InnerVolumeSpecName "kube-api-access-9m92n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:27.033618 master-0 kubenswrapper[33572]: I1204 22:35:27.033562 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-config" (OuterVolumeSpecName: "config") pod "2609a439-7ac0-4253-a74e-e4c90a023832" (UID: "2609a439-7ac0-4253-a74e-e4c90a023832"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:27.036275 master-0 kubenswrapper[33572]: I1204 22:35:27.036202 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2609a439-7ac0-4253-a74e-e4c90a023832" (UID: "2609a439-7ac0-4253-a74e-e4c90a023832"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:27.084037 master-0 kubenswrapper[33572]: I1204 22:35:27.083930 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:27.084037 master-0 kubenswrapper[33572]: I1204 22:35:27.083968 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2609a439-7ac0-4253-a74e-e4c90a023832-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:27.084037 master-0 kubenswrapper[33572]: I1204 22:35:27.083979 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m92n\" (UniqueName: \"kubernetes.io/projected/2609a439-7ac0-4253-a74e-e4c90a023832-kube-api-access-9m92n\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:27.484660 master-0 kubenswrapper[33572]: I1204 22:35:27.484316 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-lxrlk" Dec 04 22:35:27.494022 master-0 kubenswrapper[33572]: I1204 22:35:27.493813 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-lxrlk" event={"ID":"2609a439-7ac0-4253-a74e-e4c90a023832","Type":"ContainerDied","Data":"6e082596aabe4ae5272edacf962d17659440664e3ccb24f2c8eacfe799775895"} Dec 04 22:35:27.494022 master-0 kubenswrapper[33572]: I1204 22:35:27.493922 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e082596aabe4ae5272edacf962d17659440664e3ccb24f2c8eacfe799775895" Dec 04 22:35:27.494022 master-0 kubenswrapper[33572]: I1204 22:35:27.493973 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:27.609036 master-0 kubenswrapper[33572]: E1204 22:35:27.608886 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2609a439_7ac0_4253_a74e_e4c90a023832.slice\": RecentStats: unable to find data in memory cache]" Dec 04 22:35:27.725567 master-0 kubenswrapper[33572]: I1204 22:35:27.725463 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bc64b79f9-dpgqw"] Dec 04 22:35:27.725999 master-0 kubenswrapper[33572]: E1204 22:35:27.725968 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2609a439-7ac0-4253-a74e-e4c90a023832" containerName="neutron-db-sync" Dec 04 22:35:27.725999 master-0 kubenswrapper[33572]: I1204 22:35:27.725987 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2609a439-7ac0-4253-a74e-e4c90a023832" containerName="neutron-db-sync" Dec 04 22:35:27.726304 master-0 kubenswrapper[33572]: I1204 22:35:27.726249 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2609a439-7ac0-4253-a74e-e4c90a023832" containerName="neutron-db-sync" Dec 04 22:35:27.731079 master-0 kubenswrapper[33572]: I1204 22:35:27.730467 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.773835 master-0 kubenswrapper[33572]: I1204 22:35:27.767091 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bc64b79f9-dpgqw"] Dec 04 22:35:27.802838 master-0 kubenswrapper[33572]: I1204 22:35:27.802770 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp48z\" (UniqueName: \"kubernetes.io/projected/2fb350b1-0cc9-4d29-8333-c67277e45842-kube-api-access-hp48z\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.803050 master-0 kubenswrapper[33572]: I1204 22:35:27.802842 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-nb\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.803050 master-0 kubenswrapper[33572]: I1204 22:35:27.802891 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-config\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.803050 master-0 kubenswrapper[33572]: I1204 22:35:27.802912 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-svc\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.803050 master-0 kubenswrapper[33572]: I1204 22:35:27.803009 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-swift-storage-0\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.803186 master-0 kubenswrapper[33572]: I1204 22:35:27.803053 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-sb\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.864709 master-0 kubenswrapper[33572]: I1204 22:35:27.864651 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8449cb68d4-sd8ww"] Dec 04 22:35:27.866824 master-0 kubenswrapper[33572]: I1204 22:35:27.866781 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:27.873784 master-0 kubenswrapper[33572]: I1204 22:35:27.870847 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 04 22:35:27.873784 master-0 kubenswrapper[33572]: I1204 22:35:27.870894 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 22:35:27.873784 master-0 kubenswrapper[33572]: I1204 22:35:27.870847 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 22:35:27.881646 master-0 kubenswrapper[33572]: I1204 22:35:27.880510 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8449cb68d4-sd8ww"] Dec 04 22:35:27.904980 master-0 kubenswrapper[33572]: I1204 22:35:27.904922 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzzb5\" (UniqueName: \"kubernetes.io/projected/b143d9c0-797c-4160-ac50-6264b20e4e34-kube-api-access-dzzb5\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:27.905189 master-0 kubenswrapper[33572]: I1204 22:35:27.905022 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-combined-ca-bundle\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:27.905189 master-0 kubenswrapper[33572]: I1204 22:35:27.905061 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-swift-storage-0\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.905189 master-0 kubenswrapper[33572]: I1204 22:35:27.905116 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-sb\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.905289 master-0 kubenswrapper[33572]: I1204 22:35:27.905200 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp48z\" (UniqueName: \"kubernetes.io/projected/2fb350b1-0cc9-4d29-8333-c67277e45842-kube-api-access-hp48z\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.905340 master-0 kubenswrapper[33572]: I1204 22:35:27.905319 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-nb\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.905375 master-0 kubenswrapper[33572]: I1204 22:35:27.905353 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-config\") pod 
\"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:27.905407 master-0 kubenswrapper[33572]: I1204 22:35:27.905386 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-httpd-config\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:27.905446 master-0 kubenswrapper[33572]: I1204 22:35:27.905412 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-ovndb-tls-certs\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:27.905580 master-0 kubenswrapper[33572]: I1204 22:35:27.905476 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-config\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.905682 master-0 kubenswrapper[33572]: I1204 22:35:27.905544 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-svc\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.906298 master-0 kubenswrapper[33572]: I1204 22:35:27.906264 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-swift-storage-0\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.906564 master-0 kubenswrapper[33572]: I1204 22:35:27.906521 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-svc\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.906564 master-0 kubenswrapper[33572]: I1204 22:35:27.906490 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-nb\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.906634 master-0 kubenswrapper[33572]: I1204 22:35:27.906575 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-sb\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.907198 master-0 kubenswrapper[33572]: I1204 22:35:27.907167 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-config\") pod 
\"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:27.928522 master-0 kubenswrapper[33572]: I1204 22:35:27.926264 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp48z\" (UniqueName: \"kubernetes.io/projected/2fb350b1-0cc9-4d29-8333-c67277e45842-kube-api-access-hp48z\") pod \"dnsmasq-dns-5bc64b79f9-dpgqw\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:28.007569 master-0 kubenswrapper[33572]: I1204 22:35:28.007495 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzzb5\" (UniqueName: \"kubernetes.io/projected/b143d9c0-797c-4160-ac50-6264b20e4e34-kube-api-access-dzzb5\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.007806 master-0 kubenswrapper[33572]: I1204 22:35:28.007611 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-combined-ca-bundle\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.007806 master-0 kubenswrapper[33572]: I1204 22:35:28.007762 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-config\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.007909 master-0 kubenswrapper[33572]: I1204 22:35:28.007804 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-httpd-config\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.007909 master-0 kubenswrapper[33572]: I1204 22:35:28.007833 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-ovndb-tls-certs\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.013201 master-0 kubenswrapper[33572]: I1204 22:35:28.013153 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-ovndb-tls-certs\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.013365 master-0 kubenswrapper[33572]: I1204 22:35:28.013331 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-httpd-config\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.013439 master-0 kubenswrapper[33572]: I1204 22:35:28.013414 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-config\") pod \"neutron-8449cb68d4-sd8ww\" (UID: 
\"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.021525 master-0 kubenswrapper[33572]: I1204 22:35:28.018779 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-combined-ca-bundle\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.027038 master-0 kubenswrapper[33572]: I1204 22:35:28.026602 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzzb5\" (UniqueName: \"kubernetes.io/projected/b143d9c0-797c-4160-ac50-6264b20e4e34-kube-api-access-dzzb5\") pod \"neutron-8449cb68d4-sd8ww\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.147863 master-0 kubenswrapper[33572]: I1204 22:35:28.147805 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:28.191416 master-0 kubenswrapper[33572]: I1204 22:35:28.191328 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:28.509889 master-0 kubenswrapper[33572]: I1204 22:35:28.509828 33572 generic.go:334] "Generic (PLEG): container finished" podID="c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" containerID="4027c31e26a3a9c5fe6bcba8d9a071357e11963e4c240501d1c05241a7ed24bd" exitCode=0 Dec 04 22:35:28.510779 master-0 kubenswrapper[33572]: I1204 22:35:28.510745 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-db-sync-d9l4w" event={"ID":"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2","Type":"ContainerDied","Data":"4027c31e26a3a9c5fe6bcba8d9a071357e11963e4c240501d1c05241a7ed24bd"} Dec 04 22:35:28.709585 master-0 kubenswrapper[33572]: W1204 22:35:28.709395 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fb350b1_0cc9_4d29_8333_c67277e45842.slice/crio-7b87fba2f14ad1983a2551e02aaf5bc56ce7c19977fe7ee83abe657153a3ba33 WatchSource:0}: Error finding container 7b87fba2f14ad1983a2551e02aaf5bc56ce7c19977fe7ee83abe657153a3ba33: Status 404 returned error can't find the container with id 7b87fba2f14ad1983a2551e02aaf5bc56ce7c19977fe7ee83abe657153a3ba33 Dec 04 22:35:28.713761 master-0 kubenswrapper[33572]: I1204 22:35:28.713706 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bc64b79f9-dpgqw"] Dec 04 22:35:29.296363 master-0 kubenswrapper[33572]: I1204 22:35:29.296285 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8449cb68d4-sd8ww"] Dec 04 22:35:29.524181 master-0 kubenswrapper[33572]: I1204 22:35:29.524126 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8449cb68d4-sd8ww" event={"ID":"b143d9c0-797c-4160-ac50-6264b20e4e34","Type":"ContainerStarted","Data":"cbcf2a201a61d69f3a6b4f0772dc469859f258b237f9432745bc778d7749e3bc"} Dec 04 22:35:29.530872 master-0 kubenswrapper[33572]: I1204 22:35:29.527313 33572 generic.go:334] "Generic (PLEG): container finished" podID="2fb350b1-0cc9-4d29-8333-c67277e45842" containerID="ebe3f6eb3ed8b8d7987c86b72a47808848c24c414536426ce2c00c9dcb772c47" exitCode=0 Dec 04 22:35:29.530872 master-0 kubenswrapper[33572]: I1204 22:35:29.527354 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" 
event={"ID":"2fb350b1-0cc9-4d29-8333-c67277e45842","Type":"ContainerDied","Data":"ebe3f6eb3ed8b8d7987c86b72a47808848c24c414536426ce2c00c9dcb772c47"} Dec 04 22:35:29.530872 master-0 kubenswrapper[33572]: I1204 22:35:29.527397 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" event={"ID":"2fb350b1-0cc9-4d29-8333-c67277e45842","Type":"ContainerStarted","Data":"7b87fba2f14ad1983a2551e02aaf5bc56ce7c19977fe7ee83abe657153a3ba33"} Dec 04 22:35:30.099620 master-0 kubenswrapper[33572]: I1204 22:35:30.099519 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:35:30.285570 master-0 kubenswrapper[33572]: I1204 22:35:30.285518 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-etc-machine-id\") pod \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " Dec 04 22:35:30.285872 master-0 kubenswrapper[33572]: I1204 22:35:30.285855 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-config-data\") pod \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " Dec 04 22:35:30.285977 master-0 kubenswrapper[33572]: I1204 22:35:30.285962 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwtc7\" (UniqueName: \"kubernetes.io/projected/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-kube-api-access-vwtc7\") pod \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " Dec 04 22:35:30.286072 master-0 kubenswrapper[33572]: I1204 22:35:30.285631 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" (UID: "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:30.286240 master-0 kubenswrapper[33572]: I1204 22:35:30.286218 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-db-sync-config-data\") pod \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " Dec 04 22:35:30.286414 master-0 kubenswrapper[33572]: I1204 22:35:30.286401 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-combined-ca-bundle\") pod \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " Dec 04 22:35:30.286577 master-0 kubenswrapper[33572]: I1204 22:35:30.286563 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-scripts\") pod \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\" (UID: \"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2\") " Dec 04 22:35:30.287597 master-0 kubenswrapper[33572]: I1204 22:35:30.287579 33572 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:30.289037 master-0 kubenswrapper[33572]: I1204 22:35:30.288989 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-kube-api-access-vwtc7" (OuterVolumeSpecName: "kube-api-access-vwtc7") pod "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" (UID: "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2"). InnerVolumeSpecName "kube-api-access-vwtc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:30.289623 master-0 kubenswrapper[33572]: I1204 22:35:30.289573 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" (UID: "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:30.293059 master-0 kubenswrapper[33572]: I1204 22:35:30.293015 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-scripts" (OuterVolumeSpecName: "scripts") pod "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" (UID: "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:30.335769 master-0 kubenswrapper[33572]: I1204 22:35:30.335706 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" (UID: "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:30.359640 master-0 kubenswrapper[33572]: I1204 22:35:30.358933 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-config-data" (OuterVolumeSpecName: "config-data") pod "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" (UID: "c952fb4f-1a86-4eab-9c4e-4b046b8d81b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:30.390809 master-0 kubenswrapper[33572]: I1204 22:35:30.390726 33572 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:30.390809 master-0 kubenswrapper[33572]: I1204 22:35:30.390776 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:30.390809 master-0 kubenswrapper[33572]: I1204 22:35:30.390790 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:30.390809 master-0 kubenswrapper[33572]: I1204 22:35:30.390803 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:30.390809 master-0 kubenswrapper[33572]: I1204 22:35:30.390817 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwtc7\" (UniqueName: \"kubernetes.io/projected/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2-kube-api-access-vwtc7\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:30.543514 master-0 kubenswrapper[33572]: I1204 22:35:30.543020 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" event={"ID":"2fb350b1-0cc9-4d29-8333-c67277e45842","Type":"ContainerStarted","Data":"1a7bda0910d2a9338d07e4e489bd27514abea50ad68dffe3333b65f4e879c4f0"} Dec 04 22:35:30.543514 master-0 kubenswrapper[33572]: I1204 22:35:30.543193 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:30.547344 master-0 kubenswrapper[33572]: I1204 22:35:30.547284 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-db-sync-d9l4w" event={"ID":"c952fb4f-1a86-4eab-9c4e-4b046b8d81b2","Type":"ContainerDied","Data":"fb74acd76d089e5817e410933fb939ec607f974ab84b468ed51676cbb1458098"} Dec 04 22:35:30.547471 master-0 kubenswrapper[33572]: I1204 22:35:30.547358 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb74acd76d089e5817e410933fb939ec607f974ab84b468ed51676cbb1458098" Dec 04 22:35:30.547471 master-0 kubenswrapper[33572]: I1204 22:35:30.547299 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-db-sync-d9l4w" Dec 04 22:35:30.549956 master-0 kubenswrapper[33572]: I1204 22:35:30.549228 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8449cb68d4-sd8ww" event={"ID":"b143d9c0-797c-4160-ac50-6264b20e4e34","Type":"ContainerStarted","Data":"fb8a84152e354a7086630d5fdd8b6221056abd41809a71bb9eeeab0f0e215536"} Dec 04 22:35:30.549956 master-0 kubenswrapper[33572]: I1204 22:35:30.549262 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8449cb68d4-sd8ww" event={"ID":"b143d9c0-797c-4160-ac50-6264b20e4e34","Type":"ContainerStarted","Data":"19ae8bad0cc98de0a31ce98e9a98f9b0f3fad2fcfe775e4a7419a369980623df"} Dec 04 22:35:30.549956 master-0 kubenswrapper[33572]: I1204 22:35:30.549910 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:30.572561 master-0 kubenswrapper[33572]: I1204 22:35:30.572457 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" podStartSLOduration=3.572442539 podStartE2EDuration="3.572442539s" podCreationTimestamp="2025-12-04 22:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:30.571563695 +0000 UTC m=+994.299089374" watchObservedRunningTime="2025-12-04 22:35:30.572442539 +0000 UTC m=+994.299968188" Dec 04 22:35:30.608190 master-0 kubenswrapper[33572]: I1204 22:35:30.607448 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8449cb68d4-sd8ww" podStartSLOduration=3.60742674 podStartE2EDuration="3.60742674s" podCreationTimestamp="2025-12-04 22:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:30.593873064 +0000 UTC m=+994.321398753" watchObservedRunningTime="2025-12-04 22:35:30.60742674 +0000 UTC m=+994.334952399" Dec 04 22:35:30.988177 master-0 kubenswrapper[33572]: I1204 22:35:30.987697 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:30.988314 master-0 kubenswrapper[33572]: E1204 22:35:30.988177 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" containerName="cinder-7675d-db-sync" Dec 04 22:35:30.988314 master-0 kubenswrapper[33572]: I1204 22:35:30.988193 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" containerName="cinder-7675d-db-sync" Dec 04 22:35:31.024218 master-0 kubenswrapper[33572]: I1204 22:35:31.024170 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" containerName="cinder-7675d-db-sync" Dec 04 22:35:31.025446 master-0 kubenswrapper[33572]: I1204 22:35:31.025429 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.032547 master-0 kubenswrapper[33572]: I1204 22:35:31.029319 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-scripts" Dec 04 22:35:31.032547 master-0 kubenswrapper[33572]: I1204 22:35:31.029625 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-config-data" Dec 04 22:35:31.032547 master-0 kubenswrapper[33572]: I1204 22:35:31.031141 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-scheduler-config-data" Dec 04 22:35:31.044366 master-0 kubenswrapper[33572]: I1204 22:35:31.042239 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:31.044808 master-0 kubenswrapper[33572]: I1204 22:35:31.044776 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.049899 master-0 kubenswrapper[33572]: I1204 22:35:31.049833 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-volume-lvm-iscsi-config-data" Dec 04 22:35:31.067640 master-0 kubenswrapper[33572]: I1204 22:35:31.063615 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.116829 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.116880 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kc7r\" (UniqueName: \"kubernetes.io/projected/6dfe0d20-6730-407c-b131-00b753ab28a3-kube-api-access-7kc7r\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.116903 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-combined-ca-bundle\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.116919 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-lib-modules\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.116936 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-dev\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.116968 33572 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-combined-ca-bundle\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.116989 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data-custom\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117021 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117050 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-run\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117082 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-brick\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117107 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-lib-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117136 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-nvme\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117156 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-scripts\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117182 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data-custom\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117226 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-machine-id\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117266 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117285 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-scripts\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117307 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwlr4\" (UniqueName: \"kubernetes.io/projected/24240a78-0969-401e-b1e8-0b3f01a4a434-kube-api-access-fwlr4\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117325 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-sys\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117344 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dfe0d20-6730-407c-b131-00b753ab28a3-etc-machine-id\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.121901 master-0 kubenswrapper[33572]: I1204 22:35:31.117359 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-iscsi\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.147571 master-0 kubenswrapper[33572]: I1204 22:35:31.146554 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229074 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-machine-id\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229134 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229153 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-scripts\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229172 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwlr4\" (UniqueName: \"kubernetes.io/projected/24240a78-0969-401e-b1e8-0b3f01a4a434-kube-api-access-fwlr4\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229193 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-sys\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229216 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dfe0d20-6730-407c-b131-00b753ab28a3-etc-machine-id\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229231 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-iscsi\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229261 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229276 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kc7r\" (UniqueName: \"kubernetes.io/projected/6dfe0d20-6730-407c-b131-00b753ab28a3-kube-api-access-7kc7r\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229295 33572 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-combined-ca-bundle\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229310 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-lib-modules\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229327 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-dev\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229355 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-combined-ca-bundle\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229377 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data-custom\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229406 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229436 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-run\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229466 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-brick\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229492 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-lib-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 
22:35:31.229532 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-nvme\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229551 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-scripts\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.230583 master-0 kubenswrapper[33572]: I1204 22:35:31.229577 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data-custom\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.231350 master-0 kubenswrapper[33572]: I1204 22:35:31.230998 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-nvme\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.243974 master-0 kubenswrapper[33572]: I1204 22:35:31.233036 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data-custom\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.243974 master-0 kubenswrapper[33572]: I1204 22:35:31.233097 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dfe0d20-6730-407c-b131-00b753ab28a3-etc-machine-id\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.243974 master-0 kubenswrapper[33572]: I1204 22:35:31.233123 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-iscsi\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.243974 master-0 kubenswrapper[33572]: I1204 22:35:31.233274 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.243974 master-0 kubenswrapper[33572]: I1204 22:35:31.237435 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data-custom\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.243974 master-0 kubenswrapper[33572]: 
I1204 22:35:31.238909 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-sys\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.243974 master-0 kubenswrapper[33572]: I1204 22:35:31.242770 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-scripts\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.246654 master-0 kubenswrapper[33572]: I1204 22:35:31.246613 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-lib-modules\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.246762 master-0 kubenswrapper[33572]: I1204 22:35:31.246667 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-dev\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.247114 master-0 kubenswrapper[33572]: I1204 22:35:31.247017 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-lib-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.247114 master-0 kubenswrapper[33572]: I1204 22:35:31.247050 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-run\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.248477 master-0 kubenswrapper[33572]: I1204 22:35:31.248432 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-brick\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.248579 master-0 kubenswrapper[33572]: I1204 22:35:31.248542 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-machine-id\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.254688 master-0 kubenswrapper[33572]: I1204 22:35:31.251258 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-combined-ca-bundle\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.254688 master-0 kubenswrapper[33572]: I1204 22:35:31.251450 33572 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-scripts\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.254688 master-0 kubenswrapper[33572]: I1204 22:35:31.252068 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-659f96b785-cjs9k"] Dec 04 22:35:31.258843 master-0 kubenswrapper[33572]: I1204 22:35:31.258645 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-combined-ca-bundle\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.259850 master-0 kubenswrapper[33572]: I1204 22:35:31.259822 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.261547 master-0 kubenswrapper[33572]: I1204 22:35:31.261277 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.265095 master-0 kubenswrapper[33572]: I1204 22:35:31.265052 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 04 22:35:31.265432 master-0 kubenswrapper[33572]: I1204 22:35:31.265404 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 04 22:35:31.266066 master-0 kubenswrapper[33572]: I1204 22:35:31.266031 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-659f96b785-cjs9k"] Dec 04 22:35:31.275004 master-0 kubenswrapper[33572]: I1204 22:35:31.269054 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwlr4\" (UniqueName: \"kubernetes.io/projected/24240a78-0969-401e-b1e8-0b3f01a4a434-kube-api-access-fwlr4\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.279895 master-0 kubenswrapper[33572]: I1204 22:35:31.279043 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:31.283102 master-0 kubenswrapper[33572]: I1204 22:35:31.281339 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.295330 master-0 kubenswrapper[33572]: I1204 22:35:31.286903 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kc7r\" (UniqueName: \"kubernetes.io/projected/6dfe0d20-6730-407c-b131-00b753ab28a3-kube-api-access-7kc7r\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.295330 master-0 kubenswrapper[33572]: I1204 22:35:31.287008 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data\") pod \"cinder-7675d-scheduler-0\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.295330 master-0 kubenswrapper[33572]: I1204 22:35:31.287564 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-backup-config-data" Dec 04 22:35:31.310054 master-0 kubenswrapper[33572]: I1204 22:35:31.309961 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bc64b79f9-dpgqw"] Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.331584 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-lib-modules\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.331642 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-sys\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.331695 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-dev\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.331712 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.331732 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-public-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.331956 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data-custom\") pod \"cinder-7675d-backup-0\" (UID: 
\"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.332049 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.332445 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-config\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.333478 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-brick\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.333557 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-httpd-config\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.333583 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxkm7\" (UniqueName: \"kubernetes.io/projected/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-kube-api-access-xxkm7\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.333630 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-iscsi\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.333679 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-scripts\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.333713 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-combined-ca-bundle\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.333764 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdlnp\" (UniqueName: \"kubernetes.io/projected/df4511d6-1b75-449e-b5e2-754c93cca72c-kube-api-access-xdlnp\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 
kubenswrapper[33572]: I1204 22:35:31.333802 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-ovndb-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.333827 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-combined-ca-bundle\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.337317 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.337348 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-lib-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.337376 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-nvme\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.337415 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-run\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.337515 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-internal-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.338243 master-0 kubenswrapper[33572]: I1204 22:35:31.337550 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-machine-id\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.370067 master-0 kubenswrapper[33572]: I1204 22:35:31.370016 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bf87966c-ng7wp"] Dec 04 22:35:31.378142 master-0 kubenswrapper[33572]: I1204 22:35:31.378102 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.378723 master-0 kubenswrapper[33572]: I1204 22:35:31.378699 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bf87966c-ng7wp"] Dec 04 22:35:31.395063 master-0 kubenswrapper[33572]: I1204 22:35:31.394999 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:31.403632 master-0 kubenswrapper[33572]: I1204 22:35:31.398210 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.407518 master-0 kubenswrapper[33572]: I1204 22:35:31.403999 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-api-config-data" Dec 04 22:35:31.407518 master-0 kubenswrapper[33572]: I1204 22:35:31.404035 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:31.424265 master-0 kubenswrapper[33572]: I1204 22:35:31.424112 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:31.442551 master-0 kubenswrapper[33572]: I1204 22:35:31.442350 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-config\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.442729 master-0 kubenswrapper[33572]: I1204 22:35:31.442584 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrvg9\" (UniqueName: \"kubernetes.io/projected/3ac59098-23a5-4191-acf7-07e7aad0c17c-kube-api-access-nrvg9\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.442729 master-0 kubenswrapper[33572]: I1204 22:35:31.442668 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-scripts\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.442729 master-0 kubenswrapper[33572]: I1204 22:35:31.442708 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-combined-ca-bundle\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.442958 master-0 kubenswrapper[33572]: I1204 22:35:31.442938 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdlnp\" (UniqueName: \"kubernetes.io/projected/df4511d6-1b75-449e-b5e2-754c93cca72c-kube-api-access-xdlnp\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.443017 master-0 kubenswrapper[33572]: I1204 22:35:31.442982 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-svc\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " 
pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.443055 master-0 kubenswrapper[33572]: I1204 22:35:31.443017 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-ovndb-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.443219 master-0 kubenswrapper[33572]: I1204 22:35:31.443199 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-combined-ca-bundle\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.443308 master-0 kubenswrapper[33572]: I1204 22:35:31.443292 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-lib-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.443401 master-0 kubenswrapper[33572]: I1204 22:35:31.443386 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.443553 master-0 kubenswrapper[33572]: I1204 22:35:31.443415 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-scripts\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.443662 master-0 kubenswrapper[33572]: I1204 22:35:31.443566 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data-custom\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.443728 master-0 kubenswrapper[33572]: I1204 22:35:31.443712 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-nvme\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.443800 master-0 kubenswrapper[33572]: I1204 22:35:31.443784 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-run\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.443878 master-0 kubenswrapper[33572]: I1204 22:35:31.443864 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40b2cc3d-21a7-4959-af99-7224b50a9d94-etc-machine-id\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " 
pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.443912 master-0 kubenswrapper[33572]: I1204 22:35:31.443901 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrx8z\" (UniqueName: \"kubernetes.io/projected/40b2cc3d-21a7-4959-af99-7224b50a9d94-kube-api-access-rrx8z\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.444656 master-0 kubenswrapper[33572]: I1204 22:35:31.444610 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-internal-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.445557 master-0 kubenswrapper[33572]: I1204 22:35:31.445527 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-lib-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.445647 master-0 kubenswrapper[33572]: I1204 22:35:31.445626 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-nvme\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.445786 master-0 kubenswrapper[33572]: I1204 22:35:31.445765 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-machine-id\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.445899 master-0 kubenswrapper[33572]: I1204 22:35:31.445878 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-machine-id\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.446000 master-0 kubenswrapper[33572]: I1204 22:35:31.445979 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-lib-modules\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.446040 master-0 kubenswrapper[33572]: I1204 22:35:31.446010 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-sys\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.446082 master-0 kubenswrapper[33572]: I1204 22:35:31.446064 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-dev\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.446121 master-0 kubenswrapper[33572]: I1204 
22:35:31.446093 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.446275 master-0 kubenswrapper[33572]: I1204 22:35:31.446256 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-combined-ca-bundle\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.446431 master-0 kubenswrapper[33572]: I1204 22:35:31.446286 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-public-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.446824 master-0 kubenswrapper[33572]: I1204 22:35:31.446801 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data-custom\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.447745 master-0 kubenswrapper[33572]: I1204 22:35:31.447650 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b2cc3d-21a7-4959-af99-7224b50a9d94-logs\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.448137 master-0 kubenswrapper[33572]: I1204 22:35:31.448112 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-ovndb-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.449236 master-0 kubenswrapper[33572]: I1204 22:35:31.448618 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.449236 master-0 kubenswrapper[33572]: I1204 22:35:31.448641 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-run\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.449236 master-0 kubenswrapper[33572]: I1204 22:35:31.448804 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-lib-modules\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.449236 master-0 kubenswrapper[33572]: I1204 22:35:31.448851 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-dev\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.453673 master-0 kubenswrapper[33572]: I1204 22:35:31.450252 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-sys\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.456929 master-0 kubenswrapper[33572]: I1204 22:35:31.447707 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-config\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.456929 master-0 kubenswrapper[33572]: I1204 22:35:31.456691 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.456929 master-0 kubenswrapper[33572]: I1204 22:35:31.456732 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-brick\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.456929 master-0 kubenswrapper[33572]: I1204 22:35:31.456797 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-httpd-config\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.456929 master-0 kubenswrapper[33572]: I1204 22:35:31.456831 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxkm7\" (UniqueName: \"kubernetes.io/projected/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-kube-api-access-xxkm7\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.456929 master-0 kubenswrapper[33572]: I1204 22:35:31.456860 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-nb\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.456929 master-0 kubenswrapper[33572]: I1204 22:35:31.456891 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-swift-storage-0\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.456929 master-0 kubenswrapper[33572]: I1204 22:35:31.456918 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-sb\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.457218 master-0 kubenswrapper[33572]: I1204 22:35:31.456960 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-iscsi\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.459200 master-0 kubenswrapper[33572]: I1204 22:35:31.459175 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-scripts\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.459369 master-0 kubenswrapper[33572]: I1204 22:35:31.459335 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-config\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.459837 master-0 kubenswrapper[33572]: I1204 22:35:31.459795 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-internal-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.460728 master-0 kubenswrapper[33572]: I1204 22:35:31.460698 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-combined-ca-bundle\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.460873 master-0 kubenswrapper[33572]: I1204 22:35:31.460833 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-public-tls-certs\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.461291 master-0 kubenswrapper[33572]: I1204 22:35:31.461269 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-combined-ca-bundle\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.461347 master-0 kubenswrapper[33572]: I1204 22:35:31.461332 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-brick\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.461683 master-0 kubenswrapper[33572]: I1204 22:35:31.461653 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-iscsi\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.463577 master-0 kubenswrapper[33572]: I1204 22:35:31.463164 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.463577 master-0 kubenswrapper[33572]: I1204 22:35:31.463357 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:31.467531 master-0 kubenswrapper[33572]: I1204 22:35:31.466786 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data-custom\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.471516 master-0 kubenswrapper[33572]: I1204 22:35:31.471465 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-httpd-config\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.475756 master-0 kubenswrapper[33572]: I1204 22:35:31.475720 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdlnp\" (UniqueName: \"kubernetes.io/projected/df4511d6-1b75-449e-b5e2-754c93cca72c-kube-api-access-xdlnp\") pod \"cinder-7675d-backup-0\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.479534 master-0 kubenswrapper[33572]: I1204 22:35:31.478800 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxkm7\" (UniqueName: \"kubernetes.io/projected/a92b60af-9aa0-459d-b519-1e8bdcd9a0da-kube-api-access-xxkm7\") pod \"neutron-659f96b785-cjs9k\" (UID: \"a92b60af-9aa0-459d-b519-1e8bdcd9a0da\") " pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.563784 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.563863 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-nb\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.563887 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-swift-storage-0\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.564264 master-0 
kubenswrapper[33572]: I1204 22:35:31.563904 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-sb\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.563931 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-config\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.563949 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrvg9\" (UniqueName: \"kubernetes.io/projected/3ac59098-23a5-4191-acf7-07e7aad0c17c-kube-api-access-nrvg9\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.563993 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-svc\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.564043 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-scripts\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.564064 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data-custom\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.564108 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40b2cc3d-21a7-4959-af99-7224b50a9d94-etc-machine-id\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.564128 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrx8z\" (UniqueName: \"kubernetes.io/projected/40b2cc3d-21a7-4959-af99-7224b50a9d94-kube-api-access-rrx8z\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 22:35:31.564212 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-combined-ca-bundle\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.564264 master-0 kubenswrapper[33572]: I1204 
22:35:31.564269 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b2cc3d-21a7-4959-af99-7224b50a9d94-logs\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.565071 master-0 kubenswrapper[33572]: I1204 22:35:31.564834 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b2cc3d-21a7-4959-af99-7224b50a9d94-logs\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.568268 master-0 kubenswrapper[33572]: I1204 22:35:31.568182 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-config\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.568361 master-0 kubenswrapper[33572]: I1204 22:35:31.568322 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40b2cc3d-21a7-4959-af99-7224b50a9d94-etc-machine-id\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.569148 master-0 kubenswrapper[33572]: I1204 22:35:31.569109 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-nb\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.569897 master-0 kubenswrapper[33572]: I1204 22:35:31.569870 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.571259 master-0 kubenswrapper[33572]: I1204 22:35:31.571223 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-swift-storage-0\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.571760 master-0 kubenswrapper[33572]: I1204 22:35:31.571734 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-sb\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.573878 master-0 kubenswrapper[33572]: I1204 22:35:31.573756 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-svc\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.584473 master-0 kubenswrapper[33572]: I1204 22:35:31.584405 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-combined-ca-bundle\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.585113 master-0 kubenswrapper[33572]: I1204 22:35:31.585078 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-scripts\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.585223 master-0 kubenswrapper[33572]: I1204 22:35:31.585195 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data-custom\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.591039 master-0 kubenswrapper[33572]: I1204 22:35:31.587077 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrvg9\" (UniqueName: \"kubernetes.io/projected/3ac59098-23a5-4191-acf7-07e7aad0c17c-kube-api-access-nrvg9\") pod \"dnsmasq-dns-75bf87966c-ng7wp\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.591039 master-0 kubenswrapper[33572]: I1204 22:35:31.589517 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrx8z\" (UniqueName: \"kubernetes.io/projected/40b2cc3d-21a7-4959-af99-7224b50a9d94-kube-api-access-rrx8z\") pod \"cinder-7675d-api-0\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:31.657386 master-0 kubenswrapper[33572]: I1204 22:35:31.657332 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:31.668894 master-0 kubenswrapper[33572]: I1204 22:35:31.666301 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:31.712408 master-0 kubenswrapper[33572]: I1204 22:35:31.712310 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:31.734123 master-0 kubenswrapper[33572]: I1204 22:35:31.734069 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:32.594006 master-0 kubenswrapper[33572]: I1204 22:35:32.593916 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" podUID="2fb350b1-0cc9-4d29-8333-c67277e45842" containerName="dnsmasq-dns" containerID="cri-o://1a7bda0910d2a9338d07e4e489bd27514abea50ad68dffe3333b65f4e879c4f0" gracePeriod=10 Dec 04 22:35:32.749745 master-0 kubenswrapper[33572]: I1204 22:35:32.749637 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:32.768796 master-0 kubenswrapper[33572]: I1204 22:35:32.768251 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:32.772611 master-0 kubenswrapper[33572]: W1204 22:35:32.772559 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24240a78_0969_401e_b1e8_0b3f01a4a434.slice/crio-af63b680cdae9c435bd3cd2385d04a8ede90280990befa69a548f5e84b2bdc41 WatchSource:0}: Error finding container af63b680cdae9c435bd3cd2385d04a8ede90280990befa69a548f5e84b2bdc41: Status 404 returned error can't find the container with id af63b680cdae9c435bd3cd2385d04a8ede90280990befa69a548f5e84b2bdc41 Dec 04 22:35:32.779948 master-0 kubenswrapper[33572]: I1204 22:35:32.779924 33572 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 22:35:32.952662 master-0 kubenswrapper[33572]: I1204 22:35:32.952580 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bf87966c-ng7wp"] Dec 04 22:35:32.953290 master-0 kubenswrapper[33572]: W1204 22:35:32.953227 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40b2cc3d_21a7_4959_af99_7224b50a9d94.slice/crio-756a1cb7f0d9273db96eb9225502913cb152b26c07abba87f830224e53574a0d WatchSource:0}: Error finding container 756a1cb7f0d9273db96eb9225502913cb152b26c07abba87f830224e53574a0d: Status 404 returned error can't find the container with id 756a1cb7f0d9273db96eb9225502913cb152b26c07abba87f830224e53574a0d Dec 04 22:35:32.955600 master-0 kubenswrapper[33572]: W1204 22:35:32.955553 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ac59098_23a5_4191_acf7_07e7aad0c17c.slice/crio-d7f27dc7027ea547ce1e4408cac1df385c8c377e16be84a718ac9ce13a6b8c2b WatchSource:0}: Error finding container d7f27dc7027ea547ce1e4408cac1df385c8c377e16be84a718ac9ce13a6b8c2b: Status 404 returned error can't find the container with id d7f27dc7027ea547ce1e4408cac1df385c8c377e16be84a718ac9ce13a6b8c2b Dec 04 22:35:32.960838 master-0 kubenswrapper[33572]: I1204 22:35:32.960806 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:33.416036 master-0 kubenswrapper[33572]: I1204 22:35:33.415979 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-659f96b785-cjs9k"] Dec 04 22:35:33.439060 master-0 kubenswrapper[33572]: W1204 22:35:33.438984 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda92b60af_9aa0_459d_b519_1e8bdcd9a0da.slice/crio-e097f4fbaad7fe12e81852d5127346137e48cd27bf89934eab3d423009150c87 WatchSource:0}: Error finding container 
e097f4fbaad7fe12e81852d5127346137e48cd27bf89934eab3d423009150c87: Status 404 returned error can't find the container with id e097f4fbaad7fe12e81852d5127346137e48cd27bf89934eab3d423009150c87 Dec 04 22:35:33.619529 master-0 kubenswrapper[33572]: I1204 22:35:33.619388 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659f96b785-cjs9k" event={"ID":"a92b60af-9aa0-459d-b519-1e8bdcd9a0da","Type":"ContainerStarted","Data":"e097f4fbaad7fe12e81852d5127346137e48cd27bf89934eab3d423009150c87"} Dec 04 22:35:33.627982 master-0 kubenswrapper[33572]: I1204 22:35:33.623353 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"6dfe0d20-6730-407c-b131-00b753ab28a3","Type":"ContainerStarted","Data":"e843d8746f35ed9c0f9aea42bede543cfb6e8c1092782d0b4c8c077f795a3135"} Dec 04 22:35:33.627982 master-0 kubenswrapper[33572]: I1204 22:35:33.625516 33572 generic.go:334] "Generic (PLEG): container finished" podID="2fb350b1-0cc9-4d29-8333-c67277e45842" containerID="1a7bda0910d2a9338d07e4e489bd27514abea50ad68dffe3333b65f4e879c4f0" exitCode=0 Dec 04 22:35:33.627982 master-0 kubenswrapper[33572]: I1204 22:35:33.625576 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" event={"ID":"2fb350b1-0cc9-4d29-8333-c67277e45842","Type":"ContainerDied","Data":"1a7bda0910d2a9338d07e4e489bd27514abea50ad68dffe3333b65f4e879c4f0"} Dec 04 22:35:33.628862 master-0 kubenswrapper[33572]: I1204 22:35:33.628325 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"24240a78-0969-401e-b1e8-0b3f01a4a434","Type":"ContainerStarted","Data":"af63b680cdae9c435bd3cd2385d04a8ede90280990befa69a548f5e84b2bdc41"} Dec 04 22:35:33.637547 master-0 kubenswrapper[33572]: I1204 22:35:33.636357 33572 generic.go:334] "Generic (PLEG): container finished" podID="3ac59098-23a5-4191-acf7-07e7aad0c17c" containerID="602d579d7d64da1adab8b0266b0ff26f3b6ad1eebe55a8fa7364325bbe0f35f2" exitCode=0 Dec 04 22:35:33.637547 master-0 kubenswrapper[33572]: I1204 22:35:33.636440 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" event={"ID":"3ac59098-23a5-4191-acf7-07e7aad0c17c","Type":"ContainerDied","Data":"602d579d7d64da1adab8b0266b0ff26f3b6ad1eebe55a8fa7364325bbe0f35f2"} Dec 04 22:35:33.637547 master-0 kubenswrapper[33572]: I1204 22:35:33.636483 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" event={"ID":"3ac59098-23a5-4191-acf7-07e7aad0c17c","Type":"ContainerStarted","Data":"d7f27dc7027ea547ce1e4408cac1df385c8c377e16be84a718ac9ce13a6b8c2b"} Dec 04 22:35:33.640163 master-0 kubenswrapper[33572]: I1204 22:35:33.640072 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"40b2cc3d-21a7-4959-af99-7224b50a9d94","Type":"ContainerStarted","Data":"756a1cb7f0d9273db96eb9225502913cb152b26c07abba87f830224e53574a0d"} Dec 04 22:35:33.859554 master-0 kubenswrapper[33572]: I1204 22:35:33.856562 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:33.888761 master-0 kubenswrapper[33572]: W1204 22:35:33.887531 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf4511d6_1b75_449e_b5e2_754c93cca72c.slice/crio-8d365fb344d9db2b6d850f6d3960f355c5b5128f5944546f2df8c539b16cf05c WatchSource:0}: Error finding container 
8d365fb344d9db2b6d850f6d3960f355c5b5128f5944546f2df8c539b16cf05c: Status 404 returned error can't find the container with id 8d365fb344d9db2b6d850f6d3960f355c5b5128f5944546f2df8c539b16cf05c Dec 04 22:35:33.912163 master-0 kubenswrapper[33572]: I1204 22:35:33.912109 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:34.042321 master-0 kubenswrapper[33572]: I1204 22:35:34.041988 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-nb\") pod \"2fb350b1-0cc9-4d29-8333-c67277e45842\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " Dec 04 22:35:34.042321 master-0 kubenswrapper[33572]: I1204 22:35:34.042055 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-config\") pod \"2fb350b1-0cc9-4d29-8333-c67277e45842\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " Dec 04 22:35:34.042321 master-0 kubenswrapper[33572]: I1204 22:35:34.042153 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-swift-storage-0\") pod \"2fb350b1-0cc9-4d29-8333-c67277e45842\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " Dec 04 22:35:34.042321 master-0 kubenswrapper[33572]: I1204 22:35:34.042195 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp48z\" (UniqueName: \"kubernetes.io/projected/2fb350b1-0cc9-4d29-8333-c67277e45842-kube-api-access-hp48z\") pod \"2fb350b1-0cc9-4d29-8333-c67277e45842\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " Dec 04 22:35:34.042966 master-0 kubenswrapper[33572]: I1204 22:35:34.042664 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-sb\") pod \"2fb350b1-0cc9-4d29-8333-c67277e45842\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " Dec 04 22:35:34.043073 master-0 kubenswrapper[33572]: I1204 22:35:34.043004 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-svc\") pod \"2fb350b1-0cc9-4d29-8333-c67277e45842\" (UID: \"2fb350b1-0cc9-4d29-8333-c67277e45842\") " Dec 04 22:35:34.046551 master-0 kubenswrapper[33572]: I1204 22:35:34.046322 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb350b1-0cc9-4d29-8333-c67277e45842-kube-api-access-hp48z" (OuterVolumeSpecName: "kube-api-access-hp48z") pod "2fb350b1-0cc9-4d29-8333-c67277e45842" (UID: "2fb350b1-0cc9-4d29-8333-c67277e45842"). InnerVolumeSpecName "kube-api-access-hp48z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:34.108784 master-0 kubenswrapper[33572]: I1204 22:35:34.108675 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fb350b1-0cc9-4d29-8333-c67277e45842" (UID: "2fb350b1-0cc9-4d29-8333-c67277e45842"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:34.126122 master-0 kubenswrapper[33572]: I1204 22:35:34.126056 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-config" (OuterVolumeSpecName: "config") pod "2fb350b1-0cc9-4d29-8333-c67277e45842" (UID: "2fb350b1-0cc9-4d29-8333-c67277e45842"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:34.145656 master-0 kubenswrapper[33572]: I1204 22:35:34.144939 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2fb350b1-0cc9-4d29-8333-c67277e45842" (UID: "2fb350b1-0cc9-4d29-8333-c67277e45842"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:34.145656 master-0 kubenswrapper[33572]: I1204 22:35:34.145028 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:34.145656 master-0 kubenswrapper[33572]: I1204 22:35:34.145046 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:34.145656 master-0 kubenswrapper[33572]: I1204 22:35:34.145058 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp48z\" (UniqueName: \"kubernetes.io/projected/2fb350b1-0cc9-4d29-8333-c67277e45842-kube-api-access-hp48z\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:34.146546 master-0 kubenswrapper[33572]: I1204 22:35:34.146473 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2fb350b1-0cc9-4d29-8333-c67277e45842" (UID: "2fb350b1-0cc9-4d29-8333-c67277e45842"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:34.155868 master-0 kubenswrapper[33572]: I1204 22:35:34.155769 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2fb350b1-0cc9-4d29-8333-c67277e45842" (UID: "2fb350b1-0cc9-4d29-8333-c67277e45842"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:34.247401 master-0 kubenswrapper[33572]: I1204 22:35:34.247320 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:34.247401 master-0 kubenswrapper[33572]: I1204 22:35:34.247361 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:34.247401 master-0 kubenswrapper[33572]: I1204 22:35:34.247371 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2fb350b1-0cc9-4d29-8333-c67277e45842-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:34.660096 master-0 kubenswrapper[33572]: I1204 22:35:34.659862 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" event={"ID":"2fb350b1-0cc9-4d29-8333-c67277e45842","Type":"ContainerDied","Data":"7b87fba2f14ad1983a2551e02aaf5bc56ce7c19977fe7ee83abe657153a3ba33"} Dec 04 22:35:34.660096 master-0 kubenswrapper[33572]: I1204 22:35:34.659903 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bc64b79f9-dpgqw" Dec 04 22:35:34.660096 master-0 kubenswrapper[33572]: I1204 22:35:34.659930 33572 scope.go:117] "RemoveContainer" containerID="1a7bda0910d2a9338d07e4e489bd27514abea50ad68dffe3333b65f4e879c4f0" Dec 04 22:35:34.665561 master-0 kubenswrapper[33572]: I1204 22:35:34.665462 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"24240a78-0969-401e-b1e8-0b3f01a4a434","Type":"ContainerStarted","Data":"f7a0b68706cdf9914a2144da72bee96e37da7f9db0690b4ae7ed704be6a990a6"} Dec 04 22:35:34.669395 master-0 kubenswrapper[33572]: I1204 22:35:34.668624 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" event={"ID":"3ac59098-23a5-4191-acf7-07e7aad0c17c","Type":"ContainerStarted","Data":"9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2"} Dec 04 22:35:34.669395 master-0 kubenswrapper[33572]: I1204 22:35:34.668814 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:34.674009 master-0 kubenswrapper[33572]: I1204 22:35:34.673951 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"40b2cc3d-21a7-4959-af99-7224b50a9d94","Type":"ContainerStarted","Data":"bbc91fb05397d144a428c1a56753d26e44ebf29c7aa72fe800ec67af9832175a"} Dec 04 22:35:34.682011 master-0 kubenswrapper[33572]: I1204 22:35:34.680411 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659f96b785-cjs9k" event={"ID":"a92b60af-9aa0-459d-b519-1e8bdcd9a0da","Type":"ContainerStarted","Data":"fe1f23baa9432c6e2cdee1b800b1656c28e2538d7a91eb9219a2d551133a5411"} Dec 04 22:35:34.682011 master-0 kubenswrapper[33572]: I1204 22:35:34.680464 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-659f96b785-cjs9k" event={"ID":"a92b60af-9aa0-459d-b519-1e8bdcd9a0da","Type":"ContainerStarted","Data":"1fdd0344d1e1f85c759bf97f8d89f92ea0dd2d2bc05cc74a67af9438b32eac5a"} Dec 04 22:35:34.683797 master-0 kubenswrapper[33572]: I1204 22:35:34.683694 33572 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:35:34.688922 master-0 kubenswrapper[33572]: I1204 22:35:34.688858 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"df4511d6-1b75-449e-b5e2-754c93cca72c","Type":"ContainerStarted","Data":"8d365fb344d9db2b6d850f6d3960f355c5b5128f5944546f2df8c539b16cf05c"} Dec 04 22:35:34.690259 master-0 kubenswrapper[33572]: I1204 22:35:34.690237 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bc64b79f9-dpgqw"] Dec 04 22:35:34.692255 master-0 kubenswrapper[33572]: I1204 22:35:34.692223 33572 generic.go:334] "Generic (PLEG): container finished" podID="320dd132-b10a-4d56-88a4-307ecb61196f" containerID="5ee780f89386259b20927daa58537ef5615e819e02fafc58790a9eea8e5f1738" exitCode=0 Dec 04 22:35:34.692311 master-0 kubenswrapper[33572]: I1204 22:35:34.692256 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j89sq" event={"ID":"320dd132-b10a-4d56-88a4-307ecb61196f","Type":"ContainerDied","Data":"5ee780f89386259b20927daa58537ef5615e819e02fafc58790a9eea8e5f1738"} Dec 04 22:35:34.702074 master-0 kubenswrapper[33572]: I1204 22:35:34.702005 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bc64b79f9-dpgqw"] Dec 04 22:35:34.724545 master-0 kubenswrapper[33572]: I1204 22:35:34.723976 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" podStartSLOduration=3.723947632 podStartE2EDuration="3.723947632s" podCreationTimestamp="2025-12-04 22:35:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:34.708080413 +0000 UTC m=+998.435606052" watchObservedRunningTime="2025-12-04 22:35:34.723947632 +0000 UTC m=+998.451473291" Dec 04 22:35:34.738923 master-0 kubenswrapper[33572]: I1204 22:35:34.736762 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-659f96b785-cjs9k" podStartSLOduration=3.736737367 podStartE2EDuration="3.736737367s" podCreationTimestamp="2025-12-04 22:35:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:34.731668207 +0000 UTC m=+998.459193856" watchObservedRunningTime="2025-12-04 22:35:34.736737367 +0000 UTC m=+998.464263016" Dec 04 22:35:34.904688 master-0 kubenswrapper[33572]: I1204 22:35:34.902356 33572 scope.go:117] "RemoveContainer" containerID="ebe3f6eb3ed8b8d7987c86b72a47808848c24c414536426ce2c00c9dcb772c47" Dec 04 22:35:35.217541 master-0 kubenswrapper[33572]: I1204 22:35:35.215594 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:35.713329 master-0 kubenswrapper[33572]: I1204 22:35:35.713268 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"24240a78-0969-401e-b1e8-0b3f01a4a434","Type":"ContainerStarted","Data":"54267db9176e2e2b9efe2e5bacef0739627419b64734e20c1716d61af6863aea"} Dec 04 22:35:35.715425 master-0 kubenswrapper[33572]: I1204 22:35:35.715378 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"40b2cc3d-21a7-4959-af99-7224b50a9d94","Type":"ContainerStarted","Data":"26486bfde804d1b1b956d33edd58535742b07ece5cbbbc5d538e1b304ab6e0e7"} Dec 04 
22:35:35.715699 master-0 kubenswrapper[33572]: I1204 22:35:35.715678 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:35.718059 master-0 kubenswrapper[33572]: I1204 22:35:35.718012 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"df4511d6-1b75-449e-b5e2-754c93cca72c","Type":"ContainerStarted","Data":"fc6ac157ac0f29d0182f071c3f4775d561f62b707d1d469af37859d59a80e715"} Dec 04 22:35:35.720147 master-0 kubenswrapper[33572]: I1204 22:35:35.720096 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"6dfe0d20-6730-407c-b131-00b753ab28a3","Type":"ContainerStarted","Data":"a6dd2f1375da904eb79da1bb286b68ab548b33022bfc148b22344405ada74821"} Dec 04 22:35:35.764383 master-0 kubenswrapper[33572]: I1204 22:35:35.754823 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" podStartSLOduration=4.5091150330000005 podStartE2EDuration="5.754802478s" podCreationTimestamp="2025-12-04 22:35:30 +0000 UTC" firstStartedPulling="2025-12-04 22:35:32.782120856 +0000 UTC m=+996.509646505" lastFinishedPulling="2025-12-04 22:35:34.027808301 +0000 UTC m=+997.755333950" observedRunningTime="2025-12-04 22:35:35.74728602 +0000 UTC m=+999.474811669" watchObservedRunningTime="2025-12-04 22:35:35.754802478 +0000 UTC m=+999.482328127" Dec 04 22:35:35.776776 master-0 kubenswrapper[33572]: I1204 22:35:35.775482 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-api-0" podStartSLOduration=4.775462362 podStartE2EDuration="4.775462362s" podCreationTimestamp="2025-12-04 22:35:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:35.775435621 +0000 UTC m=+999.502961270" watchObservedRunningTime="2025-12-04 22:35:35.775462362 +0000 UTC m=+999.502988011" Dec 04 22:35:36.277541 master-0 kubenswrapper[33572]: I1204 22:35:36.277464 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:36.414648 master-0 kubenswrapper[33572]: I1204 22:35:36.414575 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-config-data\") pod \"320dd132-b10a-4d56-88a4-307ecb61196f\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " Dec 04 22:35:36.414871 master-0 kubenswrapper[33572]: I1204 22:35:36.414728 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkptk\" (UniqueName: \"kubernetes.io/projected/320dd132-b10a-4d56-88a4-307ecb61196f-kube-api-access-rkptk\") pod \"320dd132-b10a-4d56-88a4-307ecb61196f\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " Dec 04 22:35:36.414926 master-0 kubenswrapper[33572]: I1204 22:35:36.414878 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-combined-ca-bundle\") pod \"320dd132-b10a-4d56-88a4-307ecb61196f\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " Dec 04 22:35:36.415178 master-0 kubenswrapper[33572]: I1204 22:35:36.415132 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/320dd132-b10a-4d56-88a4-307ecb61196f-etc-podinfo\") pod \"320dd132-b10a-4d56-88a4-307ecb61196f\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " Dec 04 22:35:36.415309 master-0 kubenswrapper[33572]: I1204 22:35:36.415272 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/320dd132-b10a-4d56-88a4-307ecb61196f-config-data-merged\") pod \"320dd132-b10a-4d56-88a4-307ecb61196f\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " Dec 04 22:35:36.415374 master-0 kubenswrapper[33572]: I1204 22:35:36.415310 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-scripts\") pod \"320dd132-b10a-4d56-88a4-307ecb61196f\" (UID: \"320dd132-b10a-4d56-88a4-307ecb61196f\") " Dec 04 22:35:36.417571 master-0 kubenswrapper[33572]: I1204 22:35:36.417521 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320dd132-b10a-4d56-88a4-307ecb61196f-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "320dd132-b10a-4d56-88a4-307ecb61196f" (UID: "320dd132-b10a-4d56-88a4-307ecb61196f"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:35:36.419898 master-0 kubenswrapper[33572]: I1204 22:35:36.419819 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/320dd132-b10a-4d56-88a4-307ecb61196f-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "320dd132-b10a-4d56-88a4-307ecb61196f" (UID: "320dd132-b10a-4d56-88a4-307ecb61196f"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 22:35:36.420517 master-0 kubenswrapper[33572]: I1204 22:35:36.420428 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-scripts" (OuterVolumeSpecName: "scripts") pod "320dd132-b10a-4d56-88a4-307ecb61196f" (UID: "320dd132-b10a-4d56-88a4-307ecb61196f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:36.421207 master-0 kubenswrapper[33572]: I1204 22:35:36.420906 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320dd132-b10a-4d56-88a4-307ecb61196f-kube-api-access-rkptk" (OuterVolumeSpecName: "kube-api-access-rkptk") pod "320dd132-b10a-4d56-88a4-307ecb61196f" (UID: "320dd132-b10a-4d56-88a4-307ecb61196f"). InnerVolumeSpecName "kube-api-access-rkptk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:36.458500 master-0 kubenswrapper[33572]: I1204 22:35:36.457593 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-config-data" (OuterVolumeSpecName: "config-data") pod "320dd132-b10a-4d56-88a4-307ecb61196f" (UID: "320dd132-b10a-4d56-88a4-307ecb61196f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:36.470069 master-0 kubenswrapper[33572]: I1204 22:35:36.467718 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:36.484109 master-0 kubenswrapper[33572]: I1204 22:35:36.483605 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "320dd132-b10a-4d56-88a4-307ecb61196f" (UID: "320dd132-b10a-4d56-88a4-307ecb61196f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:36.518441 master-0 kubenswrapper[33572]: I1204 22:35:36.518116 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:36.518441 master-0 kubenswrapper[33572]: I1204 22:35:36.518162 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkptk\" (UniqueName: \"kubernetes.io/projected/320dd132-b10a-4d56-88a4-307ecb61196f-kube-api-access-rkptk\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:36.518441 master-0 kubenswrapper[33572]: I1204 22:35:36.518178 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:36.518441 master-0 kubenswrapper[33572]: I1204 22:35:36.518194 33572 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/320dd132-b10a-4d56-88a4-307ecb61196f-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:36.518441 master-0 kubenswrapper[33572]: I1204 22:35:36.518206 33572 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/320dd132-b10a-4d56-88a4-307ecb61196f-config-data-merged\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:36.518441 master-0 kubenswrapper[33572]: I1204 22:35:36.518218 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320dd132-b10a-4d56-88a4-307ecb61196f-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:36.556337 master-0 kubenswrapper[33572]: I1204 22:35:36.555737 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2fb350b1-0cc9-4d29-8333-c67277e45842" path="/var/lib/kubelet/pods/2fb350b1-0cc9-4d29-8333-c67277e45842/volumes" Dec 04 22:35:36.747349 master-0 kubenswrapper[33572]: I1204 22:35:36.747278 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"6dfe0d20-6730-407c-b131-00b753ab28a3","Type":"ContainerStarted","Data":"9db96b7edbfea58983408c8a371129c05dc5daab9d42ca5bc3644a4d5f8338e6"} Dec 04 22:35:36.750007 master-0 kubenswrapper[33572]: I1204 22:35:36.749947 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"df4511d6-1b75-449e-b5e2-754c93cca72c","Type":"ContainerStarted","Data":"2c692068d8f5370502a824c3902aba7b7a0386e5e84e94b48e49433c81d01e23"} Dec 04 22:35:36.762434 master-0 kubenswrapper[33572]: I1204 22:35:36.762367 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-j89sq" event={"ID":"320dd132-b10a-4d56-88a4-307ecb61196f","Type":"ContainerDied","Data":"acc14b8a76e105174a5a9026f416a9ef37d4daac3e621df587edb759ea99c558"} Dec 04 22:35:36.762544 master-0 kubenswrapper[33572]: I1204 22:35:36.762439 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc14b8a76e105174a5a9026f416a9ef37d4daac3e621df587edb759ea99c558" Dec 04 22:35:36.762774 master-0 kubenswrapper[33572]: I1204 22:35:36.762697 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7675d-api-0" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerName="cinder-7675d-api-log" containerID="cri-o://bbc91fb05397d144a428c1a56753d26e44ebf29c7aa72fe800ec67af9832175a" gracePeriod=30 Dec 04 22:35:36.763468 master-0 kubenswrapper[33572]: I1204 22:35:36.763120 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7675d-api-0" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerName="cinder-api" containerID="cri-o://26486bfde804d1b1b956d33edd58535742b07ece5cbbbc5d538e1b304ab6e0e7" gracePeriod=30 Dec 04 22:35:36.765135 master-0 kubenswrapper[33572]: I1204 22:35:36.764609 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-j89sq" Dec 04 22:35:36.794255 master-0 kubenswrapper[33572]: I1204 22:35:36.794169 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-scheduler-0" podStartSLOduration=5.536050888 podStartE2EDuration="6.794143818s" podCreationTimestamp="2025-12-04 22:35:30 +0000 UTC" firstStartedPulling="2025-12-04 22:35:32.779871893 +0000 UTC m=+996.507397542" lastFinishedPulling="2025-12-04 22:35:34.037964823 +0000 UTC m=+997.765490472" observedRunningTime="2025-12-04 22:35:36.767536053 +0000 UTC m=+1000.495061742" watchObservedRunningTime="2025-12-04 22:35:36.794143818 +0000 UTC m=+1000.521669467" Dec 04 22:35:36.825929 master-0 kubenswrapper[33572]: I1204 22:35:36.825801 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-backup-0" podStartSLOduration=4.797803806 podStartE2EDuration="5.825782492s" podCreationTimestamp="2025-12-04 22:35:31 +0000 UTC" firstStartedPulling="2025-12-04 22:35:33.93042083 +0000 UTC m=+997.657946479" lastFinishedPulling="2025-12-04 22:35:34.958399516 +0000 UTC m=+998.685925165" observedRunningTime="2025-12-04 22:35:36.816951161 +0000 UTC m=+1000.544476810" watchObservedRunningTime="2025-12-04 22:35:36.825782492 +0000 UTC m=+1000.553308141" Dec 04 22:35:37.354632 master-0 kubenswrapper[33572]: I1204 22:35:37.354569 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-vp6mx"] Dec 04 22:35:37.355200 master-0 kubenswrapper[33572]: E1204 22:35:37.355070 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320dd132-b10a-4d56-88a4-307ecb61196f" containerName="ironic-db-sync" Dec 04 22:35:37.355200 master-0 kubenswrapper[33572]: I1204 22:35:37.355088 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="320dd132-b10a-4d56-88a4-307ecb61196f" containerName="ironic-db-sync" Dec 04 22:35:37.355200 master-0 kubenswrapper[33572]: E1204 22:35:37.355117 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb350b1-0cc9-4d29-8333-c67277e45842" containerName="init" Dec 04 22:35:37.355200 master-0 kubenswrapper[33572]: I1204 22:35:37.355123 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb350b1-0cc9-4d29-8333-c67277e45842" containerName="init" Dec 04 22:35:37.355200 master-0 kubenswrapper[33572]: E1204 22:35:37.355140 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb350b1-0cc9-4d29-8333-c67277e45842" containerName="dnsmasq-dns" Dec 04 22:35:37.355200 master-0 kubenswrapper[33572]: I1204 22:35:37.355147 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb350b1-0cc9-4d29-8333-c67277e45842" containerName="dnsmasq-dns" Dec 04 22:35:37.355200 master-0 kubenswrapper[33572]: E1204 22:35:37.355182 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320dd132-b10a-4d56-88a4-307ecb61196f" containerName="init" Dec 04 22:35:37.355200 master-0 kubenswrapper[33572]: I1204 22:35:37.355188 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="320dd132-b10a-4d56-88a4-307ecb61196f" containerName="init" Dec 04 22:35:37.355464 master-0 kubenswrapper[33572]: I1204 22:35:37.355384 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb350b1-0cc9-4d29-8333-c67277e45842" containerName="dnsmasq-dns" Dec 04 22:35:37.355464 master-0 kubenswrapper[33572]: I1204 22:35:37.355446 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="320dd132-b10a-4d56-88a4-307ecb61196f" 
containerName="ironic-db-sync" Dec 04 22:35:37.357193 master-0 kubenswrapper[33572]: I1204 22:35:37.356138 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:37.452051 master-0 kubenswrapper[33572]: I1204 22:35:37.449563 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-vp6mx"] Dec 04 22:35:37.474487 master-0 kubenswrapper[33572]: I1204 22:35:37.474055 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvgrz\" (UniqueName: \"kubernetes.io/projected/fbdff964-9c49-48a4-a0f8-7044131a5627-kube-api-access-cvgrz\") pod \"ironic-inspector-db-create-vp6mx\" (UID: \"fbdff964-9c49-48a4-a0f8-7044131a5627\") " pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:37.474487 master-0 kubenswrapper[33572]: I1204 22:35:37.474152 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbdff964-9c49-48a4-a0f8-7044131a5627-operator-scripts\") pod \"ironic-inspector-db-create-vp6mx\" (UID: \"fbdff964-9c49-48a4-a0f8-7044131a5627\") " pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:37.510163 master-0 kubenswrapper[33572]: I1204 22:35:37.509569 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-57f48bc457-9xrbh"] Dec 04 22:35:37.544524 master-0 kubenswrapper[33572]: I1204 22:35:37.537742 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-57f48bc457-9xrbh"] Dec 04 22:35:37.544524 master-0 kubenswrapper[33572]: I1204 22:35:37.537873 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.553353 master-0 kubenswrapper[33572]: I1204 22:35:37.553315 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Dec 04 22:35:37.585322 master-0 kubenswrapper[33572]: I1204 22:35:37.583122 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvgrz\" (UniqueName: \"kubernetes.io/projected/fbdff964-9c49-48a4-a0f8-7044131a5627-kube-api-access-cvgrz\") pod \"ironic-inspector-db-create-vp6mx\" (UID: \"fbdff964-9c49-48a4-a0f8-7044131a5627\") " pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:37.585322 master-0 kubenswrapper[33572]: I1204 22:35:37.583196 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbdff964-9c49-48a4-a0f8-7044131a5627-operator-scripts\") pod \"ironic-inspector-db-create-vp6mx\" (UID: \"fbdff964-9c49-48a4-a0f8-7044131a5627\") " pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:37.603428 master-0 kubenswrapper[33572]: I1204 22:35:37.597653 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbdff964-9c49-48a4-a0f8-7044131a5627-operator-scripts\") pod \"ironic-inspector-db-create-vp6mx\" (UID: \"fbdff964-9c49-48a4-a0f8-7044131a5627\") " pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:37.647600 master-0 kubenswrapper[33572]: I1204 22:35:37.632658 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bf87966c-ng7wp"] Dec 04 22:35:37.647600 master-0 kubenswrapper[33572]: I1204 22:35:37.632940 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" podUID="3ac59098-23a5-4191-acf7-07e7aad0c17c" containerName="dnsmasq-dns" containerID="cri-o://9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2" gracePeriod=10 Dec 04 22:35:37.647600 master-0 kubenswrapper[33572]: I1204 22:35:37.640087 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvgrz\" (UniqueName: \"kubernetes.io/projected/fbdff964-9c49-48a4-a0f8-7044131a5627-kube-api-access-cvgrz\") pod \"ironic-inspector-db-create-vp6mx\" (UID: \"fbdff964-9c49-48a4-a0f8-7044131a5627\") " pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:37.665320 master-0 kubenswrapper[33572]: I1204 22:35:37.650719 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-ab22-account-create-update-pqwpd"] Dec 04 22:35:37.665320 master-0 kubenswrapper[33572]: I1204 22:35:37.652300 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:37.665320 master-0 kubenswrapper[33572]: I1204 22:35:37.661699 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6967585dd7-g4s2p"] Dec 04 22:35:37.665667 master-0 kubenswrapper[33572]: I1204 22:35:37.665648 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.666140 master-0 kubenswrapper[33572]: I1204 22:35:37.666103 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Dec 04 22:35:37.707886 master-0 kubenswrapper[33572]: I1204 22:35:37.690872 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0304754-88f4-4abd-b4ff-2daf5fea19a8-config\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: \"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.707886 master-0 kubenswrapper[33572]: I1204 22:35:37.690971 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhkgw\" (UniqueName: \"kubernetes.io/projected/c0304754-88f4-4abd-b4ff-2daf5fea19a8-kube-api-access-hhkgw\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: \"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.707886 master-0 kubenswrapper[33572]: I1204 22:35:37.691001 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0304754-88f4-4abd-b4ff-2daf5fea19a8-combined-ca-bundle\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: \"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.707886 master-0 kubenswrapper[33572]: I1204 22:35:37.695627 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-ab22-account-create-update-pqwpd"] Dec 04 22:35:37.716328 master-0 kubenswrapper[33572]: I1204 22:35:37.716264 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6967585dd7-g4s2p"] Dec 04 22:35:37.733222 master-0 kubenswrapper[33572]: I1204 22:35:37.731056 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:37.757455 master-0 kubenswrapper[33572]: I1204 22:35:37.756256 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-74d58d9b69-7cpwt"] Dec 04 22:35:37.766579 master-0 kubenswrapper[33572]: I1204 22:35:37.760986 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.766579 master-0 kubenswrapper[33572]: I1204 22:35:37.763212 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Dec 04 22:35:37.766579 master-0 kubenswrapper[33572]: I1204 22:35:37.763431 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Dec 04 22:35:37.766579 master-0 kubenswrapper[33572]: I1204 22:35:37.763704 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Dec 04 22:35:37.766579 master-0 kubenswrapper[33572]: I1204 22:35:37.764735 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 22:35:37.766579 master-0 kubenswrapper[33572]: I1204 22:35:37.764921 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Dec 04 22:35:37.784026 master-0 kubenswrapper[33572]: I1204 22:35:37.781229 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-74d58d9b69-7cpwt"] Dec 04 22:35:37.816253 master-0 kubenswrapper[33572]: I1204 22:35:37.800892 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rchw2\" (UniqueName: \"kubernetes.io/projected/cee3b31f-b225-485b-9675-70162abf39f2-kube-api-access-rchw2\") pod \"ironic-inspector-ab22-account-create-update-pqwpd\" (UID: \"cee3b31f-b225-485b-9675-70162abf39f2\") " pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:37.816479 master-0 kubenswrapper[33572]: I1204 22:35:37.816320 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-svc\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.816479 master-0 kubenswrapper[33572]: I1204 22:35:37.816417 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-swift-storage-0\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.816608 master-0 kubenswrapper[33572]: I1204 22:35:37.816482 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-config\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.816726 master-0 kubenswrapper[33572]: I1204 22:35:37.816693 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0304754-88f4-4abd-b4ff-2daf5fea19a8-config\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: \"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.816778 master-0 kubenswrapper[33572]: I1204 22:35:37.816724 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wbzf\" (UniqueName: 
\"kubernetes.io/projected/6320a890-2938-48d8-a0c1-60ec3d24ae6f-kube-api-access-5wbzf\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.816778 master-0 kubenswrapper[33572]: I1204 22:35:37.816767 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-sb\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.817465 master-0 kubenswrapper[33572]: I1204 22:35:37.817428 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhkgw\" (UniqueName: \"kubernetes.io/projected/c0304754-88f4-4abd-b4ff-2daf5fea19a8-kube-api-access-hhkgw\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: \"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.818150 master-0 kubenswrapper[33572]: I1204 22:35:37.818120 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0304754-88f4-4abd-b4ff-2daf5fea19a8-combined-ca-bundle\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: \"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.818480 master-0 kubenswrapper[33572]: I1204 22:35:37.818200 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee3b31f-b225-485b-9675-70162abf39f2-operator-scripts\") pod \"ironic-inspector-ab22-account-create-update-pqwpd\" (UID: \"cee3b31f-b225-485b-9675-70162abf39f2\") " pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:37.818480 master-0 kubenswrapper[33572]: I1204 22:35:37.818223 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-nb\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.834262 master-0 kubenswrapper[33572]: I1204 22:35:37.831396 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c0304754-88f4-4abd-b4ff-2daf5fea19a8-config\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: \"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.834262 master-0 kubenswrapper[33572]: I1204 22:35:37.831626 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0304754-88f4-4abd-b4ff-2daf5fea19a8-combined-ca-bundle\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: \"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.840591 master-0 kubenswrapper[33572]: I1204 22:35:37.834816 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhkgw\" (UniqueName: \"kubernetes.io/projected/c0304754-88f4-4abd-b4ff-2daf5fea19a8-kube-api-access-hhkgw\") pod \"ironic-neutron-agent-57f48bc457-9xrbh\" (UID: 
\"c0304754-88f4-4abd-b4ff-2daf5fea19a8\") " pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.863816 master-0 kubenswrapper[33572]: I1204 22:35:37.863759 33572 generic.go:334] "Generic (PLEG): container finished" podID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerID="26486bfde804d1b1b956d33edd58535742b07ece5cbbbc5d538e1b304ab6e0e7" exitCode=0 Dec 04 22:35:37.863816 master-0 kubenswrapper[33572]: I1204 22:35:37.863798 33572 generic.go:334] "Generic (PLEG): container finished" podID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerID="bbc91fb05397d144a428c1a56753d26e44ebf29c7aa72fe800ec67af9832175a" exitCode=143 Dec 04 22:35:37.864086 master-0 kubenswrapper[33572]: I1204 22:35:37.863839 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"40b2cc3d-21a7-4959-af99-7224b50a9d94","Type":"ContainerDied","Data":"26486bfde804d1b1b956d33edd58535742b07ece5cbbbc5d538e1b304ab6e0e7"} Dec 04 22:35:37.864086 master-0 kubenswrapper[33572]: I1204 22:35:37.863867 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"40b2cc3d-21a7-4959-af99-7224b50a9d94","Type":"ContainerDied","Data":"bbc91fb05397d144a428c1a56753d26e44ebf29c7aa72fe800ec67af9832175a"} Dec 04 22:35:37.864086 master-0 kubenswrapper[33572]: I1204 22:35:37.863938 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"40b2cc3d-21a7-4959-af99-7224b50a9d94","Type":"ContainerDied","Data":"756a1cb7f0d9273db96eb9225502913cb152b26c07abba87f830224e53574a0d"} Dec 04 22:35:37.864086 master-0 kubenswrapper[33572]: I1204 22:35:37.863950 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756a1cb7f0d9273db96eb9225502913cb152b26c07abba87f830224e53574a0d" Dec 04 22:35:37.867649 master-0 kubenswrapper[33572]: I1204 22:35:37.867241 33572 generic.go:334] "Generic (PLEG): container finished" podID="3ac59098-23a5-4191-acf7-07e7aad0c17c" containerID="9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2" exitCode=0 Dec 04 22:35:37.868847 master-0 kubenswrapper[33572]: I1204 22:35:37.868775 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" event={"ID":"3ac59098-23a5-4191-acf7-07e7aad0c17c","Type":"ContainerDied","Data":"9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2"} Dec 04 22:35:37.900219 master-0 kubenswrapper[33572]: I1204 22:35:37.900104 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:37.931165 master-0 kubenswrapper[33572]: I1204 22:35:37.931084 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.945816 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-sb\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.946148 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-combined-ca-bundle\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.946290 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee3b31f-b225-485b-9675-70162abf39f2-operator-scripts\") pod \"ironic-inspector-ab22-account-create-update-pqwpd\" (UID: \"cee3b31f-b225-485b-9675-70162abf39f2\") " pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.946317 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-nb\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.946394 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rchw2\" (UniqueName: \"kubernetes.io/projected/cee3b31f-b225-485b-9675-70162abf39f2-kube-api-access-rchw2\") pod \"ironic-inspector-ab22-account-create-update-pqwpd\" (UID: \"cee3b31f-b225-485b-9675-70162abf39f2\") " pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.946445 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-svc\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.949587 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbrsr\" (UniqueName: \"kubernetes.io/projected/578da66e-114f-4fb7-ace1-ae07e428e2b3-kube-api-access-wbrsr\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.949779 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-swift-storage-0\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.949940 33572 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-merged\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.950114 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-config\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.950384 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.950443 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-scripts\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.950552 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-custom\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.950987 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/578da66e-114f-4fb7-ace1-ae07e428e2b3-etc-podinfo\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.951138 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-logs\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:37.952064 master-0 kubenswrapper[33572]: I1204 22:35:37.951311 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wbzf\" (UniqueName: \"kubernetes.io/projected/6320a890-2938-48d8-a0c1-60ec3d24ae6f-kube-api-access-5wbzf\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.962385 master-0 kubenswrapper[33572]: I1204 22:35:37.956337 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-nb\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.962385 
master-0 kubenswrapper[33572]: I1204 22:35:37.956672 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-svc\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.962385 master-0 kubenswrapper[33572]: I1204 22:35:37.957340 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-config\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.962385 master-0 kubenswrapper[33572]: I1204 22:35:37.957847 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee3b31f-b225-485b-9675-70162abf39f2-operator-scripts\") pod \"ironic-inspector-ab22-account-create-update-pqwpd\" (UID: \"cee3b31f-b225-485b-9675-70162abf39f2\") " pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:37.962385 master-0 kubenswrapper[33572]: I1204 22:35:37.957982 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-sb\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.978527 master-0 kubenswrapper[33572]: I1204 22:35:37.975071 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-swift-storage-0\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:37.986194 master-0 kubenswrapper[33572]: I1204 22:35:37.986136 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rchw2\" (UniqueName: \"kubernetes.io/projected/cee3b31f-b225-485b-9675-70162abf39f2-kube-api-access-rchw2\") pod \"ironic-inspector-ab22-account-create-update-pqwpd\" (UID: \"cee3b31f-b225-485b-9675-70162abf39f2\") " pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:37.993172 master-0 kubenswrapper[33572]: I1204 22:35:37.992770 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wbzf\" (UniqueName: \"kubernetes.io/projected/6320a890-2938-48d8-a0c1-60ec3d24ae6f-kube-api-access-5wbzf\") pod \"dnsmasq-dns-6967585dd7-g4s2p\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:38.050646 master-0 kubenswrapper[33572]: E1204 22:35:38.048443 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ac59098_23a5_4191_acf7_07e7aad0c17c.slice/crio-conmon-9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ac59098_23a5_4191_acf7_07e7aad0c17c.slice/crio-9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:35:38.055460 master-0 
kubenswrapper[33572]: I1204 22:35:38.053970 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrx8z\" (UniqueName: \"kubernetes.io/projected/40b2cc3d-21a7-4959-af99-7224b50a9d94-kube-api-access-rrx8z\") pod \"40b2cc3d-21a7-4959-af99-7224b50a9d94\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " Dec 04 22:35:38.055460 master-0 kubenswrapper[33572]: I1204 22:35:38.054052 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data\") pod \"40b2cc3d-21a7-4959-af99-7224b50a9d94\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " Dec 04 22:35:38.055460 master-0 kubenswrapper[33572]: I1204 22:35:38.054164 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b2cc3d-21a7-4959-af99-7224b50a9d94-logs\") pod \"40b2cc3d-21a7-4959-af99-7224b50a9d94\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " Dec 04 22:35:38.055460 master-0 kubenswrapper[33572]: I1204 22:35:38.054274 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-combined-ca-bundle\") pod \"40b2cc3d-21a7-4959-af99-7224b50a9d94\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " Dec 04 22:35:38.055460 master-0 kubenswrapper[33572]: I1204 22:35:38.054374 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-scripts\") pod \"40b2cc3d-21a7-4959-af99-7224b50a9d94\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " Dec 04 22:35:38.055460 master-0 kubenswrapper[33572]: I1204 22:35:38.054843 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b2cc3d-21a7-4959-af99-7224b50a9d94-logs" (OuterVolumeSpecName: "logs") pod "40b2cc3d-21a7-4959-af99-7224b50a9d94" (UID: "40b2cc3d-21a7-4959-af99-7224b50a9d94"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:35:38.055460 master-0 kubenswrapper[33572]: I1204 22:35:38.055190 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40b2cc3d-21a7-4959-af99-7224b50a9d94-etc-machine-id\") pod \"40b2cc3d-21a7-4959-af99-7224b50a9d94\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " Dec 04 22:35:38.055460 master-0 kubenswrapper[33572]: I1204 22:35:38.055275 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data-custom\") pod \"40b2cc3d-21a7-4959-af99-7224b50a9d94\" (UID: \"40b2cc3d-21a7-4959-af99-7224b50a9d94\") " Dec 04 22:35:38.056039 master-0 kubenswrapper[33572]: I1204 22:35:38.055931 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-merged\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.056039 master-0 kubenswrapper[33572]: I1204 22:35:38.056018 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.056162 master-0 kubenswrapper[33572]: I1204 22:35:38.056059 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-scripts\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.056162 master-0 kubenswrapper[33572]: I1204 22:35:38.056078 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-custom\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.056162 master-0 kubenswrapper[33572]: I1204 22:35:38.056146 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/578da66e-114f-4fb7-ace1-ae07e428e2b3-etc-podinfo\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.056293 master-0 kubenswrapper[33572]: I1204 22:35:38.056166 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-logs\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.056341 master-0 kubenswrapper[33572]: I1204 22:35:38.056318 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-combined-ca-bundle\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.056586 master-0 
kubenswrapper[33572]: I1204 22:35:38.056543 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbrsr\" (UniqueName: \"kubernetes.io/projected/578da66e-114f-4fb7-ace1-ae07e428e2b3-kube-api-access-wbrsr\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.056668 master-0 kubenswrapper[33572]: I1204 22:35:38.056623 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40b2cc3d-21a7-4959-af99-7224b50a9d94-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.058300 master-0 kubenswrapper[33572]: I1204 22:35:38.057052 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40b2cc3d-21a7-4959-af99-7224b50a9d94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "40b2cc3d-21a7-4959-af99-7224b50a9d94" (UID: "40b2cc3d-21a7-4959-af99-7224b50a9d94"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:38.058632 master-0 kubenswrapper[33572]: I1204 22:35:38.058586 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b2cc3d-21a7-4959-af99-7224b50a9d94-kube-api-access-rrx8z" (OuterVolumeSpecName: "kube-api-access-rrx8z") pod "40b2cc3d-21a7-4959-af99-7224b50a9d94" (UID: "40b2cc3d-21a7-4959-af99-7224b50a9d94"). InnerVolumeSpecName "kube-api-access-rrx8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:38.060572 master-0 kubenswrapper[33572]: I1204 22:35:38.060540 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-merged\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.062930 master-0 kubenswrapper[33572]: I1204 22:35:38.062887 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-logs\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.064478 master-0 kubenswrapper[33572]: I1204 22:35:38.064376 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "40b2cc3d-21a7-4959-af99-7224b50a9d94" (UID: "40b2cc3d-21a7-4959-af99-7224b50a9d94"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:38.067746 master-0 kubenswrapper[33572]: I1204 22:35:38.067691 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/578da66e-114f-4fb7-ace1-ae07e428e2b3-etc-podinfo\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.074948 master-0 kubenswrapper[33572]: E1204 22:35:38.074714 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ac59098_23a5_4191_acf7_07e7aad0c17c.slice/crio-9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ac59098_23a5_4191_acf7_07e7aad0c17c.slice/crio-conmon-9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:35:38.081759 master-0 kubenswrapper[33572]: I1204 22:35:38.081649 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-combined-ca-bundle\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.101531 master-0 kubenswrapper[33572]: I1204 22:35:38.083844 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbrsr\" (UniqueName: \"kubernetes.io/projected/578da66e-114f-4fb7-ace1-ae07e428e2b3-kube-api-access-wbrsr\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.104526 master-0 kubenswrapper[33572]: I1204 22:35:38.103951 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-scripts\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.104526 master-0 kubenswrapper[33572]: I1204 22:35:38.104376 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.120353 master-0 kubenswrapper[33572]: I1204 22:35:38.115166 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-custom\") pod \"ironic-74d58d9b69-7cpwt\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.120353 master-0 kubenswrapper[33572]: I1204 22:35:38.115284 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-scripts" (OuterVolumeSpecName: "scripts") pod "40b2cc3d-21a7-4959-af99-7224b50a9d94" (UID: "40b2cc3d-21a7-4959-af99-7224b50a9d94"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:38.195052 master-0 kubenswrapper[33572]: I1204 22:35:38.158977 33572 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40b2cc3d-21a7-4959-af99-7224b50a9d94-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.195052 master-0 kubenswrapper[33572]: I1204 22:35:38.159009 33572 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.195052 master-0 kubenswrapper[33572]: I1204 22:35:38.159021 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrx8z\" (UniqueName: \"kubernetes.io/projected/40b2cc3d-21a7-4959-af99-7224b50a9d94-kube-api-access-rrx8z\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.195052 master-0 kubenswrapper[33572]: I1204 22:35:38.159031 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.195052 master-0 kubenswrapper[33572]: I1204 22:35:38.178619 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:38.195052 master-0 kubenswrapper[33572]: I1204 22:35:38.194850 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40b2cc3d-21a7-4959-af99-7224b50a9d94" (UID: "40b2cc3d-21a7-4959-af99-7224b50a9d94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:38.195885 master-0 kubenswrapper[33572]: I1204 22:35:38.195051 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:38.263391 master-0 kubenswrapper[33572]: I1204 22:35:38.263143 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.284054 master-0 kubenswrapper[33572]: I1204 22:35:38.283968 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data" (OuterVolumeSpecName: "config-data") pod "40b2cc3d-21a7-4959-af99-7224b50a9d94" (UID: "40b2cc3d-21a7-4959-af99-7224b50a9d94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:38.304029 master-0 kubenswrapper[33572]: I1204 22:35:38.303573 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:38.366450 master-0 kubenswrapper[33572]: I1204 22:35:38.366396 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40b2cc3d-21a7-4959-af99-7224b50a9d94-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.441763 master-0 kubenswrapper[33572]: I1204 22:35:38.438431 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:38.587414 master-0 kubenswrapper[33572]: I1204 22:35:38.586579 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-swift-storage-0\") pod \"3ac59098-23a5-4191-acf7-07e7aad0c17c\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " Dec 04 22:35:38.587414 master-0 kubenswrapper[33572]: I1204 22:35:38.586717 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-sb\") pod \"3ac59098-23a5-4191-acf7-07e7aad0c17c\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " Dec 04 22:35:38.587414 master-0 kubenswrapper[33572]: I1204 22:35:38.586855 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-config\") pod \"3ac59098-23a5-4191-acf7-07e7aad0c17c\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " Dec 04 22:35:38.587414 master-0 kubenswrapper[33572]: I1204 22:35:38.586952 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-nb\") pod \"3ac59098-23a5-4191-acf7-07e7aad0c17c\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " Dec 04 22:35:38.587414 master-0 kubenswrapper[33572]: I1204 22:35:38.587055 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-svc\") pod \"3ac59098-23a5-4191-acf7-07e7aad0c17c\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " Dec 04 22:35:38.587414 master-0 kubenswrapper[33572]: I1204 22:35:38.587088 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrvg9\" (UniqueName: \"kubernetes.io/projected/3ac59098-23a5-4191-acf7-07e7aad0c17c-kube-api-access-nrvg9\") pod \"3ac59098-23a5-4191-acf7-07e7aad0c17c\" (UID: \"3ac59098-23a5-4191-acf7-07e7aad0c17c\") " Dec 04 22:35:38.597935 master-0 kubenswrapper[33572]: I1204 22:35:38.596227 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac59098-23a5-4191-acf7-07e7aad0c17c-kube-api-access-nrvg9" (OuterVolumeSpecName: "kube-api-access-nrvg9") pod "3ac59098-23a5-4191-acf7-07e7aad0c17c" (UID: "3ac59098-23a5-4191-acf7-07e7aad0c17c"). InnerVolumeSpecName "kube-api-access-nrvg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:38.640752 master-0 kubenswrapper[33572]: I1204 22:35:38.639953 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-vp6mx"] Dec 04 22:35:38.689443 master-0 kubenswrapper[33572]: I1204 22:35:38.689373 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-57f48bc457-9xrbh"] Dec 04 22:35:38.692255 master-0 kubenswrapper[33572]: I1204 22:35:38.692217 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrvg9\" (UniqueName: \"kubernetes.io/projected/3ac59098-23a5-4191-acf7-07e7aad0c17c-kube-api-access-nrvg9\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.711418 master-0 kubenswrapper[33572]: W1204 22:35:38.711196 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbdff964_9c49_48a4_a0f8_7044131a5627.slice/crio-30fb9f886428e5842fa0a8e865c32b69bc10c0dd0e08397dbf5fcaf45769a46e WatchSource:0}: Error finding container 30fb9f886428e5842fa0a8e865c32b69bc10c0dd0e08397dbf5fcaf45769a46e: Status 404 returned error can't find the container with id 30fb9f886428e5842fa0a8e865c32b69bc10c0dd0e08397dbf5fcaf45769a46e Dec 04 22:35:38.711895 master-0 kubenswrapper[33572]: I1204 22:35:38.711854 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3ac59098-23a5-4191-acf7-07e7aad0c17c" (UID: "3ac59098-23a5-4191-acf7-07e7aad0c17c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:38.712139 master-0 kubenswrapper[33572]: I1204 22:35:38.711816 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3ac59098-23a5-4191-acf7-07e7aad0c17c" (UID: "3ac59098-23a5-4191-acf7-07e7aad0c17c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:38.728556 master-0 kubenswrapper[33572]: I1204 22:35:38.728062 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-config" (OuterVolumeSpecName: "config") pod "3ac59098-23a5-4191-acf7-07e7aad0c17c" (UID: "3ac59098-23a5-4191-acf7-07e7aad0c17c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:38.772213 master-0 kubenswrapper[33572]: I1204 22:35:38.771591 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3ac59098-23a5-4191-acf7-07e7aad0c17c" (UID: "3ac59098-23a5-4191-acf7-07e7aad0c17c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:38.796930 master-0 kubenswrapper[33572]: I1204 22:35:38.796267 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.796930 master-0 kubenswrapper[33572]: I1204 22:35:38.796312 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.796930 master-0 kubenswrapper[33572]: I1204 22:35:38.796326 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.796930 master-0 kubenswrapper[33572]: I1204 22:35:38.796335 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.801908 master-0 kubenswrapper[33572]: I1204 22:35:38.801577 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3ac59098-23a5-4191-acf7-07e7aad0c17c" (UID: "3ac59098-23a5-4191-acf7-07e7aad0c17c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:38.886612 master-0 kubenswrapper[33572]: I1204 22:35:38.884668 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-ab22-account-create-update-pqwpd"] Dec 04 22:35:38.889132 master-0 kubenswrapper[33572]: I1204 22:35:38.889085 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" event={"ID":"3ac59098-23a5-4191-acf7-07e7aad0c17c","Type":"ContainerDied","Data":"d7f27dc7027ea547ce1e4408cac1df385c8c377e16be84a718ac9ce13a6b8c2b"} Dec 04 22:35:38.889234 master-0 kubenswrapper[33572]: I1204 22:35:38.889140 33572 scope.go:117] "RemoveContainer" containerID="9c8281a52166c9bd04899dbb199b1d89b6df5d9fc55bea8ccd7e163ee64f89d2" Dec 04 22:35:38.889276 master-0 kubenswrapper[33572]: I1204 22:35:38.889257 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bf87966c-ng7wp" Dec 04 22:35:38.907406 master-0 kubenswrapper[33572]: I1204 22:35:38.897836 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ac59098-23a5-4191-acf7-07e7aad0c17c-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:38.910964 master-0 kubenswrapper[33572]: I1204 22:35:38.910918 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-vp6mx" event={"ID":"fbdff964-9c49-48a4-a0f8-7044131a5627","Type":"ContainerStarted","Data":"30fb9f886428e5842fa0a8e865c32b69bc10c0dd0e08397dbf5fcaf45769a46e"} Dec 04 22:35:38.933745 master-0 kubenswrapper[33572]: I1204 22:35:38.933193 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:38.933745 master-0 kubenswrapper[33572]: I1204 22:35:38.933305 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" event={"ID":"c0304754-88f4-4abd-b4ff-2daf5fea19a8","Type":"ContainerStarted","Data":"90c7c88f3154e8ac4fd2741c209dd61b2b2864a6d9fed427cb957217c28a2781"} Dec 04 22:35:38.938149 master-0 kubenswrapper[33572]: I1204 22:35:38.938119 33572 scope.go:117] "RemoveContainer" containerID="602d579d7d64da1adab8b0266b0ff26f3b6ad1eebe55a8fa7364325bbe0f35f2" Dec 04 22:35:39.055092 master-0 kubenswrapper[33572]: I1204 22:35:39.054772 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bf87966c-ng7wp"] Dec 04 22:35:39.132652 master-0 kubenswrapper[33572]: I1204 22:35:39.120553 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bf87966c-ng7wp"] Dec 04 22:35:39.136544 master-0 kubenswrapper[33572]: I1204 22:35:39.136463 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:39.147871 master-0 kubenswrapper[33572]: I1204 22:35:39.147733 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.168812 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: E1204 22:35:39.169359 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac59098-23a5-4191-acf7-07e7aad0c17c" containerName="dnsmasq-dns" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.169374 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac59098-23a5-4191-acf7-07e7aad0c17c" containerName="dnsmasq-dns" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: E1204 22:35:39.169398 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerName="cinder-api" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.169405 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerName="cinder-api" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: E1204 22:35:39.169424 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac59098-23a5-4191-acf7-07e7aad0c17c" containerName="init" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.169431 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac59098-23a5-4191-acf7-07e7aad0c17c" containerName="init" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: E1204 22:35:39.169443 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerName="cinder-7675d-api-log" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.169449 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerName="cinder-7675d-api-log" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.169730 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerName="cinder-7675d-api-log" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.169771 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" containerName="cinder-api" 
Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.169787 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac59098-23a5-4191-acf7-07e7aad0c17c" containerName="dnsmasq-dns" Dec 04 22:35:39.173341 master-0 kubenswrapper[33572]: I1204 22:35:39.171000 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.174037 master-0 kubenswrapper[33572]: I1204 22:35:39.173833 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 04 22:35:39.174037 master-0 kubenswrapper[33572]: I1204 22:35:39.173995 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-api-config-data" Dec 04 22:35:39.174130 master-0 kubenswrapper[33572]: I1204 22:35:39.174104 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 04 22:35:39.180634 master-0 kubenswrapper[33572]: I1204 22:35:39.180560 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6967585dd7-g4s2p"] Dec 04 22:35:39.191998 master-0 kubenswrapper[33572]: I1204 22:35:39.191938 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-74d58d9b69-7cpwt"] Dec 04 22:35:39.202903 master-0 kubenswrapper[33572]: I1204 22:35:39.202847 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.314431 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4568378-b046-4ee7-8de7-5d5c1f31751f-etc-machine-id\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.314557 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4568378-b046-4ee7-8de7-5d5c1f31751f-logs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.314751 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-public-tls-certs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.314784 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-internal-tls-certs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.314868 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-combined-ca-bundle\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.314951 33572 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-config-data-custom\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.315142 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-scripts\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.315223 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7vqm\" (UniqueName: \"kubernetes.io/projected/b4568378-b046-4ee7-8de7-5d5c1f31751f-kube-api-access-q7vqm\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.315798 master-0 kubenswrapper[33572]: I1204 22:35:39.315388 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-config-data\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.417599 master-0 kubenswrapper[33572]: I1204 22:35:39.417540 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-combined-ca-bundle\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.417789 master-0 kubenswrapper[33572]: I1204 22:35:39.417624 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-config-data-custom\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.417789 master-0 kubenswrapper[33572]: I1204 22:35:39.417671 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-scripts\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.417789 master-0 kubenswrapper[33572]: I1204 22:35:39.417717 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7vqm\" (UniqueName: \"kubernetes.io/projected/b4568378-b046-4ee7-8de7-5d5c1f31751f-kube-api-access-q7vqm\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.417879 master-0 kubenswrapper[33572]: I1204 22:35:39.417777 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-config-data\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.417879 master-0 kubenswrapper[33572]: I1204 22:35:39.417830 33572 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4568378-b046-4ee7-8de7-5d5c1f31751f-etc-machine-id\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.417941 master-0 kubenswrapper[33572]: I1204 22:35:39.417888 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4568378-b046-4ee7-8de7-5d5c1f31751f-logs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.418017 master-0 kubenswrapper[33572]: I1204 22:35:39.418003 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-public-tls-certs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.418077 master-0 kubenswrapper[33572]: I1204 22:35:39.418022 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-internal-tls-certs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.421630 master-0 kubenswrapper[33572]: I1204 22:35:39.421581 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b4568378-b046-4ee7-8de7-5d5c1f31751f-etc-machine-id\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.421780 master-0 kubenswrapper[33572]: I1204 22:35:39.421742 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4568378-b046-4ee7-8de7-5d5c1f31751f-logs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.422861 master-0 kubenswrapper[33572]: I1204 22:35:39.422802 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-config-data\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.423321 master-0 kubenswrapper[33572]: I1204 22:35:39.423273 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-internal-tls-certs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.424705 master-0 kubenswrapper[33572]: I1204 22:35:39.424537 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-config-data-custom\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.429185 master-0 kubenswrapper[33572]: I1204 22:35:39.429148 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-scripts\") pod \"cinder-7675d-api-0\" (UID: 
\"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.430483 master-0 kubenswrapper[33572]: I1204 22:35:39.430460 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-combined-ca-bundle\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.431402 master-0 kubenswrapper[33572]: I1204 22:35:39.431381 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4568378-b046-4ee7-8de7-5d5c1f31751f-public-tls-certs\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.448959 master-0 kubenswrapper[33572]: I1204 22:35:39.446790 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7vqm\" (UniqueName: \"kubernetes.io/projected/b4568378-b046-4ee7-8de7-5d5c1f31751f-kube-api-access-q7vqm\") pod \"cinder-7675d-api-0\" (UID: \"b4568378-b046-4ee7-8de7-5d5c1f31751f\") " pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.631557 master-0 kubenswrapper[33572]: I1204 22:35:39.631329 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:39.635833 master-0 kubenswrapper[33572]: I1204 22:35:39.635803 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Dec 04 22:35:39.646134 master-0 kubenswrapper[33572]: I1204 22:35:39.639460 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Dec 04 22:35:39.646134 master-0 kubenswrapper[33572]: I1204 22:35:39.642434 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Dec 04 22:35:39.646134 master-0 kubenswrapper[33572]: I1204 22:35:39.642844 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Dec 04 22:35:39.657033 master-0 kubenswrapper[33572]: I1204 22:35:39.656933 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Dec 04 22:35:39.732305 master-0 kubenswrapper[33572]: I1204 22:35:39.732232 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-scripts\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.732682 master-0 kubenswrapper[33572]: I1204 22:35:39.732375 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.732682 master-0 kubenswrapper[33572]: I1204 22:35:39.732440 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.732682 master-0 
kubenswrapper[33572]: I1204 22:35:39.732530 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsg46\" (UniqueName: \"kubernetes.io/projected/5c2c1216-db02-4bb8-856c-a56a68e06d23-kube-api-access-qsg46\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.732682 master-0 kubenswrapper[33572]: I1204 22:35:39.732596 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c2c1216-db02-4bb8-856c-a56a68e06d23-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.732852 master-0 kubenswrapper[33572]: I1204 22:35:39.732806 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.732934 master-0 kubenswrapper[33572]: I1204 22:35:39.732904 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.732982 master-0 kubenswrapper[33572]: I1204 22:35:39.732947 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ffde0b45-b45d-48a5-bbb7-beee4ebd7250\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e77e0d9-de88-43fb-9d41-8485f8694849\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.844171 master-0 kubenswrapper[33572]: I1204 22:35:39.844054 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.844827 master-0 kubenswrapper[33572]: I1204 22:35:39.844728 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ffde0b45-b45d-48a5-bbb7-beee4ebd7250\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e77e0d9-de88-43fb-9d41-8485f8694849\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.846212 master-0 kubenswrapper[33572]: I1204 22:35:39.845081 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-scripts\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.846212 master-0 kubenswrapper[33572]: I1204 22:35:39.845118 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 
22:35:39.846212 master-0 kubenswrapper[33572]: I1204 22:35:39.845361 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.846212 master-0 kubenswrapper[33572]: I1204 22:35:39.845511 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.846212 master-0 kubenswrapper[33572]: I1204 22:35:39.845711 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsg46\" (UniqueName: \"kubernetes.io/projected/5c2c1216-db02-4bb8-856c-a56a68e06d23-kube-api-access-qsg46\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.846426 master-0 kubenswrapper[33572]: I1204 22:35:39.846267 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c2c1216-db02-4bb8-856c-a56a68e06d23-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.847818 master-0 kubenswrapper[33572]: I1204 22:35:39.846679 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.850939 master-0 kubenswrapper[33572]: I1204 22:35:39.849663 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.850939 master-0 kubenswrapper[33572]: I1204 22:35:39.849784 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-scripts\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.850939 master-0 kubenswrapper[33572]: I1204 22:35:39.850822 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 22:35:39.850939 master-0 kubenswrapper[33572]: I1204 22:35:39.850849 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ffde0b45-b45d-48a5-bbb7-beee4ebd7250\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e77e0d9-de88-43fb-9d41-8485f8694849\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/3a81b3c13ca8bf59599dfef0a0dded0a8b512a7db64c2437d161d8688ec110ec/globalmount\"" pod="openstack/ironic-conductor-0" Dec 04 22:35:39.856852 master-0 kubenswrapper[33572]: I1204 22:35:39.856740 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5c2c1216-db02-4bb8-856c-a56a68e06d23-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.863029 master-0 kubenswrapper[33572]: I1204 22:35:39.862975 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-config-data\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.867633 master-0 kubenswrapper[33572]: I1204 22:35:39.867347 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c2c1216-db02-4bb8-856c-a56a68e06d23-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.871231 master-0 kubenswrapper[33572]: I1204 22:35:39.871195 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsg46\" (UniqueName: \"kubernetes.io/projected/5c2c1216-db02-4bb8-856c-a56a68e06d23-kube-api-access-qsg46\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:39.968522 master-0 kubenswrapper[33572]: I1204 22:35:39.968056 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74d58d9b69-7cpwt" event={"ID":"578da66e-114f-4fb7-ace1-ae07e428e2b3","Type":"ContainerStarted","Data":"903cd1783a9d27f059acbd4bf10f0be207ad3eb5e5696e9651b271f1ee075065"} Dec 04 22:35:39.988566 master-0 kubenswrapper[33572]: I1204 22:35:39.969428 33572 generic.go:334] "Generic (PLEG): container finished" podID="fbdff964-9c49-48a4-a0f8-7044131a5627" containerID="849da94721ea8e63b0b4991267e31331a5a838d1865e205d04bfa2710bd2b223" exitCode=0 Dec 04 22:35:39.988566 master-0 kubenswrapper[33572]: I1204 22:35:39.969469 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-vp6mx" event={"ID":"fbdff964-9c49-48a4-a0f8-7044131a5627","Type":"ContainerDied","Data":"849da94721ea8e63b0b4991267e31331a5a838d1865e205d04bfa2710bd2b223"} Dec 04 22:35:39.988566 master-0 kubenswrapper[33572]: I1204 22:35:39.981065 33572 generic.go:334] "Generic (PLEG): container finished" podID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" containerID="fee7c9db32cace22190856c6fa1a721fec110e11436888a08215dab910970945" exitCode=0 Dec 04 22:35:39.988566 master-0 kubenswrapper[33572]: I1204 22:35:39.981199 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" 
event={"ID":"6320a890-2938-48d8-a0c1-60ec3d24ae6f","Type":"ContainerDied","Data":"fee7c9db32cace22190856c6fa1a721fec110e11436888a08215dab910970945"} Dec 04 22:35:39.988566 master-0 kubenswrapper[33572]: I1204 22:35:39.981232 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" event={"ID":"6320a890-2938-48d8-a0c1-60ec3d24ae6f","Type":"ContainerStarted","Data":"2bf0706b2df6f0fe0d13b49487ac2b6279b49ce1d6ec943951e98a0d0c97db61"} Dec 04 22:35:40.020041 master-0 kubenswrapper[33572]: I1204 22:35:40.019978 33572 generic.go:334] "Generic (PLEG): container finished" podID="cee3b31f-b225-485b-9675-70162abf39f2" containerID="e0f150fa83bea57a5a2417465b010b1a258fc7779fd43e1dec7600654531dc50" exitCode=0 Dec 04 22:35:40.020041 master-0 kubenswrapper[33572]: I1204 22:35:40.020033 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" event={"ID":"cee3b31f-b225-485b-9675-70162abf39f2","Type":"ContainerDied","Data":"e0f150fa83bea57a5a2417465b010b1a258fc7779fd43e1dec7600654531dc50"} Dec 04 22:35:40.020270 master-0 kubenswrapper[33572]: I1204 22:35:40.020060 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" event={"ID":"cee3b31f-b225-485b-9675-70162abf39f2","Type":"ContainerStarted","Data":"f1186a27212de11e82680c1c44eb4b9d83a77027467500f86c41a81e62abbb45"} Dec 04 22:35:40.255669 master-0 kubenswrapper[33572]: I1204 22:35:40.252748 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-api-0"] Dec 04 22:35:40.543640 master-0 kubenswrapper[33572]: I1204 22:35:40.543375 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac59098-23a5-4191-acf7-07e7aad0c17c" path="/var/lib/kubelet/pods/3ac59098-23a5-4191-acf7-07e7aad0c17c/volumes" Dec 04 22:35:40.544457 master-0 kubenswrapper[33572]: I1204 22:35:40.544425 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b2cc3d-21a7-4959-af99-7224b50a9d94" path="/var/lib/kubelet/pods/40b2cc3d-21a7-4959-af99-7224b50a9d94/volumes" Dec 04 22:35:41.006563 master-0 kubenswrapper[33572]: I1204 22:35:41.003919 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-7c5df64bd5-tsrwx"] Dec 04 22:35:41.006563 master-0 kubenswrapper[33572]: I1204 22:35:41.006181 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.014554 master-0 kubenswrapper[33572]: I1204 22:35:41.010018 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Dec 04 22:35:41.014554 master-0 kubenswrapper[33572]: I1204 22:35:41.010275 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Dec 04 22:35:41.026989 master-0 kubenswrapper[33572]: I1204 22:35:41.026896 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7c5df64bd5-tsrwx"] Dec 04 22:35:41.060986 master-0 kubenswrapper[33572]: I1204 22:35:41.060919 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"b4568378-b046-4ee7-8de7-5d5c1f31751f","Type":"ContainerStarted","Data":"1bd88e567aac974f1d9796fa20a905cd6ccb68b297c5ac8feebfdaefbe0a1e36"} Dec 04 22:35:41.064031 master-0 kubenswrapper[33572]: I1204 22:35:41.063989 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" event={"ID":"6320a890-2938-48d8-a0c1-60ec3d24ae6f","Type":"ContainerStarted","Data":"49a2aae73c5770bf4ff6df828527d4cb612e447c1e972e69068684b4ed20bda1"} Dec 04 22:35:41.064769 master-0 kubenswrapper[33572]: I1204 22:35:41.064148 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:41.087036 master-0 kubenswrapper[33572]: I1204 22:35:41.086961 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-scripts\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.087289 master-0 kubenswrapper[33572]: I1204 22:35:41.087175 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3321093f-53af-4bf5-86ea-0e043e2fa780-etc-podinfo\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.087289 master-0 kubenswrapper[33572]: I1204 22:35:41.087229 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3321093f-53af-4bf5-86ea-0e043e2fa780-logs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.087482 master-0 kubenswrapper[33572]: I1204 22:35:41.087448 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ld86\" (UniqueName: \"kubernetes.io/projected/3321093f-53af-4bf5-86ea-0e043e2fa780-kube-api-access-5ld86\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.087677 master-0 kubenswrapper[33572]: I1204 22:35:41.087647 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.087803 master-0 kubenswrapper[33572]: I1204 22:35:41.087775 33572 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-combined-ca-bundle\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.087889 master-0 kubenswrapper[33572]: I1204 22:35:41.087858 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-public-tls-certs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.087889 master-0 kubenswrapper[33572]: I1204 22:35:41.087891 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data-merged\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.088018 master-0 kubenswrapper[33572]: I1204 22:35:41.087976 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-internal-tls-certs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.088018 master-0 kubenswrapper[33572]: I1204 22:35:41.088006 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data-custom\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.089324 master-0 kubenswrapper[33572]: I1204 22:35:41.088811 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" podStartSLOduration=4.08879798 podStartE2EDuration="4.08879798s" podCreationTimestamp="2025-12-04 22:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:41.0884212 +0000 UTC m=+1004.815946849" watchObservedRunningTime="2025-12-04 22:35:41.08879798 +0000 UTC m=+1004.816323649" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.190314 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-scripts\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.190390 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3321093f-53af-4bf5-86ea-0e043e2fa780-etc-podinfo\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.190415 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3321093f-53af-4bf5-86ea-0e043e2fa780-logs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.190701 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ld86\" (UniqueName: \"kubernetes.io/projected/3321093f-53af-4bf5-86ea-0e043e2fa780-kube-api-access-5ld86\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.190900 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.190937 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3321093f-53af-4bf5-86ea-0e043e2fa780-logs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.191025 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-combined-ca-bundle\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.191080 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-public-tls-certs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.191117 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data-merged\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.191180 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-internal-tls-certs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.191275 master-0 kubenswrapper[33572]: I1204 22:35:41.191199 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data-custom\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.193081 master-0 kubenswrapper[33572]: I1204 22:35:41.193042 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data-merged\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.195065 master-0 kubenswrapper[33572]: I1204 22:35:41.195038 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3321093f-53af-4bf5-86ea-0e043e2fa780-etc-podinfo\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.195540 master-0 kubenswrapper[33572]: I1204 22:35:41.195468 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-internal-tls-certs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.196441 master-0 kubenswrapper[33572]: I1204 22:35:41.196417 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-combined-ca-bundle\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.196887 master-0 kubenswrapper[33572]: I1204 22:35:41.196861 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data-custom\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.197263 master-0 kubenswrapper[33572]: I1204 22:35:41.197225 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-config-data\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.197756 master-0 kubenswrapper[33572]: I1204 22:35:41.197593 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-public-tls-certs\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.203355 master-0 kubenswrapper[33572]: I1204 22:35:41.203317 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3321093f-53af-4bf5-86ea-0e043e2fa780-scripts\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.214260 master-0 kubenswrapper[33572]: I1204 22:35:41.214186 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ld86\" (UniqueName: \"kubernetes.io/projected/3321093f-53af-4bf5-86ea-0e043e2fa780-kube-api-access-5ld86\") pod \"ironic-7c5df64bd5-tsrwx\" (UID: \"3321093f-53af-4bf5-86ea-0e043e2fa780\") " pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.330990 master-0 kubenswrapper[33572]: I1204 22:35:41.330932 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:41.412512 master-0 kubenswrapper[33572]: I1204 22:35:41.412441 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ffde0b45-b45d-48a5-bbb7-beee4ebd7250\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e77e0d9-de88-43fb-9d41-8485f8694849\") pod \"ironic-conductor-0\" (UID: \"5c2c1216-db02-4bb8-856c-a56a68e06d23\") " pod="openstack/ironic-conductor-0" Dec 04 22:35:41.426300 master-0 kubenswrapper[33572]: I1204 22:35:41.425963 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:41.662863 master-0 kubenswrapper[33572]: I1204 22:35:41.662804 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:41.669027 master-0 kubenswrapper[33572]: I1204 22:35:41.666875 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:41.680751 master-0 kubenswrapper[33572]: I1204 22:35:41.680708 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:41.787741 master-0 kubenswrapper[33572]: I1204 22:35:41.785102 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Dec 04 22:35:41.864150 master-0 kubenswrapper[33572]: I1204 22:35:41.864087 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:41.938755 master-0 kubenswrapper[33572]: I1204 22:35:41.938686 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:41.962729 master-0 kubenswrapper[33572]: I1204 22:35:41.961040 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:42.039399 master-0 kubenswrapper[33572]: I1204 22:35:42.038221 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rchw2\" (UniqueName: \"kubernetes.io/projected/cee3b31f-b225-485b-9675-70162abf39f2-kube-api-access-rchw2\") pod \"cee3b31f-b225-485b-9675-70162abf39f2\" (UID: \"cee3b31f-b225-485b-9675-70162abf39f2\") " Dec 04 22:35:42.039399 master-0 kubenswrapper[33572]: I1204 22:35:42.038339 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbdff964-9c49-48a4-a0f8-7044131a5627-operator-scripts\") pod \"fbdff964-9c49-48a4-a0f8-7044131a5627\" (UID: \"fbdff964-9c49-48a4-a0f8-7044131a5627\") " Dec 04 22:35:42.039399 master-0 kubenswrapper[33572]: I1204 22:35:42.039052 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee3b31f-b225-485b-9675-70162abf39f2-operator-scripts\") pod \"cee3b31f-b225-485b-9675-70162abf39f2\" (UID: \"cee3b31f-b225-485b-9675-70162abf39f2\") " Dec 04 22:35:42.039399 master-0 kubenswrapper[33572]: I1204 22:35:42.039359 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbdff964-9c49-48a4-a0f8-7044131a5627-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbdff964-9c49-48a4-a0f8-7044131a5627" (UID: "fbdff964-9c49-48a4-a0f8-7044131a5627"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:42.040318 master-0 kubenswrapper[33572]: I1204 22:35:42.039412 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvgrz\" (UniqueName: \"kubernetes.io/projected/fbdff964-9c49-48a4-a0f8-7044131a5627-kube-api-access-cvgrz\") pod \"fbdff964-9c49-48a4-a0f8-7044131a5627\" (UID: \"fbdff964-9c49-48a4-a0f8-7044131a5627\") " Dec 04 22:35:42.040318 master-0 kubenswrapper[33572]: I1204 22:35:42.039582 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:42.040318 master-0 kubenswrapper[33572]: I1204 22:35:42.040185 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbdff964-9c49-48a4-a0f8-7044131a5627-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:42.040448 master-0 kubenswrapper[33572]: I1204 22:35:42.040392 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee3b31f-b225-485b-9675-70162abf39f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cee3b31f-b225-485b-9675-70162abf39f2" (UID: "cee3b31f-b225-485b-9675-70162abf39f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:42.042996 master-0 kubenswrapper[33572]: I1204 22:35:42.042885 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbdff964-9c49-48a4-a0f8-7044131a5627-kube-api-access-cvgrz" (OuterVolumeSpecName: "kube-api-access-cvgrz") pod "fbdff964-9c49-48a4-a0f8-7044131a5627" (UID: "fbdff964-9c49-48a4-a0f8-7044131a5627"). InnerVolumeSpecName "kube-api-access-cvgrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:42.078943 master-0 kubenswrapper[33572]: I1204 22:35:42.078698 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee3b31f-b225-485b-9675-70162abf39f2-kube-api-access-rchw2" (OuterVolumeSpecName: "kube-api-access-rchw2") pod "cee3b31f-b225-485b-9675-70162abf39f2" (UID: "cee3b31f-b225-485b-9675-70162abf39f2"). InnerVolumeSpecName "kube-api-access-rchw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:42.127664 master-0 kubenswrapper[33572]: I1204 22:35:42.124409 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" event={"ID":"cee3b31f-b225-485b-9675-70162abf39f2","Type":"ContainerDied","Data":"f1186a27212de11e82680c1c44eb4b9d83a77027467500f86c41a81e62abbb45"} Dec 04 22:35:42.127664 master-0 kubenswrapper[33572]: I1204 22:35:42.125845 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1186a27212de11e82680c1c44eb4b9d83a77027467500f86c41a81e62abbb45" Dec 04 22:35:42.127664 master-0 kubenswrapper[33572]: I1204 22:35:42.125933 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-ab22-account-create-update-pqwpd" Dec 04 22:35:42.131043 master-0 kubenswrapper[33572]: I1204 22:35:42.130536 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-vp6mx" Dec 04 22:35:42.133681 master-0 kubenswrapper[33572]: I1204 22:35:42.131403 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerName="cinder-volume" containerID="cri-o://f7a0b68706cdf9914a2144da72bee96e37da7f9db0690b4ae7ed704be6a990a6" gracePeriod=30 Dec 04 22:35:42.133681 master-0 kubenswrapper[33572]: I1204 22:35:42.131513 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-vp6mx" event={"ID":"fbdff964-9c49-48a4-a0f8-7044131a5627","Type":"ContainerDied","Data":"30fb9f886428e5842fa0a8e865c32b69bc10c0dd0e08397dbf5fcaf45769a46e"} Dec 04 22:35:42.133681 master-0 kubenswrapper[33572]: I1204 22:35:42.131535 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30fb9f886428e5842fa0a8e865c32b69bc10c0dd0e08397dbf5fcaf45769a46e" Dec 04 22:35:42.133681 master-0 kubenswrapper[33572]: I1204 22:35:42.131575 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerName="probe" containerID="cri-o://54267db9176e2e2b9efe2e5bacef0739627419b64734e20c1716d61af6863aea" gracePeriod=30 Dec 04 22:35:42.147060 master-0 kubenswrapper[33572]: I1204 22:35:42.143566 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cee3b31f-b225-485b-9675-70162abf39f2-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:42.147060 master-0 kubenswrapper[33572]: I1204 22:35:42.143596 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvgrz\" (UniqueName: \"kubernetes.io/projected/fbdff964-9c49-48a4-a0f8-7044131a5627-kube-api-access-cvgrz\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:42.147060 master-0 kubenswrapper[33572]: I1204 22:35:42.143609 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rchw2\" (UniqueName: \"kubernetes.io/projected/cee3b31f-b225-485b-9675-70162abf39f2-kube-api-access-rchw2\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:42.270686 master-0 kubenswrapper[33572]: I1204 22:35:42.270635 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:42.314731 master-0 kubenswrapper[33572]: I1204 22:35:42.314676 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:42.669768 master-0 kubenswrapper[33572]: I1204 22:35:42.669709 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7c5df64bd5-tsrwx"] Dec 04 22:35:42.853555 master-0 kubenswrapper[33572]: I1204 22:35:42.851156 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Dec 04 22:35:43.172642 master-0 kubenswrapper[33572]: I1204 22:35:43.172556 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"b4568378-b046-4ee7-8de7-5d5c1f31751f","Type":"ContainerStarted","Data":"085c69c6f33efddf965f25ce66bd503b018ae500453caf9edf2f2e2c77a49ec1"} Dec 04 22:35:43.180284 master-0 kubenswrapper[33572]: I1204 22:35:43.180233 33572 generic.go:334] "Generic (PLEG): container finished" podID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerID="54267db9176e2e2b9efe2e5bacef0739627419b64734e20c1716d61af6863aea" exitCode=0 
Dec 04 22:35:43.180284 master-0 kubenswrapper[33572]: I1204 22:35:43.180283 33572 generic.go:334] "Generic (PLEG): container finished" podID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerID="f7a0b68706cdf9914a2144da72bee96e37da7f9db0690b4ae7ed704be6a990a6" exitCode=0 Dec 04 22:35:43.180442 master-0 kubenswrapper[33572]: I1204 22:35:43.180369 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"24240a78-0969-401e-b1e8-0b3f01a4a434","Type":"ContainerDied","Data":"54267db9176e2e2b9efe2e5bacef0739627419b64734e20c1716d61af6863aea"} Dec 04 22:35:43.180442 master-0 kubenswrapper[33572]: I1204 22:35:43.180400 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"24240a78-0969-401e-b1e8-0b3f01a4a434","Type":"ContainerDied","Data":"f7a0b68706cdf9914a2144da72bee96e37da7f9db0690b4ae7ed704be6a990a6"} Dec 04 22:35:43.195234 master-0 kubenswrapper[33572]: I1204 22:35:43.195152 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74d58d9b69-7cpwt" event={"ID":"578da66e-114f-4fb7-ace1-ae07e428e2b3","Type":"ContainerStarted","Data":"c9b096a01d794fd74f6780bae723f9e6dfb1448705ba359343472831a2dbc15c"} Dec 04 22:35:43.228549 master-0 kubenswrapper[33572]: I1204 22:35:43.222792 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerStarted","Data":"9754bd5c0702728bdf15a225abd819885f81bb58a7ff2ddd3811d9cc01bc34ec"} Dec 04 22:35:43.272755 master-0 kubenswrapper[33572]: I1204 22:35:43.256996 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7c5df64bd5-tsrwx" event={"ID":"3321093f-53af-4bf5-86ea-0e043e2fa780","Type":"ContainerStarted","Data":"ae9261770207110cb3f0c77d57b862c0b5877117ce8826bb03023af462bd199b"} Dec 04 22:35:43.272755 master-0 kubenswrapper[33572]: I1204 22:35:43.257077 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7c5df64bd5-tsrwx" event={"ID":"3321093f-53af-4bf5-86ea-0e043e2fa780","Type":"ContainerStarted","Data":"1e56303399bb057238870668dbc1df80052fdecbb14b6e2546f70e23a4615c08"} Dec 04 22:35:43.272755 master-0 kubenswrapper[33572]: I1204 22:35:43.260788 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7675d-scheduler-0" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerName="cinder-scheduler" containerID="cri-o://a6dd2f1375da904eb79da1bb286b68ab548b33022bfc148b22344405ada74821" gracePeriod=30 Dec 04 22:35:43.272755 master-0 kubenswrapper[33572]: I1204 22:35:43.261625 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" event={"ID":"c0304754-88f4-4abd-b4ff-2daf5fea19a8","Type":"ContainerStarted","Data":"ea84d9325ccae62067349c56a067da72d61a92a04989a57fa9524cf6713056b9"} Dec 04 22:35:43.272755 master-0 kubenswrapper[33572]: I1204 22:35:43.261652 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:43.272755 master-0 kubenswrapper[33572]: I1204 22:35:43.261795 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7675d-backup-0" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerName="cinder-backup" containerID="cri-o://fc6ac157ac0f29d0182f071c3f4775d561f62b707d1d469af37859d59a80e715" gracePeriod=30 Dec 04 22:35:43.272755 master-0 kubenswrapper[33572]: 
I1204 22:35:43.261864 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7675d-scheduler-0" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerName="probe" containerID="cri-o://9db96b7edbfea58983408c8a371129c05dc5daab9d42ca5bc3644a4d5f8338e6" gracePeriod=30 Dec 04 22:35:43.272755 master-0 kubenswrapper[33572]: I1204 22:35:43.261918 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-7675d-backup-0" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerName="probe" containerID="cri-o://2c692068d8f5370502a824c3902aba7b7a0386e5e84e94b48e49433c81d01e23" gracePeriod=30 Dec 04 22:35:43.369068 master-0 kubenswrapper[33572]: I1204 22:35:43.368833 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" podStartSLOduration=3.049780492 podStartE2EDuration="6.368806376s" podCreationTimestamp="2025-12-04 22:35:37 +0000 UTC" firstStartedPulling="2025-12-04 22:35:38.642646831 +0000 UTC m=+1002.370172480" lastFinishedPulling="2025-12-04 22:35:41.961672715 +0000 UTC m=+1005.689198364" observedRunningTime="2025-12-04 22:35:43.319993724 +0000 UTC m=+1007.047519373" watchObservedRunningTime="2025-12-04 22:35:43.368806376 +0000 UTC m=+1007.096332025" Dec 04 22:35:44.300363 master-0 kubenswrapper[33572]: I1204 22:35:44.300137 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerStarted","Data":"4eab8a34a7567c47d250b480269a3bb9b6f11c48833d692a9d3ec6b01d2af4d3"} Dec 04 22:35:44.305745 master-0 kubenswrapper[33572]: I1204 22:35:44.305689 33572 generic.go:334] "Generic (PLEG): container finished" podID="3321093f-53af-4bf5-86ea-0e043e2fa780" containerID="ae9261770207110cb3f0c77d57b862c0b5877117ce8826bb03023af462bd199b" exitCode=0 Dec 04 22:35:44.305818 master-0 kubenswrapper[33572]: I1204 22:35:44.305757 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7c5df64bd5-tsrwx" event={"ID":"3321093f-53af-4bf5-86ea-0e043e2fa780","Type":"ContainerDied","Data":"ae9261770207110cb3f0c77d57b862c0b5877117ce8826bb03023af462bd199b"} Dec 04 22:35:44.312865 master-0 kubenswrapper[33572]: I1204 22:35:44.312811 33572 generic.go:334] "Generic (PLEG): container finished" podID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerID="2c692068d8f5370502a824c3902aba7b7a0386e5e84e94b48e49433c81d01e23" exitCode=0 Dec 04 22:35:44.312926 master-0 kubenswrapper[33572]: I1204 22:35:44.312870 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"df4511d6-1b75-449e-b5e2-754c93cca72c","Type":"ContainerDied","Data":"2c692068d8f5370502a824c3902aba7b7a0386e5e84e94b48e49433c81d01e23"} Dec 04 22:35:44.313635 master-0 kubenswrapper[33572]: I1204 22:35:44.313590 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:44.319588 master-0 kubenswrapper[33572]: I1204 22:35:44.319520 33572 generic.go:334] "Generic (PLEG): container finished" podID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerID="9db96b7edbfea58983408c8a371129c05dc5daab9d42ca5bc3644a4d5f8338e6" exitCode=0 Dec 04 22:35:44.319588 master-0 kubenswrapper[33572]: I1204 22:35:44.319558 33572 generic.go:334] "Generic (PLEG): container finished" podID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerID="a6dd2f1375da904eb79da1bb286b68ab548b33022bfc148b22344405ada74821" exitCode=0 Dec 04 22:35:44.319693 master-0 kubenswrapper[33572]: I1204 22:35:44.319613 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"6dfe0d20-6730-407c-b131-00b753ab28a3","Type":"ContainerDied","Data":"9db96b7edbfea58983408c8a371129c05dc5daab9d42ca5bc3644a4d5f8338e6"} Dec 04 22:35:44.319693 master-0 kubenswrapper[33572]: I1204 22:35:44.319640 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"6dfe0d20-6730-407c-b131-00b753ab28a3","Type":"ContainerDied","Data":"a6dd2f1375da904eb79da1bb286b68ab548b33022bfc148b22344405ada74821"} Dec 04 22:35:44.332983 master-0 kubenswrapper[33572]: I1204 22:35:44.332916 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"24240a78-0969-401e-b1e8-0b3f01a4a434","Type":"ContainerDied","Data":"af63b680cdae9c435bd3cd2385d04a8ede90280990befa69a548f5e84b2bdc41"} Dec 04 22:35:44.332983 master-0 kubenswrapper[33572]: I1204 22:35:44.332971 33572 scope.go:117] "RemoveContainer" containerID="54267db9176e2e2b9efe2e5bacef0739627419b64734e20c1716d61af6863aea" Dec 04 22:35:44.333230 master-0 kubenswrapper[33572]: I1204 22:35:44.333097 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:44.337306 master-0 kubenswrapper[33572]: I1204 22:35:44.336948 33572 generic.go:334] "Generic (PLEG): container finished" podID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerID="c9b096a01d794fd74f6780bae723f9e6dfb1448705ba359343472831a2dbc15c" exitCode=0 Dec 04 22:35:44.338693 master-0 kubenswrapper[33572]: I1204 22:35:44.338663 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74d58d9b69-7cpwt" event={"ID":"578da66e-114f-4fb7-ace1-ae07e428e2b3","Type":"ContainerDied","Data":"c9b096a01d794fd74f6780bae723f9e6dfb1448705ba359343472831a2dbc15c"} Dec 04 22:35:44.416412 master-0 kubenswrapper[33572]: I1204 22:35:44.416354 33572 scope.go:117] "RemoveContainer" containerID="f7a0b68706cdf9914a2144da72bee96e37da7f9db0690b4ae7ed704be6a990a6" Dec 04 22:35:44.429016 master-0 kubenswrapper[33572]: I1204 22:35:44.428960 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-cinder\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429229 master-0 kubenswrapper[33572]: I1204 22:35:44.429064 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-iscsi\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429229 master-0 kubenswrapper[33572]: I1204 22:35:44.429103 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-lib-cinder\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429229 master-0 kubenswrapper[33572]: I1204 22:35:44.429184 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-dev\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429229 master-0 kubenswrapper[33572]: I1204 22:35:44.429206 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-run\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429357 master-0 kubenswrapper[33572]: I1204 22:35:44.429248 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-scripts\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429357 master-0 kubenswrapper[33572]: I1204 22:35:44.429289 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-sys\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429357 master-0 kubenswrapper[33572]: I1204 22:35:44.429318 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-lib-modules\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429452 master-0 kubenswrapper[33572]: I1204 22:35:44.429390 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-combined-ca-bundle\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429452 master-0 kubenswrapper[33572]: I1204 22:35:44.429420 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-brick\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429548 master-0 kubenswrapper[33572]: I1204 22:35:44.429451 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429584 master-0 kubenswrapper[33572]: I1204 22:35:44.429556 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data-custom\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429584 master-0 kubenswrapper[33572]: I1204 22:35:44.429578 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwlr4\" (UniqueName: \"kubernetes.io/projected/24240a78-0969-401e-b1e8-0b3f01a4a434-kube-api-access-fwlr4\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429645 master-0 kubenswrapper[33572]: I1204 22:35:44.429597 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-machine-id\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.429645 master-0 kubenswrapper[33572]: I1204 22:35:44.429613 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-nvme\") pod \"24240a78-0969-401e-b1e8-0b3f01a4a434\" (UID: \"24240a78-0969-401e-b1e8-0b3f01a4a434\") " Dec 04 22:35:44.430716 master-0 kubenswrapper[33572]: I1204 22:35:44.430538 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.431108 master-0 kubenswrapper[33572]: I1204 22:35:44.430991 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.431108 master-0 kubenswrapper[33572]: I1204 22:35:44.431021 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.431108 master-0 kubenswrapper[33572]: I1204 22:35:44.431041 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.431108 master-0 kubenswrapper[33572]: I1204 22:35:44.431060 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-dev" (OuterVolumeSpecName: "dev") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.431108 master-0 kubenswrapper[33572]: I1204 22:35:44.431082 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-run" (OuterVolumeSpecName: "run") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.431380 master-0 kubenswrapper[33572]: I1204 22:35:44.431341 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.433340 master-0 kubenswrapper[33572]: I1204 22:35:44.433296 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.433340 master-0 kubenswrapper[33572]: I1204 22:35:44.433328 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-sys" (OuterVolumeSpecName: "sys") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.433438 master-0 kubenswrapper[33572]: I1204 22:35:44.433353 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:44.436374 master-0 kubenswrapper[33572]: I1204 22:35:44.436315 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-scripts" (OuterVolumeSpecName: "scripts") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:44.466869 master-0 kubenswrapper[33572]: I1204 22:35:44.466805 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24240a78-0969-401e-b1e8-0b3f01a4a434-kube-api-access-fwlr4" (OuterVolumeSpecName: "kube-api-access-fwlr4") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "kube-api-access-fwlr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:44.468132 master-0 kubenswrapper[33572]: I1204 22:35:44.468062 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:44.533050 master-0 kubenswrapper[33572]: I1204 22:35:44.532996 33572 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533050 master-0 kubenswrapper[33572]: I1204 22:35:44.533037 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwlr4\" (UniqueName: \"kubernetes.io/projected/24240a78-0969-401e-b1e8-0b3f01a4a434-kube-api-access-fwlr4\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533050 master-0 kubenswrapper[33572]: I1204 22:35:44.533052 33572 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533050 master-0 kubenswrapper[33572]: I1204 22:35:44.533064 33572 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533075 33572 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-nvme\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533086 33572 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533096 33572 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533106 33572 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533115 33572 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-dev\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533125 33572 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-run\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533134 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533142 33572 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-sys\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.533361 master-0 kubenswrapper[33572]: I1204 22:35:44.533151 33572 reconciler_common.go:293] "Volume detached for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24240a78-0969-401e-b1e8-0b3f01a4a434-lib-modules\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.713917 master-0 kubenswrapper[33572]: I1204 22:35:44.712868 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:44.747554 master-0 kubenswrapper[33572]: I1204 22:35:44.744092 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:44.857131 master-0 kubenswrapper[33572]: I1204 22:35:44.857016 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data" (OuterVolumeSpecName: "config-data") pod "24240a78-0969-401e-b1e8-0b3f01a4a434" (UID: "24240a78-0969-401e-b1e8-0b3f01a4a434"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:44.898327 master-0 kubenswrapper[33572]: I1204 22:35:44.898278 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:45.004681 master-0 kubenswrapper[33572]: I1204 22:35:45.004554 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24240a78-0969-401e-b1e8-0b3f01a4a434-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:45.083604 master-0 kubenswrapper[33572]: I1204 22:35:45.072384 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:45.104085 master-0 kubenswrapper[33572]: I1204 22:35:45.103938 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:45.108516 master-0 kubenswrapper[33572]: I1204 22:35:45.107586 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-scripts\") pod \"6dfe0d20-6730-407c-b131-00b753ab28a3\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " Dec 04 22:35:45.108516 master-0 kubenswrapper[33572]: I1204 22:35:45.107673 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dfe0d20-6730-407c-b131-00b753ab28a3-etc-machine-id\") pod \"6dfe0d20-6730-407c-b131-00b753ab28a3\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " Dec 04 22:35:45.108516 master-0 kubenswrapper[33572]: I1204 22:35:45.107721 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data\") pod \"6dfe0d20-6730-407c-b131-00b753ab28a3\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " Dec 04 22:35:45.108516 master-0 kubenswrapper[33572]: I1204 22:35:45.107778 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-combined-ca-bundle\") pod 
\"6dfe0d20-6730-407c-b131-00b753ab28a3\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " Dec 04 22:35:45.108516 master-0 kubenswrapper[33572]: I1204 22:35:45.107841 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data-custom\") pod \"6dfe0d20-6730-407c-b131-00b753ab28a3\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " Dec 04 22:35:45.108516 master-0 kubenswrapper[33572]: I1204 22:35:45.107862 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kc7r\" (UniqueName: \"kubernetes.io/projected/6dfe0d20-6730-407c-b131-00b753ab28a3-kube-api-access-7kc7r\") pod \"6dfe0d20-6730-407c-b131-00b753ab28a3\" (UID: \"6dfe0d20-6730-407c-b131-00b753ab28a3\") " Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: I1204 22:35:45.115759 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: E1204 22:35:45.116222 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerName="cinder-volume" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: I1204 22:35:45.116238 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerName="cinder-volume" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: E1204 22:35:45.116245 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerName="probe" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: I1204 22:35:45.116252 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerName="probe" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: E1204 22:35:45.116267 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerName="probe" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: I1204 22:35:45.116273 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerName="probe" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: E1204 22:35:45.116282 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerName="cinder-scheduler" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: I1204 22:35:45.116288 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerName="cinder-scheduler" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: E1204 22:35:45.116341 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbdff964-9c49-48a4-a0f8-7044131a5627" containerName="mariadb-database-create" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: I1204 22:35:45.116347 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbdff964-9c49-48a4-a0f8-7044131a5627" containerName="mariadb-database-create" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: E1204 22:35:45.116359 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee3b31f-b225-485b-9675-70162abf39f2" containerName="mariadb-account-create-update" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: I1204 22:35:45.116365 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee3b31f-b225-485b-9675-70162abf39f2" 
containerName="mariadb-account-create-update" Dec 04 22:35:45.116565 master-0 kubenswrapper[33572]: I1204 22:35:45.116589 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerName="cinder-scheduler" Dec 04 22:35:45.117174 master-0 kubenswrapper[33572]: I1204 22:35:45.116608 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerName="cinder-volume" Dec 04 22:35:45.117174 master-0 kubenswrapper[33572]: I1204 22:35:45.116647 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" containerName="probe" Dec 04 22:35:45.117174 master-0 kubenswrapper[33572]: I1204 22:35:45.116657 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee3b31f-b225-485b-9675-70162abf39f2" containerName="mariadb-account-create-update" Dec 04 22:35:45.117174 master-0 kubenswrapper[33572]: I1204 22:35:45.116674 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbdff964-9c49-48a4-a0f8-7044131a5627" containerName="mariadb-database-create" Dec 04 22:35:45.117174 master-0 kubenswrapper[33572]: I1204 22:35:45.116683 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" containerName="probe" Dec 04 22:35:45.118953 master-0 kubenswrapper[33572]: I1204 22:35:45.118062 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.120967 master-0 kubenswrapper[33572]: I1204 22:35:45.120906 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-volume-lvm-iscsi-config-data" Dec 04 22:35:45.124038 master-0 kubenswrapper[33572]: I1204 22:35:45.123999 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dfe0d20-6730-407c-b131-00b753ab28a3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6dfe0d20-6730-407c-b131-00b753ab28a3" (UID: "6dfe0d20-6730-407c-b131-00b753ab28a3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.136596 master-0 kubenswrapper[33572]: I1204 22:35:45.136538 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:45.223914 master-0 kubenswrapper[33572]: I1204 22:35:45.223385 33572 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6dfe0d20-6730-407c-b131-00b753ab28a3-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:45.247861 master-0 kubenswrapper[33572]: I1204 22:35:45.247787 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-scripts" (OuterVolumeSpecName: "scripts") pod "6dfe0d20-6730-407c-b131-00b753ab28a3" (UID: "6dfe0d20-6730-407c-b131-00b753ab28a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:45.281531 master-0 kubenswrapper[33572]: I1204 22:35:45.277983 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfe0d20-6730-407c-b131-00b753ab28a3-kube-api-access-7kc7r" (OuterVolumeSpecName: "kube-api-access-7kc7r") pod "6dfe0d20-6730-407c-b131-00b753ab28a3" (UID: "6dfe0d20-6730-407c-b131-00b753ab28a3"). InnerVolumeSpecName "kube-api-access-7kc7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:45.328464 master-0 kubenswrapper[33572]: I1204 22:35:45.328396 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6dfe0d20-6730-407c-b131-00b753ab28a3" (UID: "6dfe0d20-6730-407c-b131-00b753ab28a3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:45.330324 master-0 kubenswrapper[33572]: I1204 22:35:45.330217 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-iscsi\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330324 master-0 kubenswrapper[33572]: I1204 22:35:45.330283 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-locks-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330417 master-0 kubenswrapper[33572]: I1204 22:35:45.330343 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-lib-modules\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330417 master-0 kubenswrapper[33572]: I1204 22:35:45.330368 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-run\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330417 master-0 kubenswrapper[33572]: I1204 22:35:45.330395 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-nvme\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330417 master-0 kubenswrapper[33572]: I1204 22:35:45.330412 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-locks-brick\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330552 master-0 kubenswrapper[33572]: I1204 22:35:45.330430 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-combined-ca-bundle\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330552 master-0 kubenswrapper[33572]: I1204 22:35:45.330488 
33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-sys\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330620 master-0 kubenswrapper[33572]: I1204 22:35:45.330551 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-config-data-custom\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330620 master-0 kubenswrapper[33572]: I1204 22:35:45.330573 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-scripts\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330620 master-0 kubenswrapper[33572]: I1204 22:35:45.330615 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-lib-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330705 master-0 kubenswrapper[33572]: I1204 22:35:45.330635 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-config-data\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330705 master-0 kubenswrapper[33572]: I1204 22:35:45.330672 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js45x\" (UniqueName: \"kubernetes.io/projected/9f407128-b34c-49b3-819e-fdfd27a68d66-kube-api-access-js45x\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330705 master-0 kubenswrapper[33572]: I1204 22:35:45.330692 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-dev\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330797 master-0 kubenswrapper[33572]: I1204 22:35:45.330722 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-machine-id\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.330861 master-0 kubenswrapper[33572]: I1204 22:35:45.330833 33572 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data-custom\") on 
node \"master-0\" DevicePath \"\"" Dec 04 22:35:45.330861 master-0 kubenswrapper[33572]: I1204 22:35:45.330857 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kc7r\" (UniqueName: \"kubernetes.io/projected/6dfe0d20-6730-407c-b131-00b753ab28a3-kube-api-access-7kc7r\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:45.330937 master-0 kubenswrapper[33572]: I1204 22:35:45.330869 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:45.395455 master-0 kubenswrapper[33572]: I1204 22:35:45.395326 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-api-0" event={"ID":"b4568378-b046-4ee7-8de7-5d5c1f31751f","Type":"ContainerStarted","Data":"5bc1f0821928f0618f79511703e828b999f7d343dcdac30d6710a4eb2e70bc0d"} Dec 04 22:35:45.398948 master-0 kubenswrapper[33572]: I1204 22:35:45.398912 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:45.410172 master-0 kubenswrapper[33572]: I1204 22:35:45.410125 33572 generic.go:334] "Generic (PLEG): container finished" podID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerID="fc6ac157ac0f29d0182f071c3f4775d561f62b707d1d469af37859d59a80e715" exitCode=0 Dec 04 22:35:45.410345 master-0 kubenswrapper[33572]: I1204 22:35:45.410189 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"df4511d6-1b75-449e-b5e2-754c93cca72c","Type":"ContainerDied","Data":"fc6ac157ac0f29d0182f071c3f4775d561f62b707d1d469af37859d59a80e715"} Dec 04 22:35:45.420301 master-0 kubenswrapper[33572]: I1204 22:35:45.420254 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:45.420508 master-0 kubenswrapper[33572]: I1204 22:35:45.420337 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"6dfe0d20-6730-407c-b131-00b753ab28a3","Type":"ContainerDied","Data":"e843d8746f35ed9c0f9aea42bede543cfb6e8c1092782d0b4c8c077f795a3135"} Dec 04 22:35:45.420508 master-0 kubenswrapper[33572]: I1204 22:35:45.420371 33572 scope.go:117] "RemoveContainer" containerID="9db96b7edbfea58983408c8a371129c05dc5daab9d42ca5bc3644a4d5f8338e6" Dec 04 22:35:45.420583 master-0 kubenswrapper[33572]: I1204 22:35:45.420525 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-api-0" podStartSLOduration=6.420493233 podStartE2EDuration="6.420493233s" podCreationTimestamp="2025-12-04 22:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:45.420118302 +0000 UTC m=+1009.147643961" watchObservedRunningTime="2025-12-04 22:35:45.420493233 +0000 UTC m=+1009.148018882" Dec 04 22:35:45.432605 master-0 kubenswrapper[33572]: I1204 22:35:45.432363 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-machine-id\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432718 master-0 kubenswrapper[33572]: I1204 22:35:45.432677 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-iscsi\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432718 master-0 kubenswrapper[33572]: I1204 22:35:45.432711 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-locks-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432797 master-0 kubenswrapper[33572]: I1204 22:35:45.432732 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-lib-modules\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432797 master-0 kubenswrapper[33572]: I1204 22:35:45.432756 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-run\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432797 master-0 kubenswrapper[33572]: I1204 22:35:45.432776 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-nvme\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " 
pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432889 master-0 kubenswrapper[33572]: I1204 22:35:45.432795 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-locks-brick\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432889 master-0 kubenswrapper[33572]: I1204 22:35:45.432812 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-combined-ca-bundle\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432889 master-0 kubenswrapper[33572]: I1204 22:35:45.432863 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-sys\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432980 master-0 kubenswrapper[33572]: I1204 22:35:45.432891 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-config-data-custom\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432980 master-0 kubenswrapper[33572]: I1204 22:35:45.432912 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-scripts\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432980 master-0 kubenswrapper[33572]: I1204 22:35:45.432950 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-lib-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.432980 master-0 kubenswrapper[33572]: I1204 22:35:45.432969 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-config-data\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433095 master-0 kubenswrapper[33572]: I1204 22:35:45.433003 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js45x\" (UniqueName: \"kubernetes.io/projected/9f407128-b34c-49b3-819e-fdfd27a68d66-kube-api-access-js45x\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433095 master-0 kubenswrapper[33572]: I1204 22:35:45.433020 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-dev\") pod 
\"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433198 master-0 kubenswrapper[33572]: I1204 22:35:45.433175 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-dev\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433234 master-0 kubenswrapper[33572]: I1204 22:35:45.433213 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-machine-id\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433264 master-0 kubenswrapper[33572]: I1204 22:35:45.433236 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-iscsi\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433298 master-0 kubenswrapper[33572]: I1204 22:35:45.433274 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-locks-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433298 master-0 kubenswrapper[33572]: I1204 22:35:45.433295 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-lib-modules\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433368 master-0 kubenswrapper[33572]: I1204 22:35:45.433316 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-run\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433368 master-0 kubenswrapper[33572]: I1204 22:35:45.433346 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-etc-nvme\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.433427 master-0 kubenswrapper[33572]: I1204 22:35:45.433376 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-locks-brick\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.439935 master-0 kubenswrapper[33572]: I1204 22:35:45.439891 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-sys\") pod 
\"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.441058 master-0 kubenswrapper[33572]: I1204 22:35:45.440662 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9f407128-b34c-49b3-819e-fdfd27a68d66-var-lib-cinder\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.445066 master-0 kubenswrapper[33572]: I1204 22:35:45.445033 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-combined-ca-bundle\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.467618 master-0 kubenswrapper[33572]: I1204 22:35:45.464911 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-config-data-custom\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.467618 master-0 kubenswrapper[33572]: I1204 22:35:45.465029 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-scripts\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.468408 master-0 kubenswrapper[33572]: I1204 22:35:45.468330 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js45x\" (UniqueName: \"kubernetes.io/projected/9f407128-b34c-49b3-819e-fdfd27a68d66-kube-api-access-js45x\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.470127 master-0 kubenswrapper[33572]: I1204 22:35:45.469417 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f407128-b34c-49b3-819e-fdfd27a68d66-config-data\") pod \"cinder-7675d-volume-lvm-iscsi-0\" (UID: \"9f407128-b34c-49b3-819e-fdfd27a68d66\") " pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.498601 master-0 kubenswrapper[33572]: I1204 22:35:45.498569 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:45.697950 master-0 kubenswrapper[33572]: I1204 22:35:45.697897 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dfe0d20-6730-407c-b131-00b753ab28a3" (UID: "6dfe0d20-6730-407c-b131-00b753ab28a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:45.743929 master-0 kubenswrapper[33572]: I1204 22:35:45.743881 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:45.791380 master-0 kubenswrapper[33572]: I1204 22:35:45.791304 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data" (OuterVolumeSpecName: "config-data") pod "6dfe0d20-6730-407c-b131-00b753ab28a3" (UID: "6dfe0d20-6730-407c-b131-00b753ab28a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:45.846669 master-0 kubenswrapper[33572]: I1204 22:35:45.846622 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfe0d20-6730-407c-b131-00b753ab28a3-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:45.870405 master-0 kubenswrapper[33572]: I1204 22:35:45.870350 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:45.874098 master-0 kubenswrapper[33572]: I1204 22:35:45.874040 33572 scope.go:117] "RemoveContainer" containerID="a6dd2f1375da904eb79da1bb286b68ab548b33022bfc148b22344405ada74821" Dec 04 22:35:45.950156 master-0 kubenswrapper[33572]: I1204 22:35:45.950073 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-lib-modules\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950156 master-0 kubenswrapper[33572]: I1204 22:35:45.950145 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950238 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data-custom\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950263 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-nvme\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950311 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-sys\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950342 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-dev\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: 
\"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950388 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-iscsi\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950406 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-scripts\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950431 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-combined-ca-bundle\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950487 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdlnp\" (UniqueName: \"kubernetes.io/projected/df4511d6-1b75-449e-b5e2-754c93cca72c-kube-api-access-xdlnp\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950516 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-cinder\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950544 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-machine-id\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950584 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-brick\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950613 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-lib-cinder\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.950678 master-0 kubenswrapper[33572]: I1204 22:35:45.950664 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-run\") pod \"df4511d6-1b75-449e-b5e2-754c93cca72c\" (UID: \"df4511d6-1b75-449e-b5e2-754c93cca72c\") " Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951317 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-run" 
(OuterVolumeSpecName: "run") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951394 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951515 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951554 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-sys" (OuterVolumeSpecName: "sys") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951680 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951705 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-dev" (OuterVolumeSpecName: "dev") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951753 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951785 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951816 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.951834 master-0 kubenswrapper[33572]: I1204 22:35:45.951849 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 22:35:45.958046 master-0 kubenswrapper[33572]: I1204 22:35:45.957933 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-scripts" (OuterVolumeSpecName: "scripts") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:45.961230 master-0 kubenswrapper[33572]: I1204 22:35:45.959327 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4511d6-1b75-449e-b5e2-754c93cca72c-kube-api-access-xdlnp" (OuterVolumeSpecName: "kube-api-access-xdlnp") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "kube-api-access-xdlnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:45.985859 master-0 kubenswrapper[33572]: I1204 22:35:45.985785 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:46.051521 master-0 kubenswrapper[33572]: I1204 22:35:46.048064 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-volume-lvm-iscsi-0"] Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053585 33572 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053614 33572 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-nvme\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053624 33572 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-sys\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053632 33572 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-dev\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053641 33572 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053649 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053660 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdlnp\" (UniqueName: \"kubernetes.io/projected/df4511d6-1b75-449e-b5e2-754c93cca72c-kube-api-access-xdlnp\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053668 33572 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053678 33572 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053689 33572 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053697 33572 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053705 33572 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-run\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.053716 master-0 kubenswrapper[33572]: I1204 22:35:46.053712 33572 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df4511d6-1b75-449e-b5e2-754c93cca72c-lib-modules\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.100236 master-0 kubenswrapper[33572]: W1204 22:35:46.100164 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f407128_b34c_49b3_819e_fdfd27a68d66.slice/crio-0415aa5e9706e5586b4693e894288cd4513d987c5367f7f9cd2f8eb262151ae7 WatchSource:0}: Error finding container 0415aa5e9706e5586b4693e894288cd4513d987c5367f7f9cd2f8eb262151ae7: Status 404 returned error can't find the container with id 0415aa5e9706e5586b4693e894288cd4513d987c5367f7f9cd2f8eb262151ae7 Dec 04 22:35:46.140992 master-0 kubenswrapper[33572]: I1204 22:35:46.134142 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:46.187873 master-0 kubenswrapper[33572]: I1204 22:35:46.187791 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:46.198182 master-0 kubenswrapper[33572]: I1204 22:35:46.198122 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:46.209366 master-0 kubenswrapper[33572]: I1204 22:35:46.208973 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:46.210516 master-0 kubenswrapper[33572]: E1204 22:35:46.209705 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerName="cinder-backup" Dec 04 22:35:46.210516 master-0 kubenswrapper[33572]: I1204 22:35:46.209734 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerName="cinder-backup" Dec 04 22:35:46.210516 master-0 kubenswrapper[33572]: E1204 22:35:46.209766 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerName="probe" Dec 04 22:35:46.210516 master-0 kubenswrapper[33572]: I1204 22:35:46.209776 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerName="probe" Dec 04 22:35:46.210516 master-0 kubenswrapper[33572]: I1204 22:35:46.210165 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerName="probe" Dec 04 22:35:46.210516 master-0 kubenswrapper[33572]: I1204 22:35:46.210194 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" containerName="cinder-backup" Dec 04 22:35:46.211893 master-0 kubenswrapper[33572]: I1204 22:35:46.211867 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.213890 master-0 kubenswrapper[33572]: I1204 22:35:46.213838 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-scheduler-config-data" Dec 04 22:35:46.251649 master-0 kubenswrapper[33572]: I1204 22:35:46.249835 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:46.262254 master-0 kubenswrapper[33572]: I1204 22:35:46.262200 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.286695 master-0 kubenswrapper[33572]: I1204 22:35:46.286632 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data" (OuterVolumeSpecName: "config-data") pod "df4511d6-1b75-449e-b5e2-754c93cca72c" (UID: "df4511d6-1b75-449e-b5e2-754c93cca72c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:46.364568 master-0 kubenswrapper[33572]: I1204 22:35:46.364482 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-config-data-custom\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.385924 master-0 kubenswrapper[33572]: I1204 22:35:46.364618 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-scripts\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.385924 master-0 kubenswrapper[33572]: I1204 22:35:46.364698 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55f85bca-594d-4827-8c21-17b118a30f5a-etc-machine-id\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.385924 master-0 kubenswrapper[33572]: I1204 22:35:46.364811 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2v5n\" (UniqueName: \"kubernetes.io/projected/55f85bca-594d-4827-8c21-17b118a30f5a-kube-api-access-g2v5n\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.385924 master-0 kubenswrapper[33572]: I1204 22:35:46.364942 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-combined-ca-bundle\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.385924 master-0 kubenswrapper[33572]: I1204 22:35:46.365261 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-config-data\") pod 
\"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.385924 master-0 kubenswrapper[33572]: I1204 22:35:46.365483 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4511d6-1b75-449e-b5e2-754c93cca72c-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:46.452671 master-0 kubenswrapper[33572]: I1204 22:35:46.445877 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"df4511d6-1b75-449e-b5e2-754c93cca72c","Type":"ContainerDied","Data":"8d365fb344d9db2b6d850f6d3960f355c5b5128f5944546f2df8c539b16cf05c"} Dec 04 22:35:46.452671 master-0 kubenswrapper[33572]: I1204 22:35:46.445945 33572 scope.go:117] "RemoveContainer" containerID="2c692068d8f5370502a824c3902aba7b7a0386e5e84e94b48e49433c81d01e23" Dec 04 22:35:46.452671 master-0 kubenswrapper[33572]: I1204 22:35:46.446141 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.459696 master-0 kubenswrapper[33572]: I1204 22:35:46.457919 33572 generic.go:334] "Generic (PLEG): container finished" podID="c0304754-88f4-4abd-b4ff-2daf5fea19a8" containerID="ea84d9325ccae62067349c56a067da72d61a92a04989a57fa9524cf6713056b9" exitCode=1 Dec 04 22:35:46.459696 master-0 kubenswrapper[33572]: I1204 22:35:46.457996 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" event={"ID":"c0304754-88f4-4abd-b4ff-2daf5fea19a8","Type":"ContainerDied","Data":"ea84d9325ccae62067349c56a067da72d61a92a04989a57fa9524cf6713056b9"} Dec 04 22:35:46.459696 master-0 kubenswrapper[33572]: I1204 22:35:46.458859 33572 scope.go:117] "RemoveContainer" containerID="ea84d9325ccae62067349c56a067da72d61a92a04989a57fa9524cf6713056b9" Dec 04 22:35:46.464177 master-0 kubenswrapper[33572]: I1204 22:35:46.464139 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"9f407128-b34c-49b3-819e-fdfd27a68d66","Type":"ContainerStarted","Data":"0415aa5e9706e5586b4693e894288cd4513d987c5367f7f9cd2f8eb262151ae7"} Dec 04 22:35:46.467010 master-0 kubenswrapper[33572]: I1204 22:35:46.466967 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-config-data\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.467159 master-0 kubenswrapper[33572]: I1204 22:35:46.467045 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-config-data-custom\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.467159 master-0 kubenswrapper[33572]: I1204 22:35:46.467123 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-scripts\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.467245 master-0 kubenswrapper[33572]: I1204 22:35:46.467185 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55f85bca-594d-4827-8c21-17b118a30f5a-etc-machine-id\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.467245 master-0 kubenswrapper[33572]: I1204 22:35:46.467213 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2v5n\" (UniqueName: \"kubernetes.io/projected/55f85bca-594d-4827-8c21-17b118a30f5a-kube-api-access-g2v5n\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.467317 master-0 kubenswrapper[33572]: I1204 22:35:46.467252 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-combined-ca-bundle\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.469014 master-0 kubenswrapper[33572]: I1204 22:35:46.468958 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/55f85bca-594d-4827-8c21-17b118a30f5a-etc-machine-id\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.481592 master-0 kubenswrapper[33572]: I1204 22:35:46.481483 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-combined-ca-bundle\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.481805 master-0 kubenswrapper[33572]: I1204 22:35:46.481758 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-config-data\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.481850 master-0 kubenswrapper[33572]: I1204 22:35:46.481810 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-scripts\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.491935 master-0 kubenswrapper[33572]: I1204 22:35:46.491872 33572 generic.go:334] "Generic (PLEG): container finished" podID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerID="26ba72a428ca1baab534c1e42f8db19c6174598ac6a82d2130e70915acc4b69c" exitCode=1 Dec 04 22:35:46.492237 master-0 kubenswrapper[33572]: I1204 22:35:46.491990 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74d58d9b69-7cpwt" event={"ID":"578da66e-114f-4fb7-ace1-ae07e428e2b3","Type":"ContainerDied","Data":"26ba72a428ca1baab534c1e42f8db19c6174598ac6a82d2130e70915acc4b69c"} Dec 04 22:35:46.492237 master-0 kubenswrapper[33572]: I1204 22:35:46.492035 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74d58d9b69-7cpwt" event={"ID":"578da66e-114f-4fb7-ace1-ae07e428e2b3","Type":"ContainerStarted","Data":"e42a0982a4cc4302175c24de74c027af1c769f4f0d9bedffecfa10409b502bb1"} Dec 04 22:35:46.492905 master-0 kubenswrapper[33572]: I1204 
22:35:46.492879 33572 scope.go:117] "RemoveContainer" containerID="26ba72a428ca1baab534c1e42f8db19c6174598ac6a82d2130e70915acc4b69c" Dec 04 22:35:46.493312 master-0 kubenswrapper[33572]: I1204 22:35:46.493265 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55f85bca-594d-4827-8c21-17b118a30f5a-config-data-custom\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.497899 master-0 kubenswrapper[33572]: I1204 22:35:46.495778 33572 scope.go:117] "RemoveContainer" containerID="fc6ac157ac0f29d0182f071c3f4775d561f62b707d1d469af37859d59a80e715" Dec 04 22:35:46.518237 master-0 kubenswrapper[33572]: I1204 22:35:46.518176 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7c5df64bd5-tsrwx" event={"ID":"3321093f-53af-4bf5-86ea-0e043e2fa780","Type":"ContainerStarted","Data":"fcb271503da697c6bacd0e42d513c81af3fda55ea45a474c8405f81327bd63fe"} Dec 04 22:35:46.518237 master-0 kubenswrapper[33572]: I1204 22:35:46.518238 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7c5df64bd5-tsrwx" event={"ID":"3321093f-53af-4bf5-86ea-0e043e2fa780","Type":"ContainerStarted","Data":"629adade5cb729c167896d87d90affed5f4ffffdef95cba826b7a9f82d21d57a"} Dec 04 22:35:46.524480 master-0 kubenswrapper[33572]: I1204 22:35:46.524429 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2v5n\" (UniqueName: \"kubernetes.io/projected/55f85bca-594d-4827-8c21-17b118a30f5a-kube-api-access-g2v5n\") pod \"cinder-7675d-scheduler-0\" (UID: \"55f85bca-594d-4827-8c21-17b118a30f5a\") " pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.545479 master-0 kubenswrapper[33572]: I1204 22:35:46.545403 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24240a78-0969-401e-b1e8-0b3f01a4a434" path="/var/lib/kubelet/pods/24240a78-0969-401e-b1e8-0b3f01a4a434/volumes" Dec 04 22:35:46.552596 master-0 kubenswrapper[33572]: I1204 22:35:46.548054 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfe0d20-6730-407c-b131-00b753ab28a3" path="/var/lib/kubelet/pods/6dfe0d20-6730-407c-b131-00b753ab28a3/volumes" Dec 04 22:35:46.582676 master-0 kubenswrapper[33572]: I1204 22:35:46.581914 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:46.612431 master-0 kubenswrapper[33572]: I1204 22:35:46.612367 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:46.626009 master-0 kubenswrapper[33572]: I1204 22:35:46.625940 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-7c5df64bd5-tsrwx" podStartSLOduration=6.62591994 podStartE2EDuration="6.62591994s" podCreationTimestamp="2025-12-04 22:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:46.588108858 +0000 UTC m=+1010.315634517" watchObservedRunningTime="2025-12-04 22:35:46.62591994 +0000 UTC m=+1010.353445589" Dec 04 22:35:46.637842 master-0 kubenswrapper[33572]: I1204 22:35:46.637789 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:46.644168 master-0 kubenswrapper[33572]: I1204 22:35:46.644126 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.657568 master-0 kubenswrapper[33572]: I1204 22:35:46.653295 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-7675d-backup-config-data" Dec 04 22:35:46.657568 master-0 kubenswrapper[33572]: I1204 22:35:46.655057 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:46.673526 master-0 kubenswrapper[33572]: I1204 22:35:46.670027 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783208 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-locks-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783282 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-sys\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783309 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-machine-id\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783323 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-dev\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783349 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-lib-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783372 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-scripts\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783392 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-run\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783424 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-config-data\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783443 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-combined-ca-bundle\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783517 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-lib-modules\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783550 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-iscsi\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783577 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-nvme\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783616 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-config-data-custom\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783640 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-locks-brick\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.784526 master-0 kubenswrapper[33572]: I1204 22:35:46.783680 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8v6x\" (UniqueName: \"kubernetes.io/projected/4d1b2769-38d7-42b2-ac37-10e30cbc9629-kube-api-access-v8v6x\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885361 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-locks-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 
22:35:46.885430 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-sys\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885458 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-machine-id\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885473 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-dev\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885511 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-lib-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885540 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-scripts\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885566 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-run\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885609 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-config-data\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885631 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-combined-ca-bundle\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885674 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-lib-modules\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885717 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-iscsi\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885753 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-nvme\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885808 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-config-data-custom\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885843 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-locks-brick\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.886107 master-0 kubenswrapper[33572]: I1204 22:35:46.885897 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8v6x\" (UniqueName: \"kubernetes.io/projected/4d1b2769-38d7-42b2-ac37-10e30cbc9629-kube-api-access-v8v6x\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.887160 master-0 kubenswrapper[33572]: I1204 22:35:46.886387 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-locks-cinder\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.887160 master-0 kubenswrapper[33572]: I1204 22:35:46.886426 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-sys\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.887160 master-0 kubenswrapper[33572]: I1204 22:35:46.886447 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-machine-id\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.887160 master-0 kubenswrapper[33572]: I1204 22:35:46.886467 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-dev\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.887160 master-0 kubenswrapper[33572]: I1204 22:35:46.886520 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-lib-cinder\") pod \"cinder-7675d-backup-0\" (UID: 
\"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.887160 master-0 kubenswrapper[33572]: I1204 22:35:46.887024 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-lib-modules\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.887160 master-0 kubenswrapper[33572]: I1204 22:35:46.887127 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-run\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.890558 master-0 kubenswrapper[33572]: I1204 22:35:46.889567 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-nvme\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.890558 master-0 kubenswrapper[33572]: I1204 22:35:46.889658 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-etc-iscsi\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.890687 master-0 kubenswrapper[33572]: I1204 22:35:46.890648 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4d1b2769-38d7-42b2-ac37-10e30cbc9629-var-locks-brick\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.958621 master-0 kubenswrapper[33572]: I1204 22:35:46.944320 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-scripts\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.958621 master-0 kubenswrapper[33572]: I1204 22:35:46.945061 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8v6x\" (UniqueName: \"kubernetes.io/projected/4d1b2769-38d7-42b2-ac37-10e30cbc9629-kube-api-access-v8v6x\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.958621 master-0 kubenswrapper[33572]: I1204 22:35:46.947215 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-combined-ca-bundle\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.958621 master-0 kubenswrapper[33572]: I1204 22:35:46.951118 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-config-data-custom\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:46.967247 master-0 kubenswrapper[33572]: I1204 22:35:46.966643 33572 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d1b2769-38d7-42b2-ac37-10e30cbc9629-config-data\") pod \"cinder-7675d-backup-0\" (UID: \"4d1b2769-38d7-42b2-ac37-10e30cbc9629\") " pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:47.184709 master-0 kubenswrapper[33572]: I1204 22:35:47.184100 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:47.204969 master-0 kubenswrapper[33572]: I1204 22:35:47.200221 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-scheduler-0"] Dec 04 22:35:47.232972 master-0 kubenswrapper[33572]: W1204 22:35:47.232865 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55f85bca_594d_4827_8c21_17b118a30f5a.slice/crio-22f396c9d89a4c97dab7ea17806cc614de48e0551c411ba8d1a64b2093215fb0 WatchSource:0}: Error finding container 22f396c9d89a4c97dab7ea17806cc614de48e0551c411ba8d1a64b2093215fb0: Status 404 returned error can't find the container with id 22f396c9d89a4c97dab7ea17806cc614de48e0551c411ba8d1a64b2093215fb0 Dec 04 22:35:47.542902 master-0 kubenswrapper[33572]: I1204 22:35:47.542789 33572 generic.go:334] "Generic (PLEG): container finished" podID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerID="93d4b7a0c200a2938f16675d1e013cfae3c0590581f4f158558b160bf18d88bc" exitCode=1 Dec 04 22:35:47.542902 master-0 kubenswrapper[33572]: I1204 22:35:47.542844 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74d58d9b69-7cpwt" event={"ID":"578da66e-114f-4fb7-ace1-ae07e428e2b3","Type":"ContainerDied","Data":"93d4b7a0c200a2938f16675d1e013cfae3c0590581f4f158558b160bf18d88bc"} Dec 04 22:35:47.542902 master-0 kubenswrapper[33572]: I1204 22:35:47.542933 33572 scope.go:117] "RemoveContainer" containerID="26ba72a428ca1baab534c1e42f8db19c6174598ac6a82d2130e70915acc4b69c" Dec 04 22:35:47.545409 master-0 kubenswrapper[33572]: I1204 22:35:47.543634 33572 scope.go:117] "RemoveContainer" containerID="93d4b7a0c200a2938f16675d1e013cfae3c0590581f4f158558b160bf18d88bc" Dec 04 22:35:47.545409 master-0 kubenswrapper[33572]: E1204 22:35:47.543873 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-74d58d9b69-7cpwt_openstack(578da66e-114f-4fb7-ace1-ae07e428e2b3)\"" pod="openstack/ironic-74d58d9b69-7cpwt" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" Dec 04 22:35:47.547538 master-0 kubenswrapper[33572]: I1204 22:35:47.547491 33572 generic.go:334] "Generic (PLEG): container finished" podID="5c2c1216-db02-4bb8-856c-a56a68e06d23" containerID="4eab8a34a7567c47d250b480269a3bb9b6f11c48833d692a9d3ec6b01d2af4d3" exitCode=0 Dec 04 22:35:47.547623 master-0 kubenswrapper[33572]: I1204 22:35:47.547606 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerDied","Data":"4eab8a34a7567c47d250b480269a3bb9b6f11c48833d692a9d3ec6b01d2af4d3"} Dec 04 22:35:47.558772 master-0 kubenswrapper[33572]: I1204 22:35:47.557714 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" event={"ID":"c0304754-88f4-4abd-b4ff-2daf5fea19a8","Type":"ContainerStarted","Data":"c7f3123b297498e99e0be827847475a3b74290a3707c53aea234e7f37d6d177f"} Dec 04 
22:35:47.558772 master-0 kubenswrapper[33572]: I1204 22:35:47.557955 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:47.559695 master-0 kubenswrapper[33572]: I1204 22:35:47.559652 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"55f85bca-594d-4827-8c21-17b118a30f5a","Type":"ContainerStarted","Data":"22f396c9d89a4c97dab7ea17806cc614de48e0551c411ba8d1a64b2093215fb0"} Dec 04 22:35:47.562620 master-0 kubenswrapper[33572]: I1204 22:35:47.562590 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"9f407128-b34c-49b3-819e-fdfd27a68d66","Type":"ContainerStarted","Data":"035c85c442822b583864b3dce0c3bcb781374e11a193e0a37000abf4c8ca93b1"} Dec 04 22:35:47.562705 master-0 kubenswrapper[33572]: I1204 22:35:47.562622 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" event={"ID":"9f407128-b34c-49b3-819e-fdfd27a68d66","Type":"ContainerStarted","Data":"6dc49085652f2f2214c8aff2d0d73beb4ff4c742726b691564a1a420a31c687e"} Dec 04 22:35:47.563469 master-0 kubenswrapper[33572]: I1204 22:35:47.563450 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:47.691776 master-0 kubenswrapper[33572]: I1204 22:35:47.691689 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" podStartSLOduration=2.691672047 podStartE2EDuration="2.691672047s" podCreationTimestamp="2025-12-04 22:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:47.635826363 +0000 UTC m=+1011.363352012" watchObservedRunningTime="2025-12-04 22:35:47.691672047 +0000 UTC m=+1011.419197696" Dec 04 22:35:47.775869 master-0 kubenswrapper[33572]: I1204 22:35:47.775815 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7675d-backup-0"] Dec 04 22:35:48.197686 master-0 kubenswrapper[33572]: I1204 22:35:48.197636 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:35:48.343552 master-0 kubenswrapper[33572]: I1204 22:35:48.341115 33572 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:48.343552 master-0 kubenswrapper[33572]: I1204 22:35:48.341173 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:48.385557 master-0 kubenswrapper[33572]: I1204 22:35:48.380960 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d7495797c-5fbwv"] Dec 04 22:35:48.385557 master-0 kubenswrapper[33572]: I1204 22:35:48.381464 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" podUID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" containerName="dnsmasq-dns" containerID="cri-o://2147933aafa02c46bc0780d7ecb4bada18ebbbc4671ffe40cebdf847e8ec4077" gracePeriod=10 Dec 04 22:35:48.566936 master-0 kubenswrapper[33572]: I1204 22:35:48.565650 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4511d6-1b75-449e-b5e2-754c93cca72c" path="/var/lib/kubelet/pods/df4511d6-1b75-449e-b5e2-754c93cca72c/volumes" Dec 04 22:35:48.650859 master-0 
kubenswrapper[33572]: I1204 22:35:48.650615 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"4d1b2769-38d7-42b2-ac37-10e30cbc9629","Type":"ContainerStarted","Data":"2f9068a9e27534ec746aa6ba6e955816a32e9a66bd466c6f52c5d2010fc0c7f7"} Dec 04 22:35:48.650859 master-0 kubenswrapper[33572]: I1204 22:35:48.650671 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"4d1b2769-38d7-42b2-ac37-10e30cbc9629","Type":"ContainerStarted","Data":"7359d880a3e0bea047139e5e849f8586588175ab4ca742d00ea86b59625d9479"} Dec 04 22:35:48.650859 master-0 kubenswrapper[33572]: I1204 22:35:48.650682 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-backup-0" event={"ID":"4d1b2769-38d7-42b2-ac37-10e30cbc9629","Type":"ContainerStarted","Data":"cb280ba972af4223d0a8e136742d68548faadce264e43d80917cb5db56b4ac02"} Dec 04 22:35:48.662666 master-0 kubenswrapper[33572]: I1204 22:35:48.655885 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"55f85bca-594d-4827-8c21-17b118a30f5a","Type":"ContainerStarted","Data":"a27d5aa6da4a7608e9db59008bf91b38ec121b8e78e2950f994a0b1bd97ff523"} Dec 04 22:35:48.666528 master-0 kubenswrapper[33572]: I1204 22:35:48.665110 33572 scope.go:117] "RemoveContainer" containerID="93d4b7a0c200a2938f16675d1e013cfae3c0590581f4f158558b160bf18d88bc" Dec 04 22:35:48.666528 master-0 kubenswrapper[33572]: E1204 22:35:48.665341 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-74d58d9b69-7cpwt_openstack(578da66e-114f-4fb7-ace1-ae07e428e2b3)\"" pod="openstack/ironic-74d58d9b69-7cpwt" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" Dec 04 22:35:48.690535 master-0 kubenswrapper[33572]: I1204 22:35:48.688083 33572 generic.go:334] "Generic (PLEG): container finished" podID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" containerID="2147933aafa02c46bc0780d7ecb4bada18ebbbc4671ffe40cebdf847e8ec4077" exitCode=0 Dec 04 22:35:48.690535 master-0 kubenswrapper[33572]: I1204 22:35:48.689435 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" event={"ID":"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50","Type":"ContainerDied","Data":"2147933aafa02c46bc0780d7ecb4bada18ebbbc4671ffe40cebdf847e8ec4077"} Dec 04 22:35:48.705676 master-0 kubenswrapper[33572]: I1204 22:35:48.696579 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-backup-0" podStartSLOduration=2.696492622 podStartE2EDuration="2.696492622s" podCreationTimestamp="2025-12-04 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:48.684844064 +0000 UTC m=+1012.412369713" watchObservedRunningTime="2025-12-04 22:35:48.696492622 +0000 UTC m=+1012.424018271" Dec 04 22:35:49.086151 master-0 kubenswrapper[33572]: I1204 22:35:49.086088 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:35:49.157894 master-0 kubenswrapper[33572]: I1204 22:35:49.157010 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-swift-storage-0\") pod \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " Dec 04 22:35:49.157894 master-0 kubenswrapper[33572]: I1204 22:35:49.157161 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-svc\") pod \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " Dec 04 22:35:49.157894 master-0 kubenswrapper[33572]: I1204 22:35:49.157263 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs5g4\" (UniqueName: \"kubernetes.io/projected/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-kube-api-access-xs5g4\") pod \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " Dec 04 22:35:49.157894 master-0 kubenswrapper[33572]: I1204 22:35:49.157315 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-config\") pod \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " Dec 04 22:35:49.157894 master-0 kubenswrapper[33572]: I1204 22:35:49.157346 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-nb\") pod \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " Dec 04 22:35:49.157894 master-0 kubenswrapper[33572]: I1204 22:35:49.157369 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-sb\") pod \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\" (UID: \"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50\") " Dec 04 22:35:49.199785 master-0 kubenswrapper[33572]: I1204 22:35:49.198100 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-kube-api-access-xs5g4" (OuterVolumeSpecName: "kube-api-access-xs5g4") pod "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" (UID: "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50"). InnerVolumeSpecName "kube-api-access-xs5g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:49.259801 master-0 kubenswrapper[33572]: I1204 22:35:49.259743 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs5g4\" (UniqueName: \"kubernetes.io/projected/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-kube-api-access-xs5g4\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:49.277194 master-0 kubenswrapper[33572]: I1204 22:35:49.277102 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-config" (OuterVolumeSpecName: "config") pod "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" (UID: "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:49.309922 master-0 kubenswrapper[33572]: I1204 22:35:49.309852 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" (UID: "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:49.329820 master-0 kubenswrapper[33572]: I1204 22:35:49.329733 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" (UID: "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:49.360877 master-0 kubenswrapper[33572]: I1204 22:35:49.356594 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" (UID: "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:49.360877 master-0 kubenswrapper[33572]: I1204 22:35:49.362791 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:49.360877 master-0 kubenswrapper[33572]: I1204 22:35:49.362838 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:49.360877 master-0 kubenswrapper[33572]: I1204 22:35:49.362847 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:49.360877 master-0 kubenswrapper[33572]: I1204 22:35:49.362861 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:49.448831 master-0 kubenswrapper[33572]: I1204 22:35:49.446997 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" (UID: "a156bd4c-38f9-40f2-82c0-bc3a57ab6f50"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:35:49.466241 master-0 kubenswrapper[33572]: I1204 22:35:49.466013 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:49.725946 master-0 kubenswrapper[33572]: I1204 22:35:49.725795 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" event={"ID":"a156bd4c-38f9-40f2-82c0-bc3a57ab6f50","Type":"ContainerDied","Data":"7c5db0a0b96fc171be3f91d985e326537d3bdf4b9c65a00a4fbc319d9d7fc743"} Dec 04 22:35:49.725946 master-0 kubenswrapper[33572]: I1204 22:35:49.725858 33572 scope.go:117] "RemoveContainer" containerID="2147933aafa02c46bc0780d7ecb4bada18ebbbc4671ffe40cebdf847e8ec4077" Dec 04 22:35:49.725946 master-0 kubenswrapper[33572]: I1204 22:35:49.725926 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d7495797c-5fbwv" Dec 04 22:35:49.741777 master-0 kubenswrapper[33572]: I1204 22:35:49.741692 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7675d-scheduler-0" event={"ID":"55f85bca-594d-4827-8c21-17b118a30f5a","Type":"ContainerStarted","Data":"7abe67057c718867127ee66e6132cf6c6058321fadd5ed5937ff9fa1399819ff"} Dec 04 22:35:49.743652 master-0 kubenswrapper[33572]: I1204 22:35:49.743103 33572 scope.go:117] "RemoveContainer" containerID="93d4b7a0c200a2938f16675d1e013cfae3c0590581f4f158558b160bf18d88bc" Dec 04 22:35:49.743652 master-0 kubenswrapper[33572]: E1204 22:35:49.743603 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-74d58d9b69-7cpwt_openstack(578da66e-114f-4fb7-ace1-ae07e428e2b3)\"" pod="openstack/ironic-74d58d9b69-7cpwt" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" Dec 04 22:35:49.796529 master-0 kubenswrapper[33572]: I1204 22:35:49.789798 33572 scope.go:117] "RemoveContainer" containerID="29b204cc4b45d355767e20b68482224f716a0bd8e04053ae0bbb254486085bd7" Dec 04 22:35:49.842527 master-0 kubenswrapper[33572]: I1204 22:35:49.834739 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7675d-scheduler-0" podStartSLOduration=3.834713226 podStartE2EDuration="3.834713226s" podCreationTimestamp="2025-12-04 22:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:35:49.776448827 +0000 UTC m=+1013.503974496" watchObservedRunningTime="2025-12-04 22:35:49.834713226 +0000 UTC m=+1013.562238875" Dec 04 22:35:49.893529 master-0 kubenswrapper[33572]: I1204 22:35:49.892474 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d7495797c-5fbwv"] Dec 04 22:35:49.919526 master-0 kubenswrapper[33572]: I1204 22:35:49.916708 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d7495797c-5fbwv"] Dec 04 22:35:50.301874 master-0 kubenswrapper[33572]: I1204 22:35:50.301824 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:50.488480 master-0 kubenswrapper[33572]: I1204 22:35:50.488360 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84794cb9bb-xw9hq" Dec 04 22:35:50.499979 
master-0 kubenswrapper[33572]: I1204 22:35:50.499945 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:50.541923 master-0 kubenswrapper[33572]: I1204 22:35:50.541865 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" path="/var/lib/kubelet/pods/a156bd4c-38f9-40f2-82c0-bc3a57ab6f50/volumes" Dec 04 22:35:50.787530 master-0 kubenswrapper[33572]: I1204 22:35:50.787448 33572 generic.go:334] "Generic (PLEG): container finished" podID="c0304754-88f4-4abd-b4ff-2daf5fea19a8" containerID="c7f3123b297498e99e0be827847475a3b74290a3707c53aea234e7f37d6d177f" exitCode=1 Dec 04 22:35:50.788125 master-0 kubenswrapper[33572]: I1204 22:35:50.787550 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" event={"ID":"c0304754-88f4-4abd-b4ff-2daf5fea19a8","Type":"ContainerDied","Data":"c7f3123b297498e99e0be827847475a3b74290a3707c53aea234e7f37d6d177f"} Dec 04 22:35:50.788125 master-0 kubenswrapper[33572]: I1204 22:35:50.787585 33572 scope.go:117] "RemoveContainer" containerID="ea84d9325ccae62067349c56a067da72d61a92a04989a57fa9524cf6713056b9" Dec 04 22:35:50.788565 master-0 kubenswrapper[33572]: I1204 22:35:50.788490 33572 scope.go:117] "RemoveContainer" containerID="c7f3123b297498e99e0be827847475a3b74290a3707c53aea234e7f37d6d177f" Dec 04 22:35:50.789676 master-0 kubenswrapper[33572]: E1204 22:35:50.788846 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-57f48bc457-9xrbh_openstack(c0304754-88f4-4abd-b4ff-2daf5fea19a8)\"" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" podUID="c0304754-88f4-4abd-b4ff-2daf5fea19a8" Dec 04 22:35:51.673739 master-0 kubenswrapper[33572]: I1204 22:35:51.673629 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:52.176620 master-0 kubenswrapper[33572]: I1204 22:35:52.176551 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-7675d-api-0" Dec 04 22:35:52.185135 master-0 kubenswrapper[33572]: I1204 22:35:52.184944 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:52.559613 master-0 kubenswrapper[33572]: I1204 22:35:52.559468 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-nc4qg"] Dec 04 22:35:52.563939 master-0 kubenswrapper[33572]: E1204 22:35:52.559867 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" containerName="dnsmasq-dns" Dec 04 22:35:52.563939 master-0 kubenswrapper[33572]: I1204 22:35:52.559886 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" containerName="dnsmasq-dns" Dec 04 22:35:52.563939 master-0 kubenswrapper[33572]: E1204 22:35:52.559908 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" containerName="init" Dec 04 22:35:52.563939 master-0 kubenswrapper[33572]: I1204 22:35:52.559914 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" containerName="init" Dec 04 22:35:52.563939 master-0 kubenswrapper[33572]: I1204 22:35:52.560201 
33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="a156bd4c-38f9-40f2-82c0-bc3a57ab6f50" containerName="dnsmasq-dns" Dec 04 22:35:52.563939 master-0 kubenswrapper[33572]: I1204 22:35:52.560931 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.563939 master-0 kubenswrapper[33572]: I1204 22:35:52.563551 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 04 22:35:52.565474 master-0 kubenswrapper[33572]: I1204 22:35:52.564976 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 04 22:35:52.571098 master-0 kubenswrapper[33572]: I1204 22:35:52.571054 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-nc4qg"] Dec 04 22:35:52.688469 master-0 kubenswrapper[33572]: I1204 22:35:52.688367 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-scripts\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.688736 master-0 kubenswrapper[33572]: I1204 22:35:52.688529 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.688950 master-0 kubenswrapper[33572]: I1204 22:35:52.688907 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-combined-ca-bundle\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.689084 master-0 kubenswrapper[33572]: I1204 22:35:52.689064 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.689129 master-0 kubenswrapper[33572]: I1204 22:35:52.689114 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/415ef8f6-9405-4807-8661-ca51383aa454-etc-podinfo\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.689321 master-0 kubenswrapper[33572]: I1204 22:35:52.689270 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-config\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.689392 master-0 kubenswrapper[33572]: I1204 22:35:52.689354 33572 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4znz\" (UniqueName: \"kubernetes.io/projected/415ef8f6-9405-4807-8661-ca51383aa454-kube-api-access-h4znz\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.792509 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-combined-ca-bundle\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.792590 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.792610 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/415ef8f6-9405-4807-8661-ca51383aa454-etc-podinfo\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.792692 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-config\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.792769 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4znz\" (UniqueName: \"kubernetes.io/projected/415ef8f6-9405-4807-8661-ca51383aa454-kube-api-access-h4znz\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.792878 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-scripts\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.792901 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.793405 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic-inspector-dhcp-hostsdir\") pod 
\"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.797309 master-0 kubenswrapper[33572]: I1204 22:35:52.794424 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.798036 master-0 kubenswrapper[33572]: I1204 22:35:52.797852 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/415ef8f6-9405-4807-8661-ca51383aa454-etc-podinfo\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.800484 master-0 kubenswrapper[33572]: I1204 22:35:52.800415 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-config\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.801353 master-0 kubenswrapper[33572]: I1204 22:35:52.801312 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-scripts\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.820915 master-0 kubenswrapper[33572]: I1204 22:35:52.820766 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4znz\" (UniqueName: \"kubernetes.io/projected/415ef8f6-9405-4807-8661-ca51383aa454-kube-api-access-h4znz\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.828164 master-0 kubenswrapper[33572]: I1204 22:35:52.828107 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-combined-ca-bundle\") pod \"ironic-inspector-db-sync-nc4qg\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:52.932464 master-0 kubenswrapper[33572]: I1204 22:35:52.932359 33572 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:35:52.934391 master-0 kubenswrapper[33572]: I1204 22:35:52.934318 33572 scope.go:117] "RemoveContainer" containerID="c7f3123b297498e99e0be827847475a3b74290a3707c53aea234e7f37d6d177f" Dec 04 22:35:52.934891 master-0 kubenswrapper[33572]: E1204 22:35:52.934866 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-57f48bc457-9xrbh_openstack(c0304754-88f4-4abd-b4ff-2daf5fea19a8)\"" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" podUID="c0304754-88f4-4abd-b4ff-2daf5fea19a8" Dec 04 22:35:52.944350 master-0 kubenswrapper[33572]: I1204 22:35:52.944306 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ironic-7c5df64bd5-tsrwx" Dec 04 22:35:52.966907 master-0 kubenswrapper[33572]: I1204 22:35:52.966549 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:35:53.206688 master-0 kubenswrapper[33572]: I1204 22:35:53.206607 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-74d58d9b69-7cpwt"] Dec 04 22:35:53.207364 master-0 kubenswrapper[33572]: I1204 22:35:53.206988 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-74d58d9b69-7cpwt" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api-log" containerID="cri-o://e42a0982a4cc4302175c24de74c027af1c769f4f0d9bedffecfa10409b502bb1" gracePeriod=60 Dec 04 22:35:53.869223 master-0 kubenswrapper[33572]: I1204 22:35:53.869179 33572 generic.go:334] "Generic (PLEG): container finished" podID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerID="e42a0982a4cc4302175c24de74c027af1c769f4f0d9bedffecfa10409b502bb1" exitCode=143 Dec 04 22:35:53.870591 master-0 kubenswrapper[33572]: I1204 22:35:53.869248 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74d58d9b69-7cpwt" event={"ID":"578da66e-114f-4fb7-ace1-ae07e428e2b3","Type":"ContainerDied","Data":"e42a0982a4cc4302175c24de74c027af1c769f4f0d9bedffecfa10409b502bb1"} Dec 04 22:35:54.032857 master-0 kubenswrapper[33572]: I1204 22:35:54.032691 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:54.093799 master-0 kubenswrapper[33572]: I1204 22:35:54.093721 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-nc4qg"] Dec 04 22:35:54.147096 master-0 kubenswrapper[33572]: I1204 22:35:54.147039 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbrsr\" (UniqueName: \"kubernetes.io/projected/578da66e-114f-4fb7-ace1-ae07e428e2b3-kube-api-access-wbrsr\") pod \"578da66e-114f-4fb7-ace1-ae07e428e2b3\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " Dec 04 22:35:54.147566 master-0 kubenswrapper[33572]: I1204 22:35:54.147537 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-merged\") pod \"578da66e-114f-4fb7-ace1-ae07e428e2b3\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " Dec 04 22:35:54.147814 master-0 kubenswrapper[33572]: I1204 22:35:54.147741 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/578da66e-114f-4fb7-ace1-ae07e428e2b3-etc-podinfo\") pod \"578da66e-114f-4fb7-ace1-ae07e428e2b3\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " Dec 04 22:35:54.147868 master-0 kubenswrapper[33572]: I1204 22:35:54.147846 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-logs\") pod \"578da66e-114f-4fb7-ace1-ae07e428e2b3\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " Dec 04 22:35:54.147904 master-0 kubenswrapper[33572]: I1204 22:35:54.147879 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-combined-ca-bundle\") pod \"578da66e-114f-4fb7-ace1-ae07e428e2b3\" 
(UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " Dec 04 22:35:54.148325 master-0 kubenswrapper[33572]: I1204 22:35:54.147988 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-custom\") pod \"578da66e-114f-4fb7-ace1-ae07e428e2b3\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " Dec 04 22:35:54.148325 master-0 kubenswrapper[33572]: I1204 22:35:54.148043 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data\") pod \"578da66e-114f-4fb7-ace1-ae07e428e2b3\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " Dec 04 22:35:54.148325 master-0 kubenswrapper[33572]: I1204 22:35:54.148122 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-scripts\") pod \"578da66e-114f-4fb7-ace1-ae07e428e2b3\" (UID: \"578da66e-114f-4fb7-ace1-ae07e428e2b3\") " Dec 04 22:35:54.148555 master-0 kubenswrapper[33572]: I1204 22:35:54.148443 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "578da66e-114f-4fb7-ace1-ae07e428e2b3" (UID: "578da66e-114f-4fb7-ace1-ae07e428e2b3"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:35:54.148765 master-0 kubenswrapper[33572]: I1204 22:35:54.148729 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-logs" (OuterVolumeSpecName: "logs") pod "578da66e-114f-4fb7-ace1-ae07e428e2b3" (UID: "578da66e-114f-4fb7-ace1-ae07e428e2b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:35:54.149249 master-0 kubenswrapper[33572]: I1204 22:35:54.149160 33572 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-merged\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:54.149291 master-0 kubenswrapper[33572]: I1204 22:35:54.149250 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/578da66e-114f-4fb7-ace1-ae07e428e2b3-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:54.151873 master-0 kubenswrapper[33572]: I1204 22:35:54.151797 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578da66e-114f-4fb7-ace1-ae07e428e2b3-kube-api-access-wbrsr" (OuterVolumeSpecName: "kube-api-access-wbrsr") pod "578da66e-114f-4fb7-ace1-ae07e428e2b3" (UID: "578da66e-114f-4fb7-ace1-ae07e428e2b3"). InnerVolumeSpecName "kube-api-access-wbrsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:35:54.153407 master-0 kubenswrapper[33572]: I1204 22:35:54.153357 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/578da66e-114f-4fb7-ace1-ae07e428e2b3-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "578da66e-114f-4fb7-ace1-ae07e428e2b3" (UID: "578da66e-114f-4fb7-ace1-ae07e428e2b3"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 22:35:54.157760 master-0 kubenswrapper[33572]: I1204 22:35:54.157708 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-scripts" (OuterVolumeSpecName: "scripts") pod "578da66e-114f-4fb7-ace1-ae07e428e2b3" (UID: "578da66e-114f-4fb7-ace1-ae07e428e2b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:54.166036 master-0 kubenswrapper[33572]: I1204 22:35:54.165982 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "578da66e-114f-4fb7-ace1-ae07e428e2b3" (UID: "578da66e-114f-4fb7-ace1-ae07e428e2b3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:54.217779 master-0 kubenswrapper[33572]: I1204 22:35:54.216174 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data" (OuterVolumeSpecName: "config-data") pod "578da66e-114f-4fb7-ace1-ae07e428e2b3" (UID: "578da66e-114f-4fb7-ace1-ae07e428e2b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:54.225581 master-0 kubenswrapper[33572]: I1204 22:35:54.225167 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "578da66e-114f-4fb7-ace1-ae07e428e2b3" (UID: "578da66e-114f-4fb7-ace1-ae07e428e2b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:35:54.251229 master-0 kubenswrapper[33572]: I1204 22:35:54.251170 33572 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/578da66e-114f-4fb7-ace1-ae07e428e2b3-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:54.251229 master-0 kubenswrapper[33572]: I1204 22:35:54.251224 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:54.251366 master-0 kubenswrapper[33572]: I1204 22:35:54.251240 33572 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:54.251366 master-0 kubenswrapper[33572]: I1204 22:35:54.251253 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:54.251366 master-0 kubenswrapper[33572]: I1204 22:35:54.251265 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/578da66e-114f-4fb7-ace1-ae07e428e2b3-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:54.251366 master-0 kubenswrapper[33572]: I1204 22:35:54.251278 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbrsr\" (UniqueName: \"kubernetes.io/projected/578da66e-114f-4fb7-ace1-ae07e428e2b3-kube-api-access-wbrsr\") on node \"master-0\" DevicePath \"\"" Dec 04 22:35:54.885535 master-0 kubenswrapper[33572]: I1204 22:35:54.885434 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-nc4qg" event={"ID":"415ef8f6-9405-4807-8661-ca51383aa454","Type":"ContainerStarted","Data":"a593975623a72901f7989a74076ae3c520a179084e007b146ef3040bc33728df"} Dec 04 22:35:54.889633 master-0 kubenswrapper[33572]: I1204 22:35:54.889529 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74d58d9b69-7cpwt" event={"ID":"578da66e-114f-4fb7-ace1-ae07e428e2b3","Type":"ContainerDied","Data":"903cd1783a9d27f059acbd4bf10f0be207ad3eb5e5696e9651b271f1ee075065"} Dec 04 22:35:54.889633 master-0 kubenswrapper[33572]: I1204 22:35:54.889576 33572 scope.go:117] "RemoveContainer" containerID="93d4b7a0c200a2938f16675d1e013cfae3c0590581f4f158558b160bf18d88bc" Dec 04 22:35:54.889790 master-0 kubenswrapper[33572]: I1204 22:35:54.889724 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-74d58d9b69-7cpwt" Dec 04 22:35:54.936880 master-0 kubenswrapper[33572]: I1204 22:35:54.929131 33572 scope.go:117] "RemoveContainer" containerID="e42a0982a4cc4302175c24de74c027af1c769f4f0d9bedffecfa10409b502bb1" Dec 04 22:35:54.936880 master-0 kubenswrapper[33572]: I1204 22:35:54.935610 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-74d58d9b69-7cpwt"] Dec 04 22:35:54.951517 master-0 kubenswrapper[33572]: I1204 22:35:54.949471 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-74d58d9b69-7cpwt"] Dec 04 22:35:55.003820 master-0 kubenswrapper[33572]: I1204 22:35:55.003778 33572 scope.go:117] "RemoveContainer" containerID="c9b096a01d794fd74f6780bae723f9e6dfb1448705ba359343472831a2dbc15c" Dec 04 22:35:55.748137 master-0 kubenswrapper[33572]: I1204 22:35:55.747996 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7675d-volume-lvm-iscsi-0" Dec 04 22:35:56.482052 master-0 kubenswrapper[33572]: I1204 22:35:56.481995 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-fd96d6774-6kg5t" Dec 04 22:35:56.544367 master-0 kubenswrapper[33572]: I1204 22:35:56.544294 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" path="/var/lib/kubelet/pods/578da66e-114f-4fb7-ace1-ae07e428e2b3/volumes" Dec 04 22:35:56.865187 master-0 kubenswrapper[33572]: I1204 22:35:56.865006 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7675d-scheduler-0" Dec 04 22:35:56.937205 master-0 kubenswrapper[33572]: I1204 22:35:56.937141 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-nc4qg" event={"ID":"415ef8f6-9405-4807-8661-ca51383aa454","Type":"ContainerStarted","Data":"5e21fdbe67e53c4bf0d5a117687c3388a3fb0445e6b388899942d8529fd65c82"} Dec 04 22:35:56.970840 master-0 kubenswrapper[33572]: I1204 22:35:56.970669 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-nc4qg" podStartSLOduration=2.775742792 podStartE2EDuration="4.970644417s" podCreationTimestamp="2025-12-04 22:35:52 +0000 UTC" firstStartedPulling="2025-12-04 22:35:54.1030868 +0000 UTC m=+1017.830612449" lastFinishedPulling="2025-12-04 22:35:56.297988425 +0000 UTC m=+1020.025514074" observedRunningTime="2025-12-04 22:35:56.95465284 +0000 UTC m=+1020.682178489" watchObservedRunningTime="2025-12-04 22:35:56.970644417 +0000 UTC m=+1020.698170066" Dec 04 22:35:57.427572 master-0 kubenswrapper[33572]: I1204 22:35:57.427435 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-7675d-backup-0" Dec 04 22:35:58.208192 master-0 kubenswrapper[33572]: I1204 22:35:58.207281 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:35:58.213309 master-0 kubenswrapper[33572]: I1204 22:35:58.212221 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 22:35:58.213309 master-0 kubenswrapper[33572]: E1204 22:35:58.212779 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="init" Dec 04 22:35:58.213309 master-0 kubenswrapper[33572]: I1204 22:35:58.212796 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" 
containerName="init" Dec 04 22:35:58.213309 master-0 kubenswrapper[33572]: E1204 22:35:58.213084 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api-log" Dec 04 22:35:58.213309 master-0 kubenswrapper[33572]: I1204 22:35:58.213099 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api-log" Dec 04 22:35:58.213309 master-0 kubenswrapper[33572]: E1204 22:35:58.213129 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api" Dec 04 22:35:58.213613 master-0 kubenswrapper[33572]: I1204 22:35:58.213329 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api" Dec 04 22:35:58.213613 master-0 kubenswrapper[33572]: E1204 22:35:58.213409 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api" Dec 04 22:35:58.213613 master-0 kubenswrapper[33572]: I1204 22:35:58.213425 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api" Dec 04 22:35:58.214217 master-0 kubenswrapper[33572]: I1204 22:35:58.214003 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api-log" Dec 04 22:35:58.214217 master-0 kubenswrapper[33572]: I1204 22:35:58.214037 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api" Dec 04 22:35:58.216584 master-0 kubenswrapper[33572]: I1204 22:35:58.215926 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 22:35:58.220843 master-0 kubenswrapper[33572]: I1204 22:35:58.220780 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 04 22:35:58.221591 master-0 kubenswrapper[33572]: I1204 22:35:58.221558 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 04 22:35:58.226647 master-0 kubenswrapper[33572]: I1204 22:35:58.226590 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 22:35:58.285532 master-0 kubenswrapper[33572]: I1204 22:35:58.278399 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0230b30d-f462-4a11-8828-4d4b153961ee-openstack-config\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.285532 master-0 kubenswrapper[33572]: I1204 22:35:58.278447 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4b2l\" (UniqueName: \"kubernetes.io/projected/0230b30d-f462-4a11-8828-4d4b153961ee-kube-api-access-f4b2l\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.285532 master-0 kubenswrapper[33572]: I1204 22:35:58.278599 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0230b30d-f462-4a11-8828-4d4b153961ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.285532 master-0 kubenswrapper[33572]: I1204 22:35:58.278696 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0230b30d-f462-4a11-8828-4d4b153961ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.385833 master-0 kubenswrapper[33572]: I1204 22:35:58.381222 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0230b30d-f462-4a11-8828-4d4b153961ee-openstack-config\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.385833 master-0 kubenswrapper[33572]: I1204 22:35:58.381300 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4b2l\" (UniqueName: \"kubernetes.io/projected/0230b30d-f462-4a11-8828-4d4b153961ee-kube-api-access-f4b2l\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.385833 master-0 kubenswrapper[33572]: I1204 22:35:58.381440 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0230b30d-f462-4a11-8828-4d4b153961ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.385833 master-0 kubenswrapper[33572]: I1204 22:35:58.381482 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0230b30d-f462-4a11-8828-4d4b153961ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.385833 master-0 kubenswrapper[33572]: I1204 22:35:58.383107 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0230b30d-f462-4a11-8828-4d4b153961ee-openstack-config\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.387284 master-0 kubenswrapper[33572]: I1204 22:35:58.387261 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0230b30d-f462-4a11-8828-4d4b153961ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.388598 master-0 kubenswrapper[33572]: I1204 22:35:58.388569 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0230b30d-f462-4a11-8828-4d4b153961ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.423820 master-0 kubenswrapper[33572]: I1204 22:35:58.423665 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4b2l\" (UniqueName: \"kubernetes.io/projected/0230b30d-f462-4a11-8828-4d4b153961ee-kube-api-access-f4b2l\") pod \"openstackclient\" (UID: \"0230b30d-f462-4a11-8828-4d4b153961ee\") " pod="openstack/openstackclient" Dec 04 22:35:58.585572 master-0 kubenswrapper[33572]: I1204 22:35:58.585377 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 22:35:59.915940 master-0 kubenswrapper[33572]: W1204 22:35:59.915875 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0230b30d_f462_4a11_8828_4d4b153961ee.slice/crio-17a751b1e1dd5c115301d9a149ded8114a610397c086c6e362d3155dd50c56ca WatchSource:0}: Error finding container 17a751b1e1dd5c115301d9a149ded8114a610397c086c6e362d3155dd50c56ca: Status 404 returned error can't find the container with id 17a751b1e1dd5c115301d9a149ded8114a610397c086c6e362d3155dd50c56ca Dec 04 22:35:59.917480 master-0 kubenswrapper[33572]: I1204 22:35:59.917443 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 22:36:00.006996 master-0 kubenswrapper[33572]: I1204 22:36:00.006877 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0230b30d-f462-4a11-8828-4d4b153961ee","Type":"ContainerStarted","Data":"17a751b1e1dd5c115301d9a149ded8114a610397c086c6e362d3155dd50c56ca"} Dec 04 22:36:00.015544 master-0 kubenswrapper[33572]: I1204 22:36:00.015086 33572 generic.go:334] "Generic (PLEG): container finished" podID="415ef8f6-9405-4807-8661-ca51383aa454" containerID="5e21fdbe67e53c4bf0d5a117687c3388a3fb0445e6b388899942d8529fd65c82" exitCode=0 Dec 04 22:36:00.015544 master-0 kubenswrapper[33572]: I1204 22:36:00.015180 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-nc4qg" event={"ID":"415ef8f6-9405-4807-8661-ca51383aa454","Type":"ContainerDied","Data":"5e21fdbe67e53c4bf0d5a117687c3388a3fb0445e6b388899942d8529fd65c82"} Dec 04 22:36:01.536827 master-0 kubenswrapper[33572]: I1204 22:36:01.536778 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:36:01.666682 master-0 kubenswrapper[33572]: I1204 22:36:01.666633 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic\") pod \"415ef8f6-9405-4807-8661-ca51383aa454\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " Dec 04 22:36:01.666917 master-0 kubenswrapper[33572]: I1204 22:36:01.666843 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"415ef8f6-9405-4807-8661-ca51383aa454\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " Dec 04 22:36:01.666958 master-0 kubenswrapper[33572]: I1204 22:36:01.666924 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-combined-ca-bundle\") pod \"415ef8f6-9405-4807-8661-ca51383aa454\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " Dec 04 22:36:01.666993 master-0 kubenswrapper[33572]: I1204 22:36:01.666972 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-scripts\") pod \"415ef8f6-9405-4807-8661-ca51383aa454\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " Dec 04 22:36:01.667026 master-0 kubenswrapper[33572]: I1204 22:36:01.667015 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4znz\" (UniqueName: \"kubernetes.io/projected/415ef8f6-9405-4807-8661-ca51383aa454-kube-api-access-h4znz\") pod \"415ef8f6-9405-4807-8661-ca51383aa454\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " Dec 04 22:36:01.667059 master-0 kubenswrapper[33572]: I1204 22:36:01.667042 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-config\") pod \"415ef8f6-9405-4807-8661-ca51383aa454\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " Dec 04 22:36:01.667154 master-0 kubenswrapper[33572]: I1204 22:36:01.667124 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/415ef8f6-9405-4807-8661-ca51383aa454-etc-podinfo\") pod \"415ef8f6-9405-4807-8661-ca51383aa454\" (UID: \"415ef8f6-9405-4807-8661-ca51383aa454\") " Dec 04 22:36:01.667959 master-0 kubenswrapper[33572]: I1204 22:36:01.667900 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "415ef8f6-9405-4807-8661-ca51383aa454" (UID: "415ef8f6-9405-4807-8661-ca51383aa454"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:36:01.668595 master-0 kubenswrapper[33572]: I1204 22:36:01.668526 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "415ef8f6-9405-4807-8661-ca51383aa454" (UID: "415ef8f6-9405-4807-8661-ca51383aa454"). 
InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:36:01.675566 master-0 kubenswrapper[33572]: I1204 22:36:01.675514 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-659f96b785-cjs9k" Dec 04 22:36:01.676525 master-0 kubenswrapper[33572]: I1204 22:36:01.676102 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/415ef8f6-9405-4807-8661-ca51383aa454-kube-api-access-h4znz" (OuterVolumeSpecName: "kube-api-access-h4znz") pod "415ef8f6-9405-4807-8661-ca51383aa454" (UID: "415ef8f6-9405-4807-8661-ca51383aa454"). InnerVolumeSpecName "kube-api-access-h4znz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:01.676936 master-0 kubenswrapper[33572]: I1204 22:36:01.676623 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/415ef8f6-9405-4807-8661-ca51383aa454-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "415ef8f6-9405-4807-8661-ca51383aa454" (UID: "415ef8f6-9405-4807-8661-ca51383aa454"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 22:36:01.689237 master-0 kubenswrapper[33572]: I1204 22:36:01.687312 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-scripts" (OuterVolumeSpecName: "scripts") pod "415ef8f6-9405-4807-8661-ca51383aa454" (UID: "415ef8f6-9405-4807-8661-ca51383aa454"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:01.728976 master-0 kubenswrapper[33572]: I1204 22:36:01.728860 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-config" (OuterVolumeSpecName: "config") pod "415ef8f6-9405-4807-8661-ca51383aa454" (UID: "415ef8f6-9405-4807-8661-ca51383aa454"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:01.742939 master-0 kubenswrapper[33572]: I1204 22:36:01.742858 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "415ef8f6-9405-4807-8661-ca51383aa454" (UID: "415ef8f6-9405-4807-8661-ca51383aa454"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:01.774448 master-0 kubenswrapper[33572]: I1204 22:36:01.774369 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:01.774448 master-0 kubenswrapper[33572]: I1204 22:36:01.774434 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:01.774448 master-0 kubenswrapper[33572]: I1204 22:36:01.774444 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4znz\" (UniqueName: \"kubernetes.io/projected/415ef8f6-9405-4807-8661-ca51383aa454-kube-api-access-h4znz\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:01.774448 master-0 kubenswrapper[33572]: I1204 22:36:01.774455 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/415ef8f6-9405-4807-8661-ca51383aa454-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:01.774448 master-0 kubenswrapper[33572]: I1204 22:36:01.774465 33572 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/415ef8f6-9405-4807-8661-ca51383aa454-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:01.774812 master-0 kubenswrapper[33572]: I1204 22:36:01.774474 33572 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:01.774812 master-0 kubenswrapper[33572]: I1204 22:36:01.774484 33572 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/415ef8f6-9405-4807-8661-ca51383aa454-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:01.789772 master-0 kubenswrapper[33572]: I1204 22:36:01.788998 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8449cb68d4-sd8ww"] Dec 04 22:36:01.789772 master-0 kubenswrapper[33572]: I1204 22:36:01.789347 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8449cb68d4-sd8ww" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerName="neutron-api" containerID="cri-o://19ae8bad0cc98de0a31ce98e9a98f9b0f3fad2fcfe775e4a7419a369980623df" gracePeriod=30 Dec 04 22:36:01.790056 master-0 kubenswrapper[33572]: I1204 22:36:01.789993 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8449cb68d4-sd8ww" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerName="neutron-httpd" containerID="cri-o://fb8a84152e354a7086630d5fdd8b6221056abd41809a71bb9eeeab0f0e215536" gracePeriod=30 Dec 04 22:36:02.047892 master-0 kubenswrapper[33572]: I1204 22:36:02.047788 33572 generic.go:334] "Generic (PLEG): container finished" podID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerID="fb8a84152e354a7086630d5fdd8b6221056abd41809a71bb9eeeab0f0e215536" exitCode=0 Dec 04 22:36:02.047892 master-0 kubenswrapper[33572]: I1204 22:36:02.047870 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8449cb68d4-sd8ww" 
event={"ID":"b143d9c0-797c-4160-ac50-6264b20e4e34","Type":"ContainerDied","Data":"fb8a84152e354a7086630d5fdd8b6221056abd41809a71bb9eeeab0f0e215536"} Dec 04 22:36:02.050017 master-0 kubenswrapper[33572]: I1204 22:36:02.049975 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-nc4qg" event={"ID":"415ef8f6-9405-4807-8661-ca51383aa454","Type":"ContainerDied","Data":"a593975623a72901f7989a74076ae3c520a179084e007b146ef3040bc33728df"} Dec 04 22:36:02.050017 master-0 kubenswrapper[33572]: I1204 22:36:02.050010 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a593975623a72901f7989a74076ae3c520a179084e007b146ef3040bc33728df" Dec 04 22:36:02.050254 master-0 kubenswrapper[33572]: I1204 22:36:02.050133 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-nc4qg" Dec 04 22:36:02.844526 master-0 kubenswrapper[33572]: I1204 22:36:02.840480 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d974c476-m8fbn"] Dec 04 22:36:02.844526 master-0 kubenswrapper[33572]: E1204 22:36:02.841102 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415ef8f6-9405-4807-8661-ca51383aa454" containerName="ironic-inspector-db-sync" Dec 04 22:36:02.844526 master-0 kubenswrapper[33572]: I1204 22:36:02.841119 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="415ef8f6-9405-4807-8661-ca51383aa454" containerName="ironic-inspector-db-sync" Dec 04 22:36:02.844526 master-0 kubenswrapper[33572]: I1204 22:36:02.841413 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="415ef8f6-9405-4807-8661-ca51383aa454" containerName="ironic-inspector-db-sync" Dec 04 22:36:02.844526 master-0 kubenswrapper[33572]: I1204 22:36:02.841434 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="578da66e-114f-4fb7-ace1-ae07e428e2b3" containerName="ironic-api" Dec 04 22:36:02.844526 master-0 kubenswrapper[33572]: I1204 22:36:02.843019 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:02.849579 master-0 kubenswrapper[33572]: I1204 22:36:02.846995 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 22:36:02.849579 master-0 kubenswrapper[33572]: I1204 22:36:02.847268 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 04 22:36:02.849579 master-0 kubenswrapper[33572]: I1204 22:36:02.847453 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 04 22:36:02.855463 master-0 kubenswrapper[33572]: I1204 22:36:02.854512 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d974c476-m8fbn"] Dec 04 22:36:02.929075 master-0 kubenswrapper[33572]: I1204 22:36:02.928056 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-public-tls-certs\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:02.929075 master-0 kubenswrapper[33572]: I1204 22:36:02.928182 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-internal-tls-certs\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:02.929075 master-0 kubenswrapper[33572]: I1204 22:36:02.928216 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2280c7c8-99f9-44e4-be67-358609c9c7d8-run-httpd\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:02.929075 master-0 kubenswrapper[33572]: I1204 22:36:02.928271 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-config-data\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:02.929075 master-0 kubenswrapper[33572]: I1204 22:36:02.928425 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-combined-ca-bundle\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:02.929075 master-0 kubenswrapper[33572]: I1204 22:36:02.928485 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbjp2\" (UniqueName: \"kubernetes.io/projected/2280c7c8-99f9-44e4-be67-358609c9c7d8-kube-api-access-jbjp2\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:02.929075 master-0 kubenswrapper[33572]: I1204 22:36:02.928606 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2280c7c8-99f9-44e4-be67-358609c9c7d8-log-httpd\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:02.929075 master-0 kubenswrapper[33572]: I1204 22:36:02.928662 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2280c7c8-99f9-44e4-be67-358609c9c7d8-etc-swift\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.031324 master-0 kubenswrapper[33572]: I1204 22:36:03.030773 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-public-tls-certs\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.031324 master-0 kubenswrapper[33572]: I1204 22:36:03.030863 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-internal-tls-certs\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.031324 master-0 kubenswrapper[33572]: I1204 22:36:03.030883 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2280c7c8-99f9-44e4-be67-358609c9c7d8-run-httpd\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.031324 master-0 kubenswrapper[33572]: I1204 22:36:03.030919 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-config-data\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.031324 master-0 kubenswrapper[33572]: I1204 22:36:03.030960 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-combined-ca-bundle\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.031324 master-0 kubenswrapper[33572]: I1204 22:36:03.030995 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbjp2\" (UniqueName: \"kubernetes.io/projected/2280c7c8-99f9-44e4-be67-358609c9c7d8-kube-api-access-jbjp2\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.031324 master-0 kubenswrapper[33572]: I1204 22:36:03.031029 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2280c7c8-99f9-44e4-be67-358609c9c7d8-log-httpd\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.031324 master-0 kubenswrapper[33572]: I1204 22:36:03.031052 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2280c7c8-99f9-44e4-be67-358609c9c7d8-etc-swift\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.039747 master-0 kubenswrapper[33572]: I1204 22:36:03.039657 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2280c7c8-99f9-44e4-be67-358609c9c7d8-run-httpd\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.059248 master-0 kubenswrapper[33572]: I1204 22:36:03.042519 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-config-data\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.059248 master-0 kubenswrapper[33572]: I1204 22:36:03.043678 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-internal-tls-certs\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.059248 master-0 kubenswrapper[33572]: I1204 22:36:03.050054 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2280c7c8-99f9-44e4-be67-358609c9c7d8-log-httpd\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.059248 master-0 kubenswrapper[33572]: I1204 22:36:03.057546 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2280c7c8-99f9-44e4-be67-358609c9c7d8-etc-swift\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.082606 master-0 kubenswrapper[33572]: I1204 22:36:03.060463 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-public-tls-certs\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.082606 master-0 kubenswrapper[33572]: I1204 22:36:03.065599 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2280c7c8-99f9-44e4-be67-358609c9c7d8-combined-ca-bundle\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.109522 master-0 kubenswrapper[33572]: I1204 22:36:03.105466 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbjp2\" (UniqueName: \"kubernetes.io/projected/2280c7c8-99f9-44e4-be67-358609c9c7d8-kube-api-access-jbjp2\") pod \"swift-proxy-d974c476-m8fbn\" (UID: \"2280c7c8-99f9-44e4-be67-358609c9c7d8\") " pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.177764 master-0 kubenswrapper[33572]: I1204 22:36:03.172848 33572 generic.go:334] "Generic (PLEG): container finished" podID="b143d9c0-797c-4160-ac50-6264b20e4e34" 
containerID="19ae8bad0cc98de0a31ce98e9a98f9b0f3fad2fcfe775e4a7419a369980623df" exitCode=0 Dec 04 22:36:03.177764 master-0 kubenswrapper[33572]: I1204 22:36:03.172911 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8449cb68d4-sd8ww" event={"ID":"b143d9c0-797c-4160-ac50-6264b20e4e34","Type":"ContainerDied","Data":"19ae8bad0cc98de0a31ce98e9a98f9b0f3fad2fcfe775e4a7419a369980623df"} Dec 04 22:36:03.287674 master-0 kubenswrapper[33572]: I1204 22:36:03.286836 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:03.445571 master-0 kubenswrapper[33572]: I1204 22:36:03.443583 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:36:03.445571 master-0 kubenswrapper[33572]: I1204 22:36:03.443887 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-7675d-default-internal-api-0" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" containerName="glance-log" containerID="cri-o://39c7519c41cb432277faa3e4c3b52c354404031a6488e3cb4f0175ea369923a3" gracePeriod=30 Dec 04 22:36:03.445571 master-0 kubenswrapper[33572]: I1204 22:36:03.444438 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-7675d-default-internal-api-0" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" containerName="glance-httpd" containerID="cri-o://7c587e73131d9e3b574fc84f4d7a73ef757331f5e2f3fd6e2e7459e88734982d" gracePeriod=30 Dec 04 22:36:03.479219 master-0 kubenswrapper[33572]: I1204 22:36:03.479167 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:36:03.515666 master-0 kubenswrapper[33572]: I1204 22:36:03.513917 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-httpd-config\") pod \"b143d9c0-797c-4160-ac50-6264b20e4e34\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " Dec 04 22:36:03.515666 master-0 kubenswrapper[33572]: I1204 22:36:03.514143 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzzb5\" (UniqueName: \"kubernetes.io/projected/b143d9c0-797c-4160-ac50-6264b20e4e34-kube-api-access-dzzb5\") pod \"b143d9c0-797c-4160-ac50-6264b20e4e34\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " Dec 04 22:36:03.515666 master-0 kubenswrapper[33572]: I1204 22:36:03.514193 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-ovndb-tls-certs\") pod \"b143d9c0-797c-4160-ac50-6264b20e4e34\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " Dec 04 22:36:03.515666 master-0 kubenswrapper[33572]: I1204 22:36:03.514258 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-config\") pod \"b143d9c0-797c-4160-ac50-6264b20e4e34\" (UID: \"b143d9c0-797c-4160-ac50-6264b20e4e34\") " Dec 04 22:36:03.515666 master-0 kubenswrapper[33572]: I1204 22:36:03.514319 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-combined-ca-bundle\") pod \"b143d9c0-797c-4160-ac50-6264b20e4e34\" (UID: 
\"b143d9c0-797c-4160-ac50-6264b20e4e34\") " Dec 04 22:36:03.522842 master-0 kubenswrapper[33572]: I1204 22:36:03.521993 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b143d9c0-797c-4160-ac50-6264b20e4e34-kube-api-access-dzzb5" (OuterVolumeSpecName: "kube-api-access-dzzb5") pod "b143d9c0-797c-4160-ac50-6264b20e4e34" (UID: "b143d9c0-797c-4160-ac50-6264b20e4e34"). InnerVolumeSpecName "kube-api-access-dzzb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:03.527682 master-0 kubenswrapper[33572]: I1204 22:36:03.527630 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b143d9c0-797c-4160-ac50-6264b20e4e34" (UID: "b143d9c0-797c-4160-ac50-6264b20e4e34"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:03.584431 master-0 kubenswrapper[33572]: I1204 22:36:03.584373 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-config" (OuterVolumeSpecName: "config") pod "b143d9c0-797c-4160-ac50-6264b20e4e34" (UID: "b143d9c0-797c-4160-ac50-6264b20e4e34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:03.616643 master-0 kubenswrapper[33572]: I1204 22:36:03.616491 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzzb5\" (UniqueName: \"kubernetes.io/projected/b143d9c0-797c-4160-ac50-6264b20e4e34-kube-api-access-dzzb5\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:03.616643 master-0 kubenswrapper[33572]: I1204 22:36:03.616549 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:03.616643 master-0 kubenswrapper[33572]: I1204 22:36:03.616560 33572 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-httpd-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:03.637023 master-0 kubenswrapper[33572]: I1204 22:36:03.636948 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b143d9c0-797c-4160-ac50-6264b20e4e34" (UID: "b143d9c0-797c-4160-ac50-6264b20e4e34"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:03.651408 master-0 kubenswrapper[33572]: I1204 22:36:03.651304 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b143d9c0-797c-4160-ac50-6264b20e4e34" (UID: "b143d9c0-797c-4160-ac50-6264b20e4e34"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:03.719793 master-0 kubenswrapper[33572]: I1204 22:36:03.718391 33572 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:03.719793 master-0 kubenswrapper[33572]: I1204 22:36:03.718434 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b143d9c0-797c-4160-ac50-6264b20e4e34-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:03.865116 master-0 kubenswrapper[33572]: I1204 22:36:03.864939 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d974c476-m8fbn"] Dec 04 22:36:03.875784 master-0 kubenswrapper[33572]: W1204 22:36:03.875697 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2280c7c8_99f9_44e4_be67_358609c9c7d8.slice/crio-72ccd94e3443ff7305c71cad4516e6db4a8ffc26f05d1262c486c5c6a1c2d873 WatchSource:0}: Error finding container 72ccd94e3443ff7305c71cad4516e6db4a8ffc26f05d1262c486c5c6a1c2d873: Status 404 returned error can't find the container with id 72ccd94e3443ff7305c71cad4516e6db4a8ffc26f05d1262c486c5c6a1c2d873 Dec 04 22:36:04.168553 master-0 kubenswrapper[33572]: I1204 22:36:04.168517 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9fccc65cc-8m4nz"] Dec 04 22:36:04.169479 master-0 kubenswrapper[33572]: E1204 22:36:04.169460 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerName="neutron-httpd" Dec 04 22:36:04.169587 master-0 kubenswrapper[33572]: I1204 22:36:04.169575 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerName="neutron-httpd" Dec 04 22:36:04.169671 master-0 kubenswrapper[33572]: E1204 22:36:04.169661 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerName="neutron-api" Dec 04 22:36:04.169728 master-0 kubenswrapper[33572]: I1204 22:36:04.169719 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerName="neutron-api" Dec 04 22:36:04.170111 master-0 kubenswrapper[33572]: I1204 22:36:04.170055 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerName="neutron-api" Dec 04 22:36:04.170197 master-0 kubenswrapper[33572]: I1204 22:36:04.170186 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" containerName="neutron-httpd" Dec 04 22:36:04.171380 master-0 kubenswrapper[33572]: I1204 22:36:04.171363 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.203163 master-0 kubenswrapper[33572]: I1204 22:36:04.196486 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fccc65cc-8m4nz"] Dec 04 22:36:04.219344 master-0 kubenswrapper[33572]: I1204 22:36:04.219270 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8449cb68d4-sd8ww" event={"ID":"b143d9c0-797c-4160-ac50-6264b20e4e34","Type":"ContainerDied","Data":"cbcf2a201a61d69f3a6b4f0772dc469859f258b237f9432745bc778d7749e3bc"} Dec 04 22:36:04.219552 master-0 kubenswrapper[33572]: I1204 22:36:04.219354 33572 scope.go:117] "RemoveContainer" containerID="fb8a84152e354a7086630d5fdd8b6221056abd41809a71bb9eeeab0f0e215536" Dec 04 22:36:04.219552 master-0 kubenswrapper[33572]: I1204 22:36:04.219547 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8449cb68d4-sd8ww" Dec 04 22:36:04.248704 master-0 kubenswrapper[33572]: I1204 22:36:04.248159 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-nb\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.248704 master-0 kubenswrapper[33572]: I1204 22:36:04.248251 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-swift-storage-0\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.248704 master-0 kubenswrapper[33572]: I1204 22:36:04.248307 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-svc\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.248704 master-0 kubenswrapper[33572]: I1204 22:36:04.248364 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-sb\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.248704 master-0 kubenswrapper[33572]: I1204 22:36:04.248412 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fmcj\" (UniqueName: \"kubernetes.io/projected/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-kube-api-access-7fmcj\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.248704 master-0 kubenswrapper[33572]: I1204 22:36:04.248458 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-config\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.268720 master-0 kubenswrapper[33572]: I1204 22:36:04.267821 
33572 generic.go:334] "Generic (PLEG): container finished" podID="4916f987-e677-4aee-b52c-88534ce7b28b" containerID="39c7519c41cb432277faa3e4c3b52c354404031a6488e3cb4f0175ea369923a3" exitCode=143 Dec 04 22:36:04.268720 master-0 kubenswrapper[33572]: I1204 22:36:04.268671 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"4916f987-e677-4aee-b52c-88534ce7b28b","Type":"ContainerDied","Data":"39c7519c41cb432277faa3e4c3b52c354404031a6488e3cb4f0175ea369923a3"} Dec 04 22:36:04.316525 master-0 kubenswrapper[33572]: I1204 22:36:04.309735 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8449cb68d4-sd8ww"] Dec 04 22:36:04.319255 master-0 kubenswrapper[33572]: I1204 22:36:04.318764 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d974c476-m8fbn" event={"ID":"2280c7c8-99f9-44e4-be67-358609c9c7d8","Type":"ContainerStarted","Data":"72ccd94e3443ff7305c71cad4516e6db4a8ffc26f05d1262c486c5c6a1c2d873"} Dec 04 22:36:04.320807 master-0 kubenswrapper[33572]: I1204 22:36:04.320526 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8449cb68d4-sd8ww"] Dec 04 22:36:04.350641 master-0 kubenswrapper[33572]: I1204 22:36:04.350574 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-nb\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.350817 master-0 kubenswrapper[33572]: I1204 22:36:04.350691 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-swift-storage-0\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.350817 master-0 kubenswrapper[33572]: I1204 22:36:04.350751 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-svc\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.350911 master-0 kubenswrapper[33572]: I1204 22:36:04.350823 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-sb\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.350911 master-0 kubenswrapper[33572]: I1204 22:36:04.350859 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fmcj\" (UniqueName: \"kubernetes.io/projected/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-kube-api-access-7fmcj\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.350976 master-0 kubenswrapper[33572]: I1204 22:36:04.350915 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-config\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: 
\"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.351974 master-0 kubenswrapper[33572]: I1204 22:36:04.351484 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-swift-storage-0\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.351974 master-0 kubenswrapper[33572]: I1204 22:36:04.351695 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-svc\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.351974 master-0 kubenswrapper[33572]: I1204 22:36:04.351966 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-config\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.352209 master-0 kubenswrapper[33572]: I1204 22:36:04.352069 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-sb\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.353456 master-0 kubenswrapper[33572]: I1204 22:36:04.352588 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-nb\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.386704 master-0 kubenswrapper[33572]: I1204 22:36:04.386582 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmcj\" (UniqueName: \"kubernetes.io/projected/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-kube-api-access-7fmcj\") pod \"dnsmasq-dns-9fccc65cc-8m4nz\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.408157 master-0 kubenswrapper[33572]: I1204 22:36:04.408079 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:04.413211 master-0 kubenswrapper[33572]: I1204 22:36:04.413161 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 04 22:36:04.420477 master-0 kubenswrapper[33572]: I1204 22:36:04.420102 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:04.435655 master-0 kubenswrapper[33572]: I1204 22:36:04.433593 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 04 22:36:04.435655 master-0 kubenswrapper[33572]: I1204 22:36:04.433787 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 04 22:36:04.435655 master-0 kubenswrapper[33572]: I1204 22:36:04.433906 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Dec 04 22:36:04.456735 master-0 kubenswrapper[33572]: I1204 22:36:04.456653 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bad9361a-2000-4ec4-91fe-6936f503f86d-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.456978 master-0 kubenswrapper[33572]: I1204 22:36:04.456791 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-scripts\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.456978 master-0 kubenswrapper[33572]: I1204 22:36:04.456817 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.456978 master-0 kubenswrapper[33572]: I1204 22:36:04.456946 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-config\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.457093 master-0 kubenswrapper[33572]: I1204 22:36:04.457062 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.457302 master-0 kubenswrapper[33572]: I1204 22:36:04.457283 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.457436 master-0 kubenswrapper[33572]: I1204 22:36:04.457417 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fkbp\" (UniqueName: \"kubernetes.io/projected/bad9361a-2000-4ec4-91fe-6936f503f86d-kube-api-access-9fkbp\") pod 
\"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.505674 master-0 kubenswrapper[33572]: I1204 22:36:04.505139 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:04.543955 master-0 kubenswrapper[33572]: I1204 22:36:04.543846 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b143d9c0-797c-4160-ac50-6264b20e4e34" path="/var/lib/kubelet/pods/b143d9c0-797c-4160-ac50-6264b20e4e34/volumes" Dec 04 22:36:04.559149 master-0 kubenswrapper[33572]: I1204 22:36:04.559094 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bad9361a-2000-4ec4-91fe-6936f503f86d-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.559927 master-0 kubenswrapper[33572]: I1204 22:36:04.559899 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-scripts\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.560048 master-0 kubenswrapper[33572]: I1204 22:36:04.560029 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.560198 master-0 kubenswrapper[33572]: I1204 22:36:04.560185 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-config\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.560357 master-0 kubenswrapper[33572]: I1204 22:36:04.560323 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.560991 master-0 kubenswrapper[33572]: I1204 22:36:04.560972 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.561266 master-0 kubenswrapper[33572]: I1204 22:36:04.561236 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fkbp\" (UniqueName: \"kubernetes.io/projected/bad9361a-2000-4ec4-91fe-6936f503f86d-kube-api-access-9fkbp\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.564320 master-0 kubenswrapper[33572]: I1204 22:36:04.563553 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic\") pod 
\"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.564320 master-0 kubenswrapper[33572]: I1204 22:36:04.563753 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.566601 master-0 kubenswrapper[33572]: I1204 22:36:04.566340 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-scripts\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.567174 master-0 kubenswrapper[33572]: I1204 22:36:04.567064 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bad9361a-2000-4ec4-91fe-6936f503f86d-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.568183 master-0 kubenswrapper[33572]: I1204 22:36:04.568148 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.576390 master-0 kubenswrapper[33572]: I1204 22:36:04.576338 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-config\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.586833 master-0 kubenswrapper[33572]: I1204 22:36:04.586777 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fkbp\" (UniqueName: \"kubernetes.io/projected/bad9361a-2000-4ec4-91fe-6936f503f86d-kube-api-access-9fkbp\") pod \"ironic-inspector-0\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:04.822648 master-0 kubenswrapper[33572]: I1204 22:36:04.822590 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 04 22:36:05.338939 master-0 kubenswrapper[33572]: I1204 22:36:05.338869 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d974c476-m8fbn" event={"ID":"2280c7c8-99f9-44e4-be67-358609c9c7d8","Type":"ContainerStarted","Data":"0b503d1605e81f435d109c0e97ab039b4600f0080066d6b01bac118a7b8a92c1"} Dec 04 22:36:05.525087 master-0 kubenswrapper[33572]: I1204 22:36:05.525028 33572 scope.go:117] "RemoveContainer" containerID="c7f3123b297498e99e0be827847475a3b74290a3707c53aea234e7f37d6d177f" Dec 04 22:36:07.369921 master-0 kubenswrapper[33572]: I1204 22:36:07.369666 33572 generic.go:334] "Generic (PLEG): container finished" podID="4916f987-e677-4aee-b52c-88534ce7b28b" containerID="7c587e73131d9e3b574fc84f4d7a73ef757331f5e2f3fd6e2e7459e88734982d" exitCode=0 Dec 04 22:36:07.369921 master-0 kubenswrapper[33572]: I1204 22:36:07.369736 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"4916f987-e677-4aee-b52c-88534ce7b28b","Type":"ContainerDied","Data":"7c587e73131d9e3b574fc84f4d7a73ef757331f5e2f3fd6e2e7459e88734982d"} Dec 04 22:36:07.445245 master-0 kubenswrapper[33572]: I1204 22:36:07.445143 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:07.982525 master-0 kubenswrapper[33572]: I1204 22:36:07.980422 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-trw6l"] Dec 04 22:36:07.982525 master-0 kubenswrapper[33572]: I1204 22:36:07.982092 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:07.994674 master-0 kubenswrapper[33572]: I1204 22:36:07.993937 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-trw6l"] Dec 04 22:36:08.068402 master-0 kubenswrapper[33572]: I1204 22:36:08.060546 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf9072a8-1d23-4e01-b480-6aa621ae89a3-operator-scripts\") pod \"nova-api-db-create-trw6l\" (UID: \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\") " pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:08.068402 master-0 kubenswrapper[33572]: I1204 22:36:08.060756 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7ls\" (UniqueName: \"kubernetes.io/projected/cf9072a8-1d23-4e01-b480-6aa621ae89a3-kube-api-access-np7ls\") pod \"nova-api-db-create-trw6l\" (UID: \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\") " pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:08.166417 master-0 kubenswrapper[33572]: I1204 22:36:08.163624 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf9072a8-1d23-4e01-b480-6aa621ae89a3-operator-scripts\") pod \"nova-api-db-create-trw6l\" (UID: \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\") " pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:08.166417 master-0 kubenswrapper[33572]: I1204 22:36:08.163788 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7ls\" (UniqueName: \"kubernetes.io/projected/cf9072a8-1d23-4e01-b480-6aa621ae89a3-kube-api-access-np7ls\") pod \"nova-api-db-create-trw6l\" (UID: \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\") " pod="openstack/nova-api-db-create-trw6l" Dec 04 
22:36:08.166417 master-0 kubenswrapper[33572]: I1204 22:36:08.164481 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf9072a8-1d23-4e01-b480-6aa621ae89a3-operator-scripts\") pod \"nova-api-db-create-trw6l\" (UID: \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\") " pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:08.177137 master-0 kubenswrapper[33572]: I1204 22:36:08.177052 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-23b3-account-create-update-kzbkj"] Dec 04 22:36:08.180852 master-0 kubenswrapper[33572]: I1204 22:36:08.180815 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:08.184806 master-0 kubenswrapper[33572]: I1204 22:36:08.184758 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7ls\" (UniqueName: \"kubernetes.io/projected/cf9072a8-1d23-4e01-b480-6aa621ae89a3-kube-api-access-np7ls\") pod \"nova-api-db-create-trw6l\" (UID: \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\") " pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:08.200735 master-0 kubenswrapper[33572]: I1204 22:36:08.200666 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 04 22:36:08.208321 master-0 kubenswrapper[33572]: I1204 22:36:08.208252 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-23b3-account-create-update-kzbkj"] Dec 04 22:36:08.265842 master-0 kubenswrapper[33572]: I1204 22:36:08.265790 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zhld\" (UniqueName: \"kubernetes.io/projected/d81f44b5-895d-417b-aff2-1d54081061f9-kube-api-access-9zhld\") pod \"nova-api-23b3-account-create-update-kzbkj\" (UID: \"d81f44b5-895d-417b-aff2-1d54081061f9\") " pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:08.266054 master-0 kubenswrapper[33572]: I1204 22:36:08.265949 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f44b5-895d-417b-aff2-1d54081061f9-operator-scripts\") pod \"nova-api-23b3-account-create-update-kzbkj\" (UID: \"d81f44b5-895d-417b-aff2-1d54081061f9\") " pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:08.285867 master-0 kubenswrapper[33572]: I1204 22:36:08.284243 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-2q2sn"] Dec 04 22:36:08.287064 master-0 kubenswrapper[33572]: I1204 22:36:08.287035 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:08.294545 master-0 kubenswrapper[33572]: I1204 22:36:08.294483 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2q2sn"] Dec 04 22:36:08.312529 master-0 kubenswrapper[33572]: I1204 22:36:08.312460 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:08.368636 master-0 kubenswrapper[33572]: I1204 22:36:08.368546 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f44b5-895d-417b-aff2-1d54081061f9-operator-scripts\") pod \"nova-api-23b3-account-create-update-kzbkj\" (UID: \"d81f44b5-895d-417b-aff2-1d54081061f9\") " pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:08.368962 master-0 kubenswrapper[33572]: I1204 22:36:08.368930 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67167b90-f405-4945-a970-7c5d1a5dcef7-operator-scripts\") pod \"nova-cell0-db-create-2q2sn\" (UID: \"67167b90-f405-4945-a970-7c5d1a5dcef7\") " pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:08.369057 master-0 kubenswrapper[33572]: I1204 22:36:08.369040 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zhld\" (UniqueName: \"kubernetes.io/projected/d81f44b5-895d-417b-aff2-1d54081061f9-kube-api-access-9zhld\") pod \"nova-api-23b3-account-create-update-kzbkj\" (UID: \"d81f44b5-895d-417b-aff2-1d54081061f9\") " pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:08.369117 master-0 kubenswrapper[33572]: I1204 22:36:08.369098 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhdv\" (UniqueName: \"kubernetes.io/projected/67167b90-f405-4945-a970-7c5d1a5dcef7-kube-api-access-qhhdv\") pod \"nova-cell0-db-create-2q2sn\" (UID: \"67167b90-f405-4945-a970-7c5d1a5dcef7\") " pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:08.369493 master-0 kubenswrapper[33572]: I1204 22:36:08.369328 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f44b5-895d-417b-aff2-1d54081061f9-operator-scripts\") pod \"nova-api-23b3-account-create-update-kzbkj\" (UID: \"d81f44b5-895d-417b-aff2-1d54081061f9\") " pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:08.377364 master-0 kubenswrapper[33572]: I1204 22:36:08.377317 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-j7k4l"] Dec 04 22:36:08.383989 master-0 kubenswrapper[33572]: I1204 22:36:08.379908 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:08.388533 master-0 kubenswrapper[33572]: I1204 22:36:08.388472 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-80fd-account-create-update-4cjck"] Dec 04 22:36:08.390122 master-0 kubenswrapper[33572]: I1204 22:36:08.390091 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:08.391864 master-0 kubenswrapper[33572]: I1204 22:36:08.391691 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 04 22:36:08.400396 master-0 kubenswrapper[33572]: I1204 22:36:08.400307 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-j7k4l"] Dec 04 22:36:08.414012 master-0 kubenswrapper[33572]: I1204 22:36:08.413964 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-80fd-account-create-update-4cjck"] Dec 04 22:36:08.421350 master-0 kubenswrapper[33572]: I1204 22:36:08.421301 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zhld\" (UniqueName: \"kubernetes.io/projected/d81f44b5-895d-417b-aff2-1d54081061f9-kube-api-access-9zhld\") pod \"nova-api-23b3-account-create-update-kzbkj\" (UID: \"d81f44b5-895d-417b-aff2-1d54081061f9\") " pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:08.470801 master-0 kubenswrapper[33572]: I1204 22:36:08.470762 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf6nl\" (UniqueName: \"kubernetes.io/projected/2306e236-149a-4214-8600-218585ace100-kube-api-access-xf6nl\") pod \"nova-cell0-80fd-account-create-update-4cjck\" (UID: \"2306e236-149a-4214-8600-218585ace100\") " pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:08.471056 master-0 kubenswrapper[33572]: I1204 22:36:08.471037 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5qdd\" (UniqueName: \"kubernetes.io/projected/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-kube-api-access-z5qdd\") pod \"nova-cell1-db-create-j7k4l\" (UID: \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\") " pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:08.471236 master-0 kubenswrapper[33572]: I1204 22:36:08.471217 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2306e236-149a-4214-8600-218585ace100-operator-scripts\") pod \"nova-cell0-80fd-account-create-update-4cjck\" (UID: \"2306e236-149a-4214-8600-218585ace100\") " pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:08.471375 master-0 kubenswrapper[33572]: I1204 22:36:08.471353 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-operator-scripts\") pod \"nova-cell1-db-create-j7k4l\" (UID: \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\") " pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:08.472176 master-0 kubenswrapper[33572]: I1204 22:36:08.472118 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67167b90-f405-4945-a970-7c5d1a5dcef7-operator-scripts\") pod \"nova-cell0-db-create-2q2sn\" (UID: \"67167b90-f405-4945-a970-7c5d1a5dcef7\") " pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:08.472267 master-0 kubenswrapper[33572]: I1204 22:36:08.472252 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67167b90-f405-4945-a970-7c5d1a5dcef7-operator-scripts\") pod \"nova-cell0-db-create-2q2sn\" (UID: 
\"67167b90-f405-4945-a970-7c5d1a5dcef7\") " pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:08.472449 master-0 kubenswrapper[33572]: I1204 22:36:08.472435 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhhdv\" (UniqueName: \"kubernetes.io/projected/67167b90-f405-4945-a970-7c5d1a5dcef7-kube-api-access-qhhdv\") pod \"nova-cell0-db-create-2q2sn\" (UID: \"67167b90-f405-4945-a970-7c5d1a5dcef7\") " pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:08.500549 master-0 kubenswrapper[33572]: I1204 22:36:08.494132 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhdv\" (UniqueName: \"kubernetes.io/projected/67167b90-f405-4945-a970-7c5d1a5dcef7-kube-api-access-qhhdv\") pod \"nova-cell0-db-create-2q2sn\" (UID: \"67167b90-f405-4945-a970-7c5d1a5dcef7\") " pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:08.578916 master-0 kubenswrapper[33572]: I1204 22:36:08.577860 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2433-account-create-update-cdgnq"] Dec 04 22:36:08.580390 master-0 kubenswrapper[33572]: I1204 22:36:08.580359 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:08.581705 master-0 kubenswrapper[33572]: I1204 22:36:08.581667 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:08.582980 master-0 kubenswrapper[33572]: I1204 22:36:08.582935 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf6nl\" (UniqueName: \"kubernetes.io/projected/2306e236-149a-4214-8600-218585ace100-kube-api-access-xf6nl\") pod \"nova-cell0-80fd-account-create-update-4cjck\" (UID: \"2306e236-149a-4214-8600-218585ace100\") " pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:08.583460 master-0 kubenswrapper[33572]: I1204 22:36:08.583424 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5qdd\" (UniqueName: \"kubernetes.io/projected/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-kube-api-access-z5qdd\") pod \"nova-cell1-db-create-j7k4l\" (UID: \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\") " pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:08.583621 master-0 kubenswrapper[33572]: I1204 22:36:08.583574 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2306e236-149a-4214-8600-218585ace100-operator-scripts\") pod \"nova-cell0-80fd-account-create-update-4cjck\" (UID: \"2306e236-149a-4214-8600-218585ace100\") " pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:08.583722 master-0 kubenswrapper[33572]: I1204 22:36:08.583696 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-operator-scripts\") pod \"nova-cell1-db-create-j7k4l\" (UID: \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\") " pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:08.584975 master-0 kubenswrapper[33572]: I1204 22:36:08.584937 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2306e236-149a-4214-8600-218585ace100-operator-scripts\") pod \"nova-cell0-80fd-account-create-update-4cjck\" (UID: 
\"2306e236-149a-4214-8600-218585ace100\") " pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:08.585048 master-0 kubenswrapper[33572]: I1204 22:36:08.584980 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-operator-scripts\") pod \"nova-cell1-db-create-j7k4l\" (UID: \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\") " pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:08.588215 master-0 kubenswrapper[33572]: I1204 22:36:08.588114 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2433-account-create-update-cdgnq"] Dec 04 22:36:08.589124 master-0 kubenswrapper[33572]: I1204 22:36:08.589073 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 04 22:36:08.604449 master-0 kubenswrapper[33572]: I1204 22:36:08.602558 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5qdd\" (UniqueName: \"kubernetes.io/projected/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-kube-api-access-z5qdd\") pod \"nova-cell1-db-create-j7k4l\" (UID: \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\") " pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:08.607254 master-0 kubenswrapper[33572]: I1204 22:36:08.607206 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf6nl\" (UniqueName: \"kubernetes.io/projected/2306e236-149a-4214-8600-218585ace100-kube-api-access-xf6nl\") pod \"nova-cell0-80fd-account-create-update-4cjck\" (UID: \"2306e236-149a-4214-8600-218585ace100\") " pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:08.622993 master-0 kubenswrapper[33572]: I1204 22:36:08.622936 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:08.686864 master-0 kubenswrapper[33572]: I1204 22:36:08.686782 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67xp\" (UniqueName: \"kubernetes.io/projected/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-kube-api-access-f67xp\") pod \"nova-cell1-2433-account-create-update-cdgnq\" (UID: \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\") " pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:08.687098 master-0 kubenswrapper[33572]: I1204 22:36:08.686994 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-operator-scripts\") pod \"nova-cell1-2433-account-create-update-cdgnq\" (UID: \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\") " pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:08.761267 master-0 kubenswrapper[33572]: I1204 22:36:08.761194 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:08.769006 master-0 kubenswrapper[33572]: I1204 22:36:08.768656 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:08.790826 master-0 kubenswrapper[33572]: I1204 22:36:08.790763 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-operator-scripts\") pod \"nova-cell1-2433-account-create-update-cdgnq\" (UID: \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\") " pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:08.791132 master-0 kubenswrapper[33572]: I1204 22:36:08.791007 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f67xp\" (UniqueName: \"kubernetes.io/projected/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-kube-api-access-f67xp\") pod \"nova-cell1-2433-account-create-update-cdgnq\" (UID: \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\") " pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:08.791735 master-0 kubenswrapper[33572]: I1204 22:36:08.791691 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-operator-scripts\") pod \"nova-cell1-2433-account-create-update-cdgnq\" (UID: \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\") " pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:08.807044 master-0 kubenswrapper[33572]: I1204 22:36:08.806979 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67xp\" (UniqueName: \"kubernetes.io/projected/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-kube-api-access-f67xp\") pod \"nova-cell1-2433-account-create-update-cdgnq\" (UID: \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\") " pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:09.006371 master-0 kubenswrapper[33572]: I1204 22:36:09.006132 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:11.303773 master-0 kubenswrapper[33572]: I1204 22:36:11.302076 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:36:11.303773 master-0 kubenswrapper[33572]: I1204 22:36:11.302371 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-7675d-default-external-api-0" podUID="bbe730dd-7824-4438-981d-bf5429987895" containerName="glance-log" containerID="cri-o://25115e1833d783845b249aff27c8ae862e9f9351ccb20cdef6a73ad6983c1391" gracePeriod=30 Dec 04 22:36:11.303773 master-0 kubenswrapper[33572]: I1204 22:36:11.302400 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-7675d-default-external-api-0" podUID="bbe730dd-7824-4438-981d-bf5429987895" containerName="glance-httpd" containerID="cri-o://1c8639fd70edb8891c15b7cf485434414e45dca658c9c6f2ba6d87161ec6ec70" gracePeriod=30 Dec 04 22:36:11.438133 master-0 kubenswrapper[33572]: I1204 22:36:11.434179 33572 generic.go:334] "Generic (PLEG): container finished" podID="bbe730dd-7824-4438-981d-bf5429987895" containerID="25115e1833d783845b249aff27c8ae862e9f9351ccb20cdef6a73ad6983c1391" exitCode=143 Dec 04 22:36:11.438133 master-0 kubenswrapper[33572]: I1204 22:36:11.434243 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" event={"ID":"bbe730dd-7824-4438-981d-bf5429987895","Type":"ContainerDied","Data":"25115e1833d783845b249aff27c8ae862e9f9351ccb20cdef6a73ad6983c1391"} Dec 04 22:36:15.548383 master-0 kubenswrapper[33572]: I1204 22:36:15.548318 33572 scope.go:117] "RemoveContainer" containerID="19ae8bad0cc98de0a31ce98e9a98f9b0f3fad2fcfe775e4a7419a369980623df" Dec 04 22:36:16.091884 master-0 kubenswrapper[33572]: I1204 22:36:16.091192 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:16.177852 master-0 kubenswrapper[33572]: I1204 22:36:16.177682 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-23b3-account-create-update-kzbkj"] Dec 04 22:36:16.199392 master-0 kubenswrapper[33572]: I1204 22:36:16.198140 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-internal-tls-certs\") pod \"4916f987-e677-4aee-b52c-88534ce7b28b\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " Dec 04 22:36:16.199392 master-0 kubenswrapper[33572]: I1204 22:36:16.198185 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-httpd-run\") pod \"4916f987-e677-4aee-b52c-88534ce7b28b\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " Dec 04 22:36:16.199392 master-0 kubenswrapper[33572]: I1204 22:36:16.198345 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"4916f987-e677-4aee-b52c-88534ce7b28b\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " Dec 04 22:36:16.199392 master-0 kubenswrapper[33572]: I1204 22:36:16.198386 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-config-data\") pod \"4916f987-e677-4aee-b52c-88534ce7b28b\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " Dec 04 22:36:16.199392 master-0 kubenswrapper[33572]: I1204 22:36:16.198429 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-combined-ca-bundle\") pod \"4916f987-e677-4aee-b52c-88534ce7b28b\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " Dec 04 22:36:16.199392 master-0 kubenswrapper[33572]: I1204 22:36:16.198539 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-logs\") pod \"4916f987-e677-4aee-b52c-88534ce7b28b\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " Dec 04 22:36:16.199392 master-0 kubenswrapper[33572]: I1204 22:36:16.198563 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-scripts\") pod \"4916f987-e677-4aee-b52c-88534ce7b28b\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " Dec 04 22:36:16.199392 master-0 kubenswrapper[33572]: I1204 22:36:16.198653 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r952\" (UniqueName: \"kubernetes.io/projected/4916f987-e677-4aee-b52c-88534ce7b28b-kube-api-access-4r952\") pod \"4916f987-e677-4aee-b52c-88534ce7b28b\" (UID: \"4916f987-e677-4aee-b52c-88534ce7b28b\") " Dec 04 22:36:16.203212 master-0 kubenswrapper[33572]: I1204 22:36:16.203056 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4916f987-e677-4aee-b52c-88534ce7b28b" (UID: "4916f987-e677-4aee-b52c-88534ce7b28b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:36:16.203622 master-0 kubenswrapper[33572]: I1204 22:36:16.203559 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-logs" (OuterVolumeSpecName: "logs") pod "4916f987-e677-4aee-b52c-88534ce7b28b" (UID: "4916f987-e677-4aee-b52c-88534ce7b28b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:36:16.208124 master-0 kubenswrapper[33572]: I1204 22:36:16.208083 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4916f987-e677-4aee-b52c-88534ce7b28b-kube-api-access-4r952" (OuterVolumeSpecName: "kube-api-access-4r952") pod "4916f987-e677-4aee-b52c-88534ce7b28b" (UID: "4916f987-e677-4aee-b52c-88534ce7b28b"). InnerVolumeSpecName "kube-api-access-4r952". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:16.208849 master-0 kubenswrapper[33572]: W1204 22:36:16.208790 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd81f44b5_895d_417b_aff2_1d54081061f9.slice/crio-871e0050dab48e9be28cef1468fee539072feb104c995bac01d03a98fb365b30 WatchSource:0}: Error finding container 871e0050dab48e9be28cef1468fee539072feb104c995bac01d03a98fb365b30: Status 404 returned error can't find the container with id 871e0050dab48e9be28cef1468fee539072feb104c995bac01d03a98fb365b30 Dec 04 22:36:16.228616 master-0 kubenswrapper[33572]: I1204 22:36:16.228572 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-scripts" (OuterVolumeSpecName: "scripts") pod "4916f987-e677-4aee-b52c-88534ce7b28b" (UID: "4916f987-e677-4aee-b52c-88534ce7b28b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:16.248251 master-0 kubenswrapper[33572]: I1204 22:36:16.247743 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f" (OuterVolumeSpecName: "glance") pod "4916f987-e677-4aee-b52c-88534ce7b28b" (UID: "4916f987-e677-4aee-b52c-88534ce7b28b"). InnerVolumeSpecName "pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 22:36:16.308371 master-0 kubenswrapper[33572]: I1204 22:36:16.308070 33572 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.308371 master-0 kubenswrapper[33572]: I1204 22:36:16.308136 33572 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") on node \"master-0\" " Dec 04 22:36:16.308371 master-0 kubenswrapper[33572]: I1204 22:36:16.308155 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4916f987-e677-4aee-b52c-88534ce7b28b-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.308371 master-0 kubenswrapper[33572]: I1204 22:36:16.308172 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.308371 master-0 kubenswrapper[33572]: I1204 22:36:16.308186 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r952\" (UniqueName: \"kubernetes.io/projected/4916f987-e677-4aee-b52c-88534ce7b28b-kube-api-access-4r952\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.541524 master-0 kubenswrapper[33572]: I1204 22:36:16.537141 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:16.550656 master-0 kubenswrapper[33572]: I1204 22:36:16.550605 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"4916f987-e677-4aee-b52c-88534ce7b28b","Type":"ContainerDied","Data":"5e33ffa1d70bdec93611e0538c3d57febf953d81f629014db6972c2187030c9f"} Dec 04 22:36:16.551182 master-0 kubenswrapper[33572]: I1204 22:36:16.550658 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" event={"ID":"c0304754-88f4-4abd-b4ff-2daf5fea19a8","Type":"ContainerStarted","Data":"aa4a2900c265affe04d8e58233c086e56904b2701c0d5a6d642783cc2d4160e5"} Dec 04 22:36:16.551182 master-0 kubenswrapper[33572]: I1204 22:36:16.550687 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23b3-account-create-update-kzbkj" event={"ID":"d81f44b5-895d-417b-aff2-1d54081061f9","Type":"ContainerStarted","Data":"871e0050dab48e9be28cef1468fee539072feb104c995bac01d03a98fb365b30"} Dec 04 22:36:16.551182 master-0 kubenswrapper[33572]: I1204 22:36:16.550702 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-80fd-account-create-update-4cjck"] Dec 04 22:36:16.553597 master-0 kubenswrapper[33572]: I1204 22:36:16.551754 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:36:16.553597 master-0 kubenswrapper[33572]: I1204 22:36:16.551782 33572 scope.go:117] "RemoveContainer" containerID="7c587e73131d9e3b574fc84f4d7a73ef757331f5e2f3fd6e2e7459e88734982d" Dec 04 22:36:16.561538 master-0 kubenswrapper[33572]: I1204 22:36:16.560033 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d974c476-m8fbn" 
event={"ID":"2280c7c8-99f9-44e4-be67-358609c9c7d8","Type":"ContainerStarted","Data":"913b115454aad7aae9dea54b0b6b7119eec5c2b04da5f19d7df72d7937306228"} Dec 04 22:36:16.561538 master-0 kubenswrapper[33572]: I1204 22:36:16.561146 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:16.561538 master-0 kubenswrapper[33572]: I1204 22:36:16.561436 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:16.570524 master-0 kubenswrapper[33572]: I1204 22:36:16.565030 33572 generic.go:334] "Generic (PLEG): container finished" podID="bbe730dd-7824-4438-981d-bf5429987895" containerID="1c8639fd70edb8891c15b7cf485434414e45dca658c9c6f2ba6d87161ec6ec70" exitCode=0 Dec 04 22:36:16.570524 master-0 kubenswrapper[33572]: I1204 22:36:16.565096 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" event={"ID":"bbe730dd-7824-4438-981d-bf5429987895","Type":"ContainerDied","Data":"1c8639fd70edb8891c15b7cf485434414e45dca658c9c6f2ba6d87161ec6ec70"} Dec 04 22:36:16.572018 master-0 kubenswrapper[33572]: I1204 22:36:16.571698 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:16.582533 master-0 kubenswrapper[33572]: I1204 22:36:16.581887 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-d974c476-m8fbn" podUID="2280c7c8-99f9-44e4-be67-358609c9c7d8" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.608477 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d974c476-m8fbn" podStartSLOduration=14.608453587 podStartE2EDuration="14.608453587s" podCreationTimestamp="2025-12-04 22:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:36:16.606700608 +0000 UTC m=+1040.334226247" watchObservedRunningTime="2025-12-04 22:36:16.608453587 +0000 UTC m=+1040.335979236" Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.613523 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxgmh\" (UniqueName: \"kubernetes.io/projected/bbe730dd-7824-4438-981d-bf5429987895-kube-api-access-pxgmh\") pod \"bbe730dd-7824-4438-981d-bf5429987895\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.613801 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-combined-ca-bundle\") pod \"bbe730dd-7824-4438-981d-bf5429987895\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.614072 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"bbe730dd-7824-4438-981d-bf5429987895\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.614283 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-httpd-run\") pod \"bbe730dd-7824-4438-981d-bf5429987895\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.614303 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-public-tls-certs\") pod \"bbe730dd-7824-4438-981d-bf5429987895\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.614369 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-logs\") pod \"bbe730dd-7824-4438-981d-bf5429987895\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.614400 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-scripts\") pod \"bbe730dd-7824-4438-981d-bf5429987895\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " Dec 04 22:36:16.615524 master-0 kubenswrapper[33572]: I1204 22:36:16.614453 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-config-data\") pod \"bbe730dd-7824-4438-981d-bf5429987895\" (UID: \"bbe730dd-7824-4438-981d-bf5429987895\") " Dec 04 22:36:16.617842 master-0 kubenswrapper[33572]: I1204 22:36:16.617121 33572 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 04 22:36:16.617842 master-0 kubenswrapper[33572]: I1204 22:36:16.617284 33572 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc" (UniqueName: "kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f") on node "master-0" Dec 04 22:36:16.623825 master-0 kubenswrapper[33572]: I1204 22:36:16.621578 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bbe730dd-7824-4438-981d-bf5429987895" (UID: "bbe730dd-7824-4438-981d-bf5429987895"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:36:16.623825 master-0 kubenswrapper[33572]: I1204 22:36:16.622387 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-logs" (OuterVolumeSpecName: "logs") pod "bbe730dd-7824-4438-981d-bf5429987895" (UID: "bbe730dd-7824-4438-981d-bf5429987895"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:36:16.631727 master-0 kubenswrapper[33572]: I1204 22:36:16.629075 33572 scope.go:117] "RemoveContainer" containerID="39c7519c41cb432277faa3e4c3b52c354404031a6488e3cb4f0175ea369923a3" Dec 04 22:36:16.657774 master-0 kubenswrapper[33572]: I1204 22:36:16.653873 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe730dd-7824-4438-981d-bf5429987895-kube-api-access-pxgmh" (OuterVolumeSpecName: "kube-api-access-pxgmh") pod "bbe730dd-7824-4438-981d-bf5429987895" (UID: "bbe730dd-7824-4438-981d-bf5429987895"). 
InnerVolumeSpecName "kube-api-access-pxgmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:16.685010 master-0 kubenswrapper[33572]: I1204 22:36:16.683982 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-scripts" (OuterVolumeSpecName: "scripts") pod "bbe730dd-7824-4438-981d-bf5429987895" (UID: "bbe730dd-7824-4438-981d-bf5429987895"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:16.694110 master-0 kubenswrapper[33572]: I1204 22:36:16.690126 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4916f987-e677-4aee-b52c-88534ce7b28b" (UID: "4916f987-e677-4aee-b52c-88534ce7b28b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:16.698532 master-0 kubenswrapper[33572]: I1204 22:36:16.696885 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358" (OuterVolumeSpecName: "glance") pod "bbe730dd-7824-4438-981d-bf5429987895" (UID: "bbe730dd-7824-4438-981d-bf5429987895"). InnerVolumeSpecName "pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 22:36:16.722535 master-0 kubenswrapper[33572]: I1204 22:36:16.719228 33572 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.722535 master-0 kubenswrapper[33572]: I1204 22:36:16.719277 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbe730dd-7824-4438-981d-bf5429987895-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.722535 master-0 kubenswrapper[33572]: I1204 22:36:16.719286 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.722535 master-0 kubenswrapper[33572]: I1204 22:36:16.719297 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxgmh\" (UniqueName: \"kubernetes.io/projected/bbe730dd-7824-4438-981d-bf5429987895-kube-api-access-pxgmh\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.722535 master-0 kubenswrapper[33572]: I1204 22:36:16.719308 33572 reconciler_common.go:293] "Volume detached for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.722535 master-0 kubenswrapper[33572]: I1204 22:36:16.719332 33572 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") on node \"master-0\" " Dec 04 22:36:16.722535 master-0 kubenswrapper[33572]: I1204 22:36:16.719343 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.741558 master-0 kubenswrapper[33572]: I1204 22:36:16.741028 33572 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bbe730dd-7824-4438-981d-bf5429987895" (UID: "bbe730dd-7824-4438-981d-bf5429987895"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:16.823542 master-0 kubenswrapper[33572]: I1204 22:36:16.820573 33572 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.833020 master-0 kubenswrapper[33572]: I1204 22:36:16.830690 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-config-data" (OuterVolumeSpecName: "config-data") pod "4916f987-e677-4aee-b52c-88534ce7b28b" (UID: "4916f987-e677-4aee-b52c-88534ce7b28b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:16.836776 master-0 kubenswrapper[33572]: I1204 22:36:16.836705 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbe730dd-7824-4438-981d-bf5429987895" (UID: "bbe730dd-7824-4438-981d-bf5429987895"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:16.841525 master-0 kubenswrapper[33572]: I1204 22:36:16.839843 33572 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 04 22:36:16.841525 master-0 kubenswrapper[33572]: I1204 22:36:16.840028 33572 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86" (UniqueName: "kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358") on node "master-0" Dec 04 22:36:16.872530 master-0 kubenswrapper[33572]: I1204 22:36:16.869461 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4916f987-e677-4aee-b52c-88534ce7b28b" (UID: "4916f987-e677-4aee-b52c-88534ce7b28b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:16.909743 master-0 kubenswrapper[33572]: I1204 22:36:16.909670 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-config-data" (OuterVolumeSpecName: "config-data") pod "bbe730dd-7824-4438-981d-bf5429987895" (UID: "bbe730dd-7824-4438-981d-bf5429987895"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:16.933209 master-0 kubenswrapper[33572]: I1204 22:36:16.933114 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.933209 master-0 kubenswrapper[33572]: I1204 22:36:16.933154 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe730dd-7824-4438-981d-bf5429987895-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.933209 master-0 kubenswrapper[33572]: I1204 22:36:16.933167 33572 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.933209 master-0 kubenswrapper[33572]: I1204 22:36:16.933176 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4916f987-e677-4aee-b52c-88534ce7b28b-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.933209 master-0 kubenswrapper[33572]: I1204 22:36:16.933186 33572 reconciler_common.go:293] "Volume detached for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:16.966691 master-0 kubenswrapper[33572]: I1204 22:36:16.966558 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2q2sn"] Dec 04 22:36:16.985527 master-0 kubenswrapper[33572]: I1204 22:36:16.985006 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fccc65cc-8m4nz"] Dec 04 22:36:16.998528 master-0 kubenswrapper[33572]: I1204 22:36:16.996399 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-j7k4l"] Dec 04 22:36:17.024466 master-0 kubenswrapper[33572]: I1204 22:36:17.014939 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-trw6l"] Dec 04 22:36:17.114529 master-0 kubenswrapper[33572]: I1204 22:36:17.111008 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:17.135860 master-0 kubenswrapper[33572]: I1204 22:36:17.135381 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2433-account-create-update-cdgnq"] Dec 04 22:36:17.176600 master-0 kubenswrapper[33572]: W1204 22:36:17.176222 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbad9361a_2000_4ec4_91fe_6936f503f86d.slice/crio-b7193c1d1f1b9cfda30ab9afc3fc600a2d962a3c0f0038b7445d325d8619dca2 WatchSource:0}: Error finding container b7193c1d1f1b9cfda30ab9afc3fc600a2d962a3c0f0038b7445d325d8619dca2: Status 404 returned error can't find the container with id b7193c1d1f1b9cfda30ab9afc3fc600a2d962a3c0f0038b7445d325d8619dca2 Dec 04 22:36:17.208844 master-0 kubenswrapper[33572]: I1204 22:36:17.207493 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:36:17.220721 master-0 kubenswrapper[33572]: I1204 22:36:17.218023 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 
22:36:17.376292 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: E1204 22:36:17.377235 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe730dd-7824-4438-981d-bf5429987895" containerName="glance-httpd" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 22:36:17.377253 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe730dd-7824-4438-981d-bf5429987895" containerName="glance-httpd" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: E1204 22:36:17.377295 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" containerName="glance-log" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 22:36:17.377302 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" containerName="glance-log" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: E1204 22:36:17.377359 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe730dd-7824-4438-981d-bf5429987895" containerName="glance-log" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 22:36:17.377365 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe730dd-7824-4438-981d-bf5429987895" containerName="glance-log" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: E1204 22:36:17.377379 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" containerName="glance-httpd" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 22:36:17.377385 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" containerName="glance-httpd" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 22:36:17.378099 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" containerName="glance-log" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 22:36:17.378137 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" containerName="glance-httpd" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 22:36:17.378168 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe730dd-7824-4438-981d-bf5429987895" containerName="glance-httpd" Dec 04 22:36:17.379525 master-0 kubenswrapper[33572]: I1204 22:36:17.378203 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe730dd-7824-4438-981d-bf5429987895" containerName="glance-log" Dec 04 22:36:17.395531 master-0 kubenswrapper[33572]: I1204 22:36:17.391828 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.421691 master-0 kubenswrapper[33572]: I1204 22:36:17.418410 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:36:17.430532 master-0 kubenswrapper[33572]: I1204 22:36:17.428828 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-7675d-default-internal-config-data" Dec 04 22:36:17.430532 master-0 kubenswrapper[33572]: I1204 22:36:17.428940 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 22:36:17.619603 master-0 kubenswrapper[33572]: I1204 22:36:17.617797 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerStarted","Data":"60ff475ca542f5b6d96e253ec48bda0abd0decb588db2aa8ef83bb9b0068a44e"} Dec 04 22:36:17.630817 master-0 kubenswrapper[33572]: I1204 22:36:17.628589 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.630817 master-0 kubenswrapper[33572]: I1204 22:36:17.628642 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/843efe19-fdc4-4678-9088-c7e928e0d216-httpd-run\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.630817 master-0 kubenswrapper[33572]: I1204 22:36:17.628699 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-combined-ca-bundle\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.630817 master-0 kubenswrapper[33572]: I1204 22:36:17.628758 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-internal-tls-certs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.630817 master-0 kubenswrapper[33572]: I1204 22:36:17.628775 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkqgc\" (UniqueName: \"kubernetes.io/projected/843efe19-fdc4-4678-9088-c7e928e0d216-kube-api-access-dkqgc\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.630817 master-0 kubenswrapper[33572]: I1204 22:36:17.628839 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-scripts\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " 
pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.630817 master-0 kubenswrapper[33572]: I1204 22:36:17.628872 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/843efe19-fdc4-4678-9088-c7e928e0d216-logs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.630817 master-0 kubenswrapper[33572]: I1204 22:36:17.628898 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-config-data\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.634242 master-0 kubenswrapper[33572]: I1204 22:36:17.633812 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2433-account-create-update-cdgnq" event={"ID":"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3","Type":"ContainerStarted","Data":"69b52288034fe9a598b7ea16f0a7d68930eb7a7b9820815e114ed6ea3de15f1a"} Dec 04 22:36:17.637368 master-0 kubenswrapper[33572]: I1204 22:36:17.637164 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trw6l" event={"ID":"cf9072a8-1d23-4e01-b480-6aa621ae89a3","Type":"ContainerStarted","Data":"d318f696265b5063b76fe42a6afd778e5ed19ef0eaa083e1768fe7b45cf58b72"} Dec 04 22:36:17.648893 master-0 kubenswrapper[33572]: I1204 22:36:17.648582 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23b3-account-create-update-kzbkj" event={"ID":"d81f44b5-895d-417b-aff2-1d54081061f9","Type":"ContainerStarted","Data":"082c871709438f1274362858f45d52463bb8b43b9ce1cf0e394b3b9521eecd14"} Dec 04 22:36:17.665541 master-0 kubenswrapper[33572]: I1204 22:36:17.655731 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" event={"ID":"2306e236-149a-4214-8600-218585ace100","Type":"ContainerStarted","Data":"242c47f45e03b07771899700798be9deba66cf0c07123c21cad3994123bc0422"} Dec 04 22:36:17.665541 master-0 kubenswrapper[33572]: I1204 22:36:17.655790 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" event={"ID":"2306e236-149a-4214-8600-218585ace100","Type":"ContainerStarted","Data":"70e8d227071a77671d8370606f4d21db35d91bbd1b8d116a0d9c609bd3b3aabe"} Dec 04 22:36:17.711106 master-0 kubenswrapper[33572]: I1204 22:36:17.685675 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2q2sn" event={"ID":"67167b90-f405-4945-a970-7c5d1a5dcef7","Type":"ContainerStarted","Data":"4c5601a6dd584d847db5e6296f0eaa5934906f136443aa63338b0d8679ac80c3"} Dec 04 22:36:17.711106 master-0 kubenswrapper[33572]: I1204 22:36:17.685727 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2q2sn" event={"ID":"67167b90-f405-4945-a970-7c5d1a5dcef7","Type":"ContainerStarted","Data":"7fc97f2ad915a803e1617f277159f729f3c6efad6d2f08dab5db01948fb3fdf9"} Dec 04 22:36:17.711106 master-0 kubenswrapper[33572]: I1204 22:36:17.703310 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" 
event={"ID":"bbe730dd-7824-4438-981d-bf5429987895","Type":"ContainerDied","Data":"475d9fedb2eaaf45ff8375b7b0f4cf62b367259bcc2edaff25997421da71fc44"} Dec 04 22:36:17.711106 master-0 kubenswrapper[33572]: I1204 22:36:17.703366 33572 scope.go:117] "RemoveContainer" containerID="1c8639fd70edb8891c15b7cf485434414e45dca658c9c6f2ba6d87161ec6ec70" Dec 04 22:36:17.711106 master-0 kubenswrapper[33572]: I1204 22:36:17.703564 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:17.711106 master-0 kubenswrapper[33572]: I1204 22:36:17.709723 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" event={"ID":"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790","Type":"ContainerStarted","Data":"9137bbadee76dbf3cbc50c1db0ebe0f7b1bf71eed8b5d837e9734452b38f831d"} Dec 04 22:36:17.711872 master-0 kubenswrapper[33572]: I1204 22:36:17.711740 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j7k4l" event={"ID":"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b","Type":"ContainerStarted","Data":"a5e1d5e6943df0bb64fdeb000117bedd6ffce96fec86ae5ecd8e66985ddd7d16"} Dec 04 22:36:17.711872 master-0 kubenswrapper[33572]: I1204 22:36:17.711770 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j7k4l" event={"ID":"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b","Type":"ContainerStarted","Data":"b79c4afa6c6d49428df8e63f4d11f3ef873b029272c1f8df8a0e275955c8cef9"} Dec 04 22:36:17.717791 master-0 kubenswrapper[33572]: I1204 22:36:17.717662 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0230b30d-f462-4a11-8828-4d4b153961ee","Type":"ContainerStarted","Data":"8d4c3f9246aafd55917828b63afd245e42ecfa3fed4e2e06eb74b95445d2a072"} Dec 04 22:36:17.721563 master-0 kubenswrapper[33572]: I1204 22:36:17.720769 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"bad9361a-2000-4ec4-91fe-6936f503f86d","Type":"ContainerStarted","Data":"b7193c1d1f1b9cfda30ab9afc3fc600a2d962a3c0f0038b7445d325d8619dca2"} Dec 04 22:36:17.732981 master-0 kubenswrapper[33572]: I1204 22:36:17.731726 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-scripts\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.732981 master-0 kubenswrapper[33572]: I1204 22:36:17.731804 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/843efe19-fdc4-4678-9088-c7e928e0d216-logs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.732981 master-0 kubenswrapper[33572]: I1204 22:36:17.731880 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-config-data\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.732981 master-0 kubenswrapper[33572]: I1204 22:36:17.732055 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.732981 master-0 kubenswrapper[33572]: I1204 22:36:17.732106 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/843efe19-fdc4-4678-9088-c7e928e0d216-httpd-run\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.732981 master-0 kubenswrapper[33572]: I1204 22:36:17.732196 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-combined-ca-bundle\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.732981 master-0 kubenswrapper[33572]: I1204 22:36:17.732269 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-internal-tls-certs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.732981 master-0 kubenswrapper[33572]: I1204 22:36:17.732290 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkqgc\" (UniqueName: \"kubernetes.io/projected/843efe19-fdc4-4678-9088-c7e928e0d216-kube-api-access-dkqgc\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.738715 master-0 kubenswrapper[33572]: I1204 22:36:17.738674 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/843efe19-fdc4-4678-9088-c7e928e0d216-httpd-run\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.739523 master-0 kubenswrapper[33572]: I1204 22:36:17.739450 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" podStartSLOduration=9.739427432 podStartE2EDuration="9.739427432s" podCreationTimestamp="2025-12-04 22:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:36:17.702920836 +0000 UTC m=+1041.430446485" watchObservedRunningTime="2025-12-04 22:36:17.739427432 +0000 UTC m=+1041.466953081" Dec 04 22:36:17.742709 master-0 kubenswrapper[33572]: I1204 22:36:17.742660 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-combined-ca-bundle\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.744756 master-0 kubenswrapper[33572]: I1204 22:36:17.744674 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-scripts\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.744985 master-0 kubenswrapper[33572]: I1204 22:36:17.744949 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/843efe19-fdc4-4678-9088-c7e928e0d216-logs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.745430 master-0 kubenswrapper[33572]: I1204 22:36:17.745358 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-2q2sn" podStartSLOduration=9.745344115 podStartE2EDuration="9.745344115s" podCreationTimestamp="2025-12-04 22:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:36:17.722620084 +0000 UTC m=+1041.450145733" watchObservedRunningTime="2025-12-04 22:36:17.745344115 +0000 UTC m=+1041.472869764" Dec 04 22:36:17.747421 master-0 kubenswrapper[33572]: I1204 22:36:17.747387 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-internal-tls-certs\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.757582 master-0 kubenswrapper[33572]: I1204 22:36:17.757416 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-d974c476-m8fbn" podUID="2280c7c8-99f9-44e4-be67-358609c9c7d8" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 22:36:17.758021 master-0 kubenswrapper[33572]: I1204 22:36:17.758002 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 22:36:17.758080 master-0 kubenswrapper[33572]: I1204 22:36:17.758033 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/19b4b44bb68e6cf797033e7d835d74dc088fa8fee5def4ff67ebc77f83d36479/globalmount\"" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.761918 master-0 kubenswrapper[33572]: I1204 22:36:17.761672 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/843efe19-fdc4-4678-9088-c7e928e0d216-config-data\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.768015 master-0 kubenswrapper[33572]: I1204 22:36:17.767968 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkqgc\" (UniqueName: \"kubernetes.io/projected/843efe19-fdc4-4678-9088-c7e928e0d216-kube-api-access-dkqgc\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:17.772197 master-0 kubenswrapper[33572]: I1204 22:36:17.772092 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-j7k4l" podStartSLOduration=9.772071683 podStartE2EDuration="9.772071683s" podCreationTimestamp="2025-12-04 22:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:36:17.757976349 +0000 UTC m=+1041.485501998" watchObservedRunningTime="2025-12-04 22:36:17.772071683 +0000 UTC m=+1041.499597332" Dec 04 22:36:17.808609 master-0 kubenswrapper[33572]: I1204 22:36:17.806826 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.828195655 podStartE2EDuration="19.806808531s" podCreationTimestamp="2025-12-04 22:35:58 +0000 UTC" firstStartedPulling="2025-12-04 22:35:59.918850182 +0000 UTC m=+1023.646375831" lastFinishedPulling="2025-12-04 22:36:15.897463068 +0000 UTC m=+1039.624988707" observedRunningTime="2025-12-04 22:36:17.782999912 +0000 UTC m=+1041.510525561" watchObservedRunningTime="2025-12-04 22:36:17.806808531 +0000 UTC m=+1041.534334180" Dec 04 22:36:18.156868 master-0 kubenswrapper[33572]: I1204 22:36:18.156669 33572 scope.go:117] "RemoveContainer" containerID="25115e1833d783845b249aff27c8ae862e9f9351ccb20cdef6a73ad6983c1391" Dec 04 22:36:18.157585 master-0 kubenswrapper[33572]: I1204 22:36:18.157562 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:36:18.176618 master-0 kubenswrapper[33572]: I1204 22:36:18.176444 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:36:18.202981 master-0 kubenswrapper[33572]: I1204 22:36:18.202925 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:36:18.205576 master-0 kubenswrapper[33572]: I1204 22:36:18.205013 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.210400 master-0 kubenswrapper[33572]: I1204 22:36:18.210330 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-7675d-default-external-config-data" Dec 04 22:36:18.212087 master-0 kubenswrapper[33572]: I1204 22:36:18.210752 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 22:36:18.284225 master-0 kubenswrapper[33572]: I1204 22:36:18.284174 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:36:18.299738 master-0 kubenswrapper[33572]: I1204 22:36:18.299690 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:18.361615 master-0 kubenswrapper[33572]: I1204 22:36:18.361469 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.361615 master-0 kubenswrapper[33572]: I1204 22:36:18.361545 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.361615 master-0 kubenswrapper[33572]: I1204 22:36:18.361567 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.361615 master-0 kubenswrapper[33572]: I1204 22:36:18.361610 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-logs\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.361906 master-0 kubenswrapper[33572]: I1204 22:36:18.361636 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6j2c\" (UniqueName: \"kubernetes.io/projected/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-kube-api-access-q6j2c\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.361906 master-0 kubenswrapper[33572]: I1204 22:36:18.361665 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.361906 master-0 kubenswrapper[33572]: I1204 22:36:18.361684 33572 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.361906 master-0 kubenswrapper[33572]: I1204 22:36:18.361838 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.465220 master-0 kubenswrapper[33572]: I1204 22:36:18.465163 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.465406 master-0 kubenswrapper[33572]: I1204 22:36:18.465235 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.465464 master-0 kubenswrapper[33572]: I1204 22:36:18.465418 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.465516 master-0 kubenswrapper[33572]: I1204 22:36:18.465495 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.465646 master-0 kubenswrapper[33572]: I1204 22:36:18.465620 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-logs\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.465698 master-0 kubenswrapper[33572]: I1204 22:36:18.465653 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6j2c\" (UniqueName: \"kubernetes.io/projected/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-kube-api-access-q6j2c\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.465734 master-0 kubenswrapper[33572]: I1204 22:36:18.465703 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod 
\"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.466484 master-0 kubenswrapper[33572]: I1204 22:36:18.466443 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.467313 master-0 kubenswrapper[33572]: I1204 22:36:18.466777 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-logs\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.467313 master-0 kubenswrapper[33572]: I1204 22:36:18.467017 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-httpd-run\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.470448 master-0 kubenswrapper[33572]: I1204 22:36:18.470425 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-public-tls-certs\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.473318 master-0 kubenswrapper[33572]: I1204 22:36:18.473273 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-scripts\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.478006 master-0 kubenswrapper[33572]: I1204 22:36:18.477963 33572 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 22:36:18.478086 master-0 kubenswrapper[33572]: I1204 22:36:18.478013 33572 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c09096dd0f6c531150e055f5f0297026538faf280cff6c93023f9f573f827900/globalmount\"" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.481877 master-0 kubenswrapper[33572]: I1204 22:36:18.479658 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-config-data\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.488603 master-0 kubenswrapper[33572]: I1204 22:36:18.487371 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-combined-ca-bundle\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.493653 master-0 kubenswrapper[33572]: I1204 22:36:18.492840 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6j2c\" (UniqueName: \"kubernetes.io/projected/ab659283-4ba9-4f71-bdc0-1d19ba9f130a-kube-api-access-q6j2c\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:18.538980 master-0 kubenswrapper[33572]: I1204 22:36:18.538918 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4916f987-e677-4aee-b52c-88534ce7b28b" path="/var/lib/kubelet/pods/4916f987-e677-4aee-b52c-88534ce7b28b/volumes" Dec 04 22:36:18.539734 master-0 kubenswrapper[33572]: I1204 22:36:18.539712 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe730dd-7824-4438-981d-bf5429987895" path="/var/lib/kubelet/pods/bbe730dd-7824-4438-981d-bf5429987895/volumes" Dec 04 22:36:18.623949 master-0 kubenswrapper[33572]: I1204 22:36:18.623812 33572 scope.go:117] "RemoveContainer" containerID="272f41cd51619be85d71ef666cf16c84cd73ede27da9011d69f041d21915ac3c" Dec 04 22:36:18.732884 master-0 kubenswrapper[33572]: I1204 22:36:18.732826 33572 generic.go:334] "Generic (PLEG): container finished" podID="67167b90-f405-4945-a970-7c5d1a5dcef7" containerID="4c5601a6dd584d847db5e6296f0eaa5934906f136443aa63338b0d8679ac80c3" exitCode=0 Dec 04 22:36:18.733018 master-0 kubenswrapper[33572]: I1204 22:36:18.732941 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2q2sn" event={"ID":"67167b90-f405-4945-a970-7c5d1a5dcef7","Type":"ContainerDied","Data":"4c5601a6dd584d847db5e6296f0eaa5934906f136443aa63338b0d8679ac80c3"} Dec 04 22:36:18.739841 master-0 kubenswrapper[33572]: I1204 22:36:18.737247 33572 generic.go:334] "Generic (PLEG): container finished" podID="bad9361a-2000-4ec4-91fe-6936f503f86d" containerID="2cec7f3bfe0d9f0871f32c81ac6c3f560661b9b9151f8643741b76e530a4bc64" exitCode=0 Dec 04 22:36:18.739841 master-0 kubenswrapper[33572]: I1204 22:36:18.737297 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ironic-inspector-0" event={"ID":"bad9361a-2000-4ec4-91fe-6936f503f86d","Type":"ContainerDied","Data":"2cec7f3bfe0d9f0871f32c81ac6c3f560661b9b9151f8643741b76e530a4bc64"} Dec 04 22:36:18.746350 master-0 kubenswrapper[33572]: I1204 22:36:18.746311 33572 generic.go:334] "Generic (PLEG): container finished" podID="d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3" containerID="5ba2aa41593fdbd3e0b2385aa3a69c480fdbfe1b4a47b7d8332aabb99b886942" exitCode=0 Dec 04 22:36:18.746422 master-0 kubenswrapper[33572]: I1204 22:36:18.746384 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2433-account-create-update-cdgnq" event={"ID":"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3","Type":"ContainerDied","Data":"5ba2aa41593fdbd3e0b2385aa3a69c480fdbfe1b4a47b7d8332aabb99b886942"} Dec 04 22:36:18.748063 master-0 kubenswrapper[33572]: I1204 22:36:18.748039 33572 generic.go:334] "Generic (PLEG): container finished" podID="d81f44b5-895d-417b-aff2-1d54081061f9" containerID="082c871709438f1274362858f45d52463bb8b43b9ce1cf0e394b3b9521eecd14" exitCode=0 Dec 04 22:36:18.748125 master-0 kubenswrapper[33572]: I1204 22:36:18.748095 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23b3-account-create-update-kzbkj" event={"ID":"d81f44b5-895d-417b-aff2-1d54081061f9","Type":"ContainerDied","Data":"082c871709438f1274362858f45d52463bb8b43b9ce1cf0e394b3b9521eecd14"} Dec 04 22:36:18.750023 master-0 kubenswrapper[33572]: I1204 22:36:18.750011 33572 generic.go:334] "Generic (PLEG): container finished" podID="2306e236-149a-4214-8600-218585ace100" containerID="242c47f45e03b07771899700798be9deba66cf0c07123c21cad3994123bc0422" exitCode=0 Dec 04 22:36:18.750069 master-0 kubenswrapper[33572]: I1204 22:36:18.750049 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" event={"ID":"2306e236-149a-4214-8600-218585ace100","Type":"ContainerDied","Data":"242c47f45e03b07771899700798be9deba66cf0c07123c21cad3994123bc0422"} Dec 04 22:36:18.751521 master-0 kubenswrapper[33572]: I1204 22:36:18.751482 33572 generic.go:334] "Generic (PLEG): container finished" podID="2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b" containerID="a5e1d5e6943df0bb64fdeb000117bedd6ffce96fec86ae5ecd8e66985ddd7d16" exitCode=0 Dec 04 22:36:18.751580 master-0 kubenswrapper[33572]: I1204 22:36:18.751539 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j7k4l" event={"ID":"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b","Type":"ContainerDied","Data":"a5e1d5e6943df0bb64fdeb000117bedd6ffce96fec86ae5ecd8e66985ddd7d16"} Dec 04 22:36:18.752661 master-0 kubenswrapper[33572]: I1204 22:36:18.752631 33572 generic.go:334] "Generic (PLEG): container finished" podID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerID="095dcd17a8fc7ecc1902e798a974640c9ba3121574eea44963691e538ce86ce5" exitCode=0 Dec 04 22:36:18.752712 master-0 kubenswrapper[33572]: I1204 22:36:18.752671 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" event={"ID":"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790","Type":"ContainerDied","Data":"095dcd17a8fc7ecc1902e798a974640c9ba3121574eea44963691e538ce86ce5"} Dec 04 22:36:18.761803 master-0 kubenswrapper[33572]: I1204 22:36:18.761743 33572 generic.go:334] "Generic (PLEG): container finished" podID="cf9072a8-1d23-4e01-b480-6aa621ae89a3" containerID="513d79bb156bbe5c2d9e61efcc644f4744c5bad082e30b7c689c2a3acfd9c94c" exitCode=0 Dec 04 22:36:18.761964 master-0 kubenswrapper[33572]: I1204 22:36:18.761931 
33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trw6l" event={"ID":"cf9072a8-1d23-4e01-b480-6aa621ae89a3","Type":"ContainerDied","Data":"513d79bb156bbe5c2d9e61efcc644f4744c5bad082e30b7c689c2a3acfd9c94c"} Dec 04 22:36:18.840326 master-0 kubenswrapper[33572]: I1204 22:36:18.840265 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1e7a1f54-2fa3-4020-82f4-1c1678d45afc\" (UniqueName: \"kubernetes.io/csi/topolvm.io^0e2d7b00-e159-4c6c-8413-fc308bb9193f\") pod \"glance-7675d-default-internal-api-0\" (UID: \"843efe19-fdc4-4678-9088-c7e928e0d216\") " pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:19.061047 master-0 kubenswrapper[33572]: I1204 22:36:19.060964 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:19.262644 master-0 kubenswrapper[33572]: I1204 22:36:19.262592 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:19.278265 master-0 kubenswrapper[33572]: I1204 22:36:19.278205 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zhld\" (UniqueName: \"kubernetes.io/projected/d81f44b5-895d-417b-aff2-1d54081061f9-kube-api-access-9zhld\") pod \"d81f44b5-895d-417b-aff2-1d54081061f9\" (UID: \"d81f44b5-895d-417b-aff2-1d54081061f9\") " Dec 04 22:36:19.278726 master-0 kubenswrapper[33572]: I1204 22:36:19.278689 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f44b5-895d-417b-aff2-1d54081061f9-operator-scripts\") pod \"d81f44b5-895d-417b-aff2-1d54081061f9\" (UID: \"d81f44b5-895d-417b-aff2-1d54081061f9\") " Dec 04 22:36:19.279743 master-0 kubenswrapper[33572]: I1204 22:36:19.279640 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d81f44b5-895d-417b-aff2-1d54081061f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d81f44b5-895d-417b-aff2-1d54081061f9" (UID: "d81f44b5-895d-417b-aff2-1d54081061f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:19.303942 master-0 kubenswrapper[33572]: I1204 22:36:19.302030 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d81f44b5-895d-417b-aff2-1d54081061f9-kube-api-access-9zhld" (OuterVolumeSpecName: "kube-api-access-9zhld") pod "d81f44b5-895d-417b-aff2-1d54081061f9" (UID: "d81f44b5-895d-417b-aff2-1d54081061f9"). InnerVolumeSpecName "kube-api-access-9zhld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:19.384517 master-0 kubenswrapper[33572]: I1204 22:36:19.381937 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d81f44b5-895d-417b-aff2-1d54081061f9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.384517 master-0 kubenswrapper[33572]: I1204 22:36:19.381989 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zhld\" (UniqueName: \"kubernetes.io/projected/d81f44b5-895d-417b-aff2-1d54081061f9-kube-api-access-9zhld\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.487522 master-0 kubenswrapper[33572]: I1204 22:36:19.485076 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 04 22:36:19.691520 master-0 kubenswrapper[33572]: I1204 22:36:19.688670 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fkbp\" (UniqueName: \"kubernetes.io/projected/bad9361a-2000-4ec4-91fe-6936f503f86d-kube-api-access-9fkbp\") pod \"bad9361a-2000-4ec4-91fe-6936f503f86d\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " Dec 04 22:36:19.691520 master-0 kubenswrapper[33572]: I1204 22:36:19.688903 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-config\") pod \"bad9361a-2000-4ec4-91fe-6936f503f86d\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " Dec 04 22:36:19.691520 master-0 kubenswrapper[33572]: I1204 22:36:19.688928 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bad9361a-2000-4ec4-91fe-6936f503f86d-etc-podinfo\") pod \"bad9361a-2000-4ec4-91fe-6936f503f86d\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " Dec 04 22:36:19.691520 master-0 kubenswrapper[33572]: I1204 22:36:19.688954 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-combined-ca-bundle\") pod \"bad9361a-2000-4ec4-91fe-6936f503f86d\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " Dec 04 22:36:19.691520 master-0 kubenswrapper[33572]: I1204 22:36:19.688976 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"bad9361a-2000-4ec4-91fe-6936f503f86d\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " Dec 04 22:36:19.691520 master-0 kubenswrapper[33572]: I1204 22:36:19.689005 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-scripts\") pod \"bad9361a-2000-4ec4-91fe-6936f503f86d\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " Dec 04 22:36:19.691520 master-0 kubenswrapper[33572]: I1204 22:36:19.689151 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic\") pod \"bad9361a-2000-4ec4-91fe-6936f503f86d\" (UID: \"bad9361a-2000-4ec4-91fe-6936f503f86d\") " Dec 04 22:36:19.691520 master-0 kubenswrapper[33572]: I1204 22:36:19.689860 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "bad9361a-2000-4ec4-91fe-6936f503f86d" (UID: "bad9361a-2000-4ec4-91fe-6936f503f86d"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:36:19.696514 master-0 kubenswrapper[33572]: I1204 22:36:19.693301 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "bad9361a-2000-4ec4-91fe-6936f503f86d" (UID: "bad9361a-2000-4ec4-91fe-6936f503f86d"). 
InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:36:19.696514 master-0 kubenswrapper[33572]: I1204 22:36:19.693833 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad9361a-2000-4ec4-91fe-6936f503f86d-kube-api-access-9fkbp" (OuterVolumeSpecName: "kube-api-access-9fkbp") pod "bad9361a-2000-4ec4-91fe-6936f503f86d" (UID: "bad9361a-2000-4ec4-91fe-6936f503f86d"). InnerVolumeSpecName "kube-api-access-9fkbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:19.696514 master-0 kubenswrapper[33572]: I1204 22:36:19.694479 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-scripts" (OuterVolumeSpecName: "scripts") pod "bad9361a-2000-4ec4-91fe-6936f503f86d" (UID: "bad9361a-2000-4ec4-91fe-6936f503f86d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:19.696634 master-0 kubenswrapper[33572]: I1204 22:36:19.696484 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b81bb92a-3f08-4bbb-9e5e-aad2234eab86\" (UniqueName: \"kubernetes.io/csi/topolvm.io^257968eb-98c5-47fc-bdf8-815d8d23b358\") pod \"glance-7675d-default-external-api-0\" (UID: \"ab659283-4ba9-4f71-bdc0-1d19ba9f130a\") " pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:19.697816 master-0 kubenswrapper[33572]: I1204 22:36:19.697025 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-config" (OuterVolumeSpecName: "config") pod "bad9361a-2000-4ec4-91fe-6936f503f86d" (UID: "bad9361a-2000-4ec4-91fe-6936f503f86d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:19.701553 master-0 kubenswrapper[33572]: I1204 22:36:19.697892 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bad9361a-2000-4ec4-91fe-6936f503f86d-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "bad9361a-2000-4ec4-91fe-6936f503f86d" (UID: "bad9361a-2000-4ec4-91fe-6936f503f86d"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 22:36:19.734257 master-0 kubenswrapper[33572]: I1204 22:36:19.733575 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bad9361a-2000-4ec4-91fe-6936f503f86d" (UID: "bad9361a-2000-4ec4-91fe-6936f503f86d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:19.763512 master-0 kubenswrapper[33572]: I1204 22:36:19.763376 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:19.793515 master-0 kubenswrapper[33572]: I1204 22:36:19.792639 33572 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.793515 master-0 kubenswrapper[33572]: I1204 22:36:19.792689 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fkbp\" (UniqueName: \"kubernetes.io/projected/bad9361a-2000-4ec4-91fe-6936f503f86d-kube-api-access-9fkbp\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.793515 master-0 kubenswrapper[33572]: I1204 22:36:19.792704 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.793515 master-0 kubenswrapper[33572]: I1204 22:36:19.792716 33572 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bad9361a-2000-4ec4-91fe-6936f503f86d-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.793515 master-0 kubenswrapper[33572]: I1204 22:36:19.792731 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.793515 master-0 kubenswrapper[33572]: I1204 22:36:19.792745 33572 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/bad9361a-2000-4ec4-91fe-6936f503f86d-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.793515 master-0 kubenswrapper[33572]: I1204 22:36:19.792887 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad9361a-2000-4ec4-91fe-6936f503f86d-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:19.806514 master-0 kubenswrapper[33572]: I1204 22:36:19.805734 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Dec 04 22:36:19.806514 master-0 kubenswrapper[33572]: I1204 22:36:19.805764 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"bad9361a-2000-4ec4-91fe-6936f503f86d","Type":"ContainerDied","Data":"b7193c1d1f1b9cfda30ab9afc3fc600a2d962a3c0f0038b7445d325d8619dca2"} Dec 04 22:36:19.806514 master-0 kubenswrapper[33572]: I1204 22:36:19.805848 33572 scope.go:117] "RemoveContainer" containerID="2cec7f3bfe0d9f0871f32c81ac6c3f560661b9b9151f8643741b76e530a4bc64" Dec 04 22:36:19.835516 master-0 kubenswrapper[33572]: I1204 22:36:19.832196 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" event={"ID":"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790","Type":"ContainerStarted","Data":"3e28f85dcff415799db894d82bc38aa597140573407611014b13faad6628ecaa"} Dec 04 22:36:19.835516 master-0 kubenswrapper[33572]: I1204 22:36:19.832264 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:19.839512 master-0 kubenswrapper[33572]: I1204 22:36:19.836114 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-23b3-account-create-update-kzbkj" Dec 04 22:36:19.839512 master-0 kubenswrapper[33572]: I1204 22:36:19.838477 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-23b3-account-create-update-kzbkj" event={"ID":"d81f44b5-895d-417b-aff2-1d54081061f9","Type":"ContainerDied","Data":"871e0050dab48e9be28cef1468fee539072feb104c995bac01d03a98fb365b30"} Dec 04 22:36:19.839512 master-0 kubenswrapper[33572]: I1204 22:36:19.838590 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="871e0050dab48e9be28cef1468fee539072feb104c995bac01d03a98fb365b30" Dec 04 22:36:19.848513 master-0 kubenswrapper[33572]: I1204 22:36:19.848103 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-internal-api-0"] Dec 04 22:36:19.874275 master-0 kubenswrapper[33572]: I1204 22:36:19.874200 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" podStartSLOduration=15.874180245 podStartE2EDuration="15.874180245s" podCreationTimestamp="2025-12-04 22:36:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:36:19.86663881 +0000 UTC m=+1043.594164459" watchObservedRunningTime="2025-12-04 22:36:19.874180245 +0000 UTC m=+1043.601705894" Dec 04 22:36:20.852528 master-0 kubenswrapper[33572]: I1204 22:36:20.851995 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"843efe19-fdc4-4678-9088-c7e928e0d216","Type":"ContainerStarted","Data":"561351444c9e270350d6a44ca309088d1e9cf7e61a8d52d235decf807a242951"} Dec 04 22:36:21.056588 master-0 kubenswrapper[33572]: I1204 22:36:21.056350 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:21.133547 master-0 kubenswrapper[33572]: I1204 22:36:21.133450 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-operator-scripts\") pod \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\" (UID: \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\") " Dec 04 22:36:21.133934 master-0 kubenswrapper[33572]: I1204 22:36:21.133892 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5qdd\" (UniqueName: \"kubernetes.io/projected/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-kube-api-access-z5qdd\") pod \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\" (UID: \"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b\") " Dec 04 22:36:21.134157 master-0 kubenswrapper[33572]: I1204 22:36:21.134104 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b" (UID: "2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:21.134860 master-0 kubenswrapper[33572]: I1204 22:36:21.134823 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.159221 master-0 kubenswrapper[33572]: I1204 22:36:21.159128 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-kube-api-access-z5qdd" (OuterVolumeSpecName: "kube-api-access-z5qdd") pod "2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b" (UID: "2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b"). InnerVolumeSpecName "kube-api-access-z5qdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:21.238115 master-0 kubenswrapper[33572]: I1204 22:36:21.237888 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5qdd\" (UniqueName: \"kubernetes.io/projected/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b-kube-api-access-z5qdd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.600312 master-0 kubenswrapper[33572]: I1204 22:36:21.600222 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:21.608136 master-0 kubenswrapper[33572]: E1204 22:36:21.608025 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0304754_88f4_4abd_b4ff_2daf5fea19a8.slice/crio-conmon-aa4a2900c265affe04d8e58233c086e56904b2701c0d5a6d642783cc2d4160e5.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:36:21.617336 master-0 kubenswrapper[33572]: I1204 22:36:21.614840 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:21.625020 master-0 kubenswrapper[33572]: I1204 22:36:21.623211 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:21.648246 master-0 kubenswrapper[33572]: I1204 22:36:21.646626 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:21.649485 master-0 kubenswrapper[33572]: I1204 22:36:21.649342 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f67xp\" (UniqueName: \"kubernetes.io/projected/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-kube-api-access-f67xp\") pod \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\" (UID: \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\") " Dec 04 22:36:21.649485 master-0 kubenswrapper[33572]: I1204 22:36:21.649443 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-operator-scripts\") pod \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\" (UID: \"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3\") " Dec 04 22:36:21.651075 master-0 kubenswrapper[33572]: I1204 22:36:21.651016 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3" (UID: "d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:21.654386 master-0 kubenswrapper[33572]: I1204 22:36:21.654337 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-kube-api-access-f67xp" (OuterVolumeSpecName: "kube-api-access-f67xp") pod "d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3" (UID: "d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3"). InnerVolumeSpecName "kube-api-access-f67xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:21.658218 master-0 kubenswrapper[33572]: I1204 22:36:21.656115 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:21.699734 master-0 kubenswrapper[33572]: I1204 22:36:21.699666 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:21.751987 master-0 kubenswrapper[33572]: I1204 22:36:21.751894 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7ls\" (UniqueName: \"kubernetes.io/projected/cf9072a8-1d23-4e01-b480-6aa621ae89a3-kube-api-access-np7ls\") pod \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\" (UID: \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\") " Dec 04 22:36:21.752253 master-0 kubenswrapper[33572]: I1204 22:36:21.752035 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2306e236-149a-4214-8600-218585ace100-operator-scripts\") pod \"2306e236-149a-4214-8600-218585ace100\" (UID: \"2306e236-149a-4214-8600-218585ace100\") " Dec 04 22:36:21.752253 master-0 kubenswrapper[33572]: I1204 22:36:21.752103 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf9072a8-1d23-4e01-b480-6aa621ae89a3-operator-scripts\") pod \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\" (UID: \"cf9072a8-1d23-4e01-b480-6aa621ae89a3\") " Dec 04 22:36:21.752253 master-0 kubenswrapper[33572]: I1204 22:36:21.752145 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf6nl\" (UniqueName: \"kubernetes.io/projected/2306e236-149a-4214-8600-218585ace100-kube-api-access-xf6nl\") pod \"2306e236-149a-4214-8600-218585ace100\" (UID: \"2306e236-149a-4214-8600-218585ace100\") " Dec 04 22:36:21.752253 master-0 kubenswrapper[33572]: I1204 22:36:21.752167 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67167b90-f405-4945-a970-7c5d1a5dcef7-operator-scripts\") pod \"67167b90-f405-4945-a970-7c5d1a5dcef7\" (UID: \"67167b90-f405-4945-a970-7c5d1a5dcef7\") " Dec 04 22:36:21.752649 master-0 kubenswrapper[33572]: I1204 22:36:21.752339 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhhdv\" (UniqueName: \"kubernetes.io/projected/67167b90-f405-4945-a970-7c5d1a5dcef7-kube-api-access-qhhdv\") pod \"67167b90-f405-4945-a970-7c5d1a5dcef7\" (UID: \"67167b90-f405-4945-a970-7c5d1a5dcef7\") " Dec 04 22:36:21.752846 master-0 kubenswrapper[33572]: I1204 22:36:21.752733 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf9072a8-1d23-4e01-b480-6aa621ae89a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf9072a8-1d23-4e01-b480-6aa621ae89a3" (UID: "cf9072a8-1d23-4e01-b480-6aa621ae89a3"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:21.753067 master-0 kubenswrapper[33572]: I1204 22:36:21.753005 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.753067 master-0 kubenswrapper[33572]: I1204 22:36:21.753035 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf9072a8-1d23-4e01-b480-6aa621ae89a3-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.753067 master-0 kubenswrapper[33572]: I1204 22:36:21.753048 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f67xp\" (UniqueName: \"kubernetes.io/projected/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3-kube-api-access-f67xp\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.753586 master-0 kubenswrapper[33572]: I1204 22:36:21.753554 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2306e236-149a-4214-8600-218585ace100-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2306e236-149a-4214-8600-218585ace100" (UID: "2306e236-149a-4214-8600-218585ace100"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:21.754002 master-0 kubenswrapper[33572]: I1204 22:36:21.753856 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67167b90-f405-4945-a970-7c5d1a5dcef7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67167b90-f405-4945-a970-7c5d1a5dcef7" (UID: "67167b90-f405-4945-a970-7c5d1a5dcef7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:21.755277 master-0 kubenswrapper[33572]: I1204 22:36:21.755243 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67167b90-f405-4945-a970-7c5d1a5dcef7-kube-api-access-qhhdv" (OuterVolumeSpecName: "kube-api-access-qhhdv") pod "67167b90-f405-4945-a970-7c5d1a5dcef7" (UID: "67167b90-f405-4945-a970-7c5d1a5dcef7"). InnerVolumeSpecName "kube-api-access-qhhdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:21.756165 master-0 kubenswrapper[33572]: I1204 22:36:21.756132 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2306e236-149a-4214-8600-218585ace100-kube-api-access-xf6nl" (OuterVolumeSpecName: "kube-api-access-xf6nl") pod "2306e236-149a-4214-8600-218585ace100" (UID: "2306e236-149a-4214-8600-218585ace100"). InnerVolumeSpecName "kube-api-access-xf6nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:21.757307 master-0 kubenswrapper[33572]: I1204 22:36:21.757082 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9072a8-1d23-4e01-b480-6aa621ae89a3-kube-api-access-np7ls" (OuterVolumeSpecName: "kube-api-access-np7ls") pod "cf9072a8-1d23-4e01-b480-6aa621ae89a3" (UID: "cf9072a8-1d23-4e01-b480-6aa621ae89a3"). InnerVolumeSpecName "kube-api-access-np7ls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:21.855555 master-0 kubenswrapper[33572]: I1204 22:36:21.855482 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2306e236-149a-4214-8600-218585ace100-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.855555 master-0 kubenswrapper[33572]: I1204 22:36:21.855546 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf6nl\" (UniqueName: \"kubernetes.io/projected/2306e236-149a-4214-8600-218585ace100-kube-api-access-xf6nl\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.855555 master-0 kubenswrapper[33572]: I1204 22:36:21.855562 33572 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67167b90-f405-4945-a970-7c5d1a5dcef7-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.856105 master-0 kubenswrapper[33572]: I1204 22:36:21.855573 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhhdv\" (UniqueName: \"kubernetes.io/projected/67167b90-f405-4945-a970-7c5d1a5dcef7-kube-api-access-qhhdv\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.856105 master-0 kubenswrapper[33572]: I1204 22:36:21.855585 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7ls\" (UniqueName: \"kubernetes.io/projected/cf9072a8-1d23-4e01-b480-6aa621ae89a3-kube-api-access-np7ls\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:21.873964 master-0 kubenswrapper[33572]: I1204 22:36:21.873863 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-j7k4l" event={"ID":"2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b","Type":"ContainerDied","Data":"b79c4afa6c6d49428df8e63f4d11f3ef873b029272c1f8df8a0e275955c8cef9"} Dec 04 22:36:21.873964 master-0 kubenswrapper[33572]: I1204 22:36:21.873908 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b79c4afa6c6d49428df8e63f4d11f3ef873b029272c1f8df8a0e275955c8cef9" Dec 04 22:36:21.874155 master-0 kubenswrapper[33572]: I1204 22:36:21.873974 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-j7k4l" Dec 04 22:36:21.883030 master-0 kubenswrapper[33572]: I1204 22:36:21.882933 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-2q2sn" Dec 04 22:36:21.885606 master-0 kubenswrapper[33572]: I1204 22:36:21.882924 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2q2sn" event={"ID":"67167b90-f405-4945-a970-7c5d1a5dcef7","Type":"ContainerDied","Data":"7fc97f2ad915a803e1617f277159f729f3c6efad6d2f08dab5db01948fb3fdf9"} Dec 04 22:36:21.885606 master-0 kubenswrapper[33572]: I1204 22:36:21.883185 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fc97f2ad915a803e1617f277159f729f3c6efad6d2f08dab5db01948fb3fdf9" Dec 04 22:36:21.886276 master-0 kubenswrapper[33572]: I1204 22:36:21.886240 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"843efe19-fdc4-4678-9088-c7e928e0d216","Type":"ContainerStarted","Data":"0dc9ab730927b60ed62d9d82e7975b3395a09f68f76db7f7bba5a5212b647f00"} Dec 04 22:36:21.889898 master-0 kubenswrapper[33572]: I1204 22:36:21.889871 33572 generic.go:334] "Generic (PLEG): container finished" podID="5c2c1216-db02-4bb8-856c-a56a68e06d23" containerID="60ff475ca542f5b6d96e253ec48bda0abd0decb588db2aa8ef83bb9b0068a44e" exitCode=0 Dec 04 22:36:21.889979 master-0 kubenswrapper[33572]: I1204 22:36:21.889951 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerDied","Data":"60ff475ca542f5b6d96e253ec48bda0abd0decb588db2aa8ef83bb9b0068a44e"} Dec 04 22:36:21.892942 master-0 kubenswrapper[33572]: I1204 22:36:21.892902 33572 generic.go:334] "Generic (PLEG): container finished" podID="c0304754-88f4-4abd-b4ff-2daf5fea19a8" containerID="aa4a2900c265affe04d8e58233c086e56904b2701c0d5a6d642783cc2d4160e5" exitCode=1 Dec 04 22:36:21.893150 master-0 kubenswrapper[33572]: I1204 22:36:21.893006 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" event={"ID":"c0304754-88f4-4abd-b4ff-2daf5fea19a8","Type":"ContainerDied","Data":"aa4a2900c265affe04d8e58233c086e56904b2701c0d5a6d642783cc2d4160e5"} Dec 04 22:36:21.893277 master-0 kubenswrapper[33572]: I1204 22:36:21.893263 33572 scope.go:117] "RemoveContainer" containerID="c7f3123b297498e99e0be827847475a3b74290a3707c53aea234e7f37d6d177f" Dec 04 22:36:21.894087 master-0 kubenswrapper[33572]: I1204 22:36:21.894048 33572 scope.go:117] "RemoveContainer" containerID="aa4a2900c265affe04d8e58233c086e56904b2701c0d5a6d642783cc2d4160e5" Dec 04 22:36:21.894414 master-0 kubenswrapper[33572]: E1204 22:36:21.894391 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-57f48bc457-9xrbh_openstack(c0304754-88f4-4abd-b4ff-2daf5fea19a8)\"" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" podUID="c0304754-88f4-4abd-b4ff-2daf5fea19a8" Dec 04 22:36:21.896362 master-0 kubenswrapper[33572]: I1204 22:36:21.896320 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2433-account-create-update-cdgnq" event={"ID":"d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3","Type":"ContainerDied","Data":"69b52288034fe9a598b7ea16f0a7d68930eb7a7b9820815e114ed6ea3de15f1a"} Dec 04 22:36:21.896431 master-0 kubenswrapper[33572]: I1204 22:36:21.896367 33572 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="69b52288034fe9a598b7ea16f0a7d68930eb7a7b9820815e114ed6ea3de15f1a" Dec 04 22:36:21.896431 master-0 kubenswrapper[33572]: I1204 22:36:21.896330 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2433-account-create-update-cdgnq" Dec 04 22:36:21.901643 master-0 kubenswrapper[33572]: I1204 22:36:21.901466 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-trw6l" event={"ID":"cf9072a8-1d23-4e01-b480-6aa621ae89a3","Type":"ContainerDied","Data":"d318f696265b5063b76fe42a6afd778e5ed19ef0eaa083e1768fe7b45cf58b72"} Dec 04 22:36:21.901716 master-0 kubenswrapper[33572]: I1204 22:36:21.901661 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-trw6l" Dec 04 22:36:21.902302 master-0 kubenswrapper[33572]: I1204 22:36:21.902269 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d318f696265b5063b76fe42a6afd778e5ed19ef0eaa083e1768fe7b45cf58b72" Dec 04 22:36:21.904479 master-0 kubenswrapper[33572]: I1204 22:36:21.904431 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" event={"ID":"2306e236-149a-4214-8600-218585ace100","Type":"ContainerDied","Data":"70e8d227071a77671d8370606f4d21db35d91bbd1b8d116a0d9c609bd3b3aabe"} Dec 04 22:36:21.904563 master-0 kubenswrapper[33572]: I1204 22:36:21.904481 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e8d227071a77671d8370606f4d21db35d91bbd1b8d116a0d9c609bd3b3aabe" Dec 04 22:36:21.904563 master-0 kubenswrapper[33572]: I1204 22:36:21.904552 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-80fd-account-create-update-4cjck" Dec 04 22:36:22.517770 master-0 kubenswrapper[33572]: I1204 22:36:22.517715 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:22.518557 master-0 kubenswrapper[33572]: E1204 22:36:22.518540 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9072a8-1d23-4e01-b480-6aa621ae89a3" containerName="mariadb-database-create" Dec 04 22:36:22.518633 master-0 kubenswrapper[33572]: I1204 22:36:22.518620 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9072a8-1d23-4e01-b480-6aa621ae89a3" containerName="mariadb-database-create" Dec 04 22:36:22.518747 master-0 kubenswrapper[33572]: E1204 22:36:22.518736 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67167b90-f405-4945-a970-7c5d1a5dcef7" containerName="mariadb-database-create" Dec 04 22:36:22.518805 master-0 kubenswrapper[33572]: I1204 22:36:22.518796 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="67167b90-f405-4945-a970-7c5d1a5dcef7" containerName="mariadb-database-create" Dec 04 22:36:22.518888 master-0 kubenswrapper[33572]: E1204 22:36:22.518878 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2306e236-149a-4214-8600-218585ace100" containerName="mariadb-account-create-update" Dec 04 22:36:22.518945 master-0 kubenswrapper[33572]: I1204 22:36:22.518934 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2306e236-149a-4214-8600-218585ace100" containerName="mariadb-account-create-update" Dec 04 22:36:22.519012 master-0 kubenswrapper[33572]: E1204 22:36:22.519002 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad9361a-2000-4ec4-91fe-6936f503f86d" 
containerName="ironic-python-agent-init" Dec 04 22:36:22.519074 master-0 kubenswrapper[33572]: I1204 22:36:22.519064 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad9361a-2000-4ec4-91fe-6936f503f86d" containerName="ironic-python-agent-init" Dec 04 22:36:22.519134 master-0 kubenswrapper[33572]: E1204 22:36:22.519124 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b" containerName="mariadb-database-create" Dec 04 22:36:22.519193 master-0 kubenswrapper[33572]: I1204 22:36:22.519184 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b" containerName="mariadb-database-create" Dec 04 22:36:22.519432 master-0 kubenswrapper[33572]: E1204 22:36:22.519420 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3" containerName="mariadb-account-create-update" Dec 04 22:36:22.519539 master-0 kubenswrapper[33572]: I1204 22:36:22.519528 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3" containerName="mariadb-account-create-update" Dec 04 22:36:22.519615 master-0 kubenswrapper[33572]: E1204 22:36:22.519604 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d81f44b5-895d-417b-aff2-1d54081061f9" containerName="mariadb-account-create-update" Dec 04 22:36:22.519673 master-0 kubenswrapper[33572]: I1204 22:36:22.519664 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d81f44b5-895d-417b-aff2-1d54081061f9" containerName="mariadb-account-create-update" Dec 04 22:36:22.519989 master-0 kubenswrapper[33572]: I1204 22:36:22.519973 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="67167b90-f405-4945-a970-7c5d1a5dcef7" containerName="mariadb-database-create" Dec 04 22:36:22.520076 master-0 kubenswrapper[33572]: I1204 22:36:22.520066 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2306e236-149a-4214-8600-218585ace100" containerName="mariadb-account-create-update" Dec 04 22:36:22.520164 master-0 kubenswrapper[33572]: I1204 22:36:22.520154 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="d81f44b5-895d-417b-aff2-1d54081061f9" containerName="mariadb-account-create-update" Dec 04 22:36:22.520226 master-0 kubenswrapper[33572]: I1204 22:36:22.520217 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad9361a-2000-4ec4-91fe-6936f503f86d" containerName="ironic-python-agent-init" Dec 04 22:36:22.520313 master-0 kubenswrapper[33572]: I1204 22:36:22.520302 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9072a8-1d23-4e01-b480-6aa621ae89a3" containerName="mariadb-database-create" Dec 04 22:36:22.520676 master-0 kubenswrapper[33572]: I1204 22:36:22.520660 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3" containerName="mariadb-account-create-update" Dec 04 22:36:22.520759 master-0 kubenswrapper[33572]: I1204 22:36:22.520748 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b" containerName="mariadb-database-create" Dec 04 22:36:22.524913 master-0 kubenswrapper[33572]: I1204 22:36:22.524846 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 04 22:36:22.527413 master-0 kubenswrapper[33572]: I1204 22:36:22.527369 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Dec 04 22:36:22.527707 master-0 kubenswrapper[33572]: I1204 22:36:22.527684 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Dec 04 22:36:22.530757 master-0 kubenswrapper[33572]: I1204 22:36:22.530624 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 04 22:36:22.530964 master-0 kubenswrapper[33572]: I1204 22:36:22.530936 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Dec 04 22:36:22.531221 master-0 kubenswrapper[33572]: I1204 22:36:22.531193 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 04 22:36:22.557186 master-0 kubenswrapper[33572]: I1204 22:36:22.557057 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad9361a-2000-4ec4-91fe-6936f503f86d" path="/var/lib/kubelet/pods/bad9361a-2000-4ec4-91fe-6936f503f86d/volumes" Dec 04 22:36:22.656627 master-0 kubenswrapper[33572]: I1204 22:36:22.653633 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.679644 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.679831 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h278j\" (UniqueName: \"kubernetes.io/projected/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-kube-api-access-h278j\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.679915 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.679951 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-scripts\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.680136 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.680167 33572 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.680227 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.680298 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.682536 master-0 kubenswrapper[33572]: I1204 22:36:22.680392 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-config\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.765452 master-0 kubenswrapper[33572]: I1204 22:36:22.764863 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-7675d-default-external-api-0"] Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.782752 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.782808 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-scripts\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.782868 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.782889 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.782924 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: 
\"kubernetes.io/empty-dir/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.782947 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.782992 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-config\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.783014 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.783076 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h278j\" (UniqueName: \"kubernetes.io/projected/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-kube-api-access-h278j\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.785619 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.787864 master-0 kubenswrapper[33572]: I1204 22:36:22.785825 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.788609 master-0 kubenswrapper[33572]: I1204 22:36:22.787943 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-scripts\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.796326 master-0 kubenswrapper[33572]: I1204 22:36:22.796282 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.797107 master-0 kubenswrapper[33572]: I1204 22:36:22.797068 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-etc-podinfo\") pod 
\"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.797291 master-0 kubenswrapper[33572]: I1204 22:36:22.797252 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.797672 master-0 kubenswrapper[33572]: I1204 22:36:22.797643 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-config\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.805384 master-0 kubenswrapper[33572]: I1204 22:36:22.805094 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.852237 master-0 kubenswrapper[33572]: I1204 22:36:22.852183 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h278j\" (UniqueName: \"kubernetes.io/projected/8c4ffc03-eb9a-4bc0-90a8-6d84f25426af-kube-api-access-h278j\") pod \"ironic-inspector-0\" (UID: \"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af\") " pod="openstack/ironic-inspector-0" Dec 04 22:36:22.862195 master-0 kubenswrapper[33572]: I1204 22:36:22.862142 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Dec 04 22:36:22.931904 master-0 kubenswrapper[33572]: I1204 22:36:22.931842 33572 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:36:22.943187 master-0 kubenswrapper[33572]: I1204 22:36:22.943125 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-internal-api-0" event={"ID":"843efe19-fdc4-4678-9088-c7e928e0d216","Type":"ContainerStarted","Data":"239c4b8aee18e2ec817790b28f6912413e023cb4f8d17f06406d3a975e5eefef"} Dec 04 22:36:22.948260 master-0 kubenswrapper[33572]: I1204 22:36:22.948208 33572 scope.go:117] "RemoveContainer" containerID="aa4a2900c265affe04d8e58233c086e56904b2701c0d5a6d642783cc2d4160e5" Dec 04 22:36:22.948579 master-0 kubenswrapper[33572]: E1204 22:36:22.948538 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-57f48bc457-9xrbh_openstack(c0304754-88f4-4abd-b4ff-2daf5fea19a8)\"" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" podUID="c0304754-88f4-4abd-b4ff-2daf5fea19a8" Dec 04 22:36:22.949247 master-0 kubenswrapper[33572]: I1204 22:36:22.949219 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" event={"ID":"ab659283-4ba9-4f71-bdc0-1d19ba9f130a","Type":"ContainerStarted","Data":"96b3049c51ebad8654d0afa8d83559773eec49914bce455806c00c86a536b97f"} Dec 04 22:36:22.987524 master-0 kubenswrapper[33572]: I1204 22:36:22.987255 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-7675d-default-internal-api-0" 
podStartSLOduration=5.987209599 podStartE2EDuration="5.987209599s" podCreationTimestamp="2025-12-04 22:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:36:22.971685155 +0000 UTC m=+1046.699210814" watchObservedRunningTime="2025-12-04 22:36:22.987209599 +0000 UTC m=+1046.714735248" Dec 04 22:36:23.294117 master-0 kubenswrapper[33572]: I1204 22:36:23.294071 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d974c476-m8fbn" Dec 04 22:36:23.599870 master-0 kubenswrapper[33572]: I1204 22:36:23.599820 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 04 22:36:23.962116 master-0 kubenswrapper[33572]: I1204 22:36:23.962051 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" event={"ID":"ab659283-4ba9-4f71-bdc0-1d19ba9f130a","Type":"ContainerStarted","Data":"465cac96b0c6c8efb9b2e94ce392960e0701156a8eb8c964959459baed3ff10e"} Dec 04 22:36:23.964658 master-0 kubenswrapper[33572]: I1204 22:36:23.964599 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerStarted","Data":"534db664e5e0df1d182eb34811d3f1039506b866cb1dad9980d1ac284e55c5bd"} Dec 04 22:36:23.964738 master-0 kubenswrapper[33572]: I1204 22:36:23.964667 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerStarted","Data":"39e3b86a2418c8aea94388dbdaacf9998c9af726ba3043132bf7e6553853cbf0"} Dec 04 22:36:24.552382 master-0 kubenswrapper[33572]: I1204 22:36:24.552330 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:36:24.831600 master-0 kubenswrapper[33572]: I1204 22:36:24.821101 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6967585dd7-g4s2p"] Dec 04 22:36:24.831600 master-0 kubenswrapper[33572]: I1204 22:36:24.821383 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" podUID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" containerName="dnsmasq-dns" containerID="cri-o://49a2aae73c5770bf4ff6df828527d4cb612e447c1e972e69068684b4ed20bda1" gracePeriod=10 Dec 04 22:36:24.976681 master-0 kubenswrapper[33572]: I1204 22:36:24.976641 33572 generic.go:334] "Generic (PLEG): container finished" podID="8c4ffc03-eb9a-4bc0-90a8-6d84f25426af" containerID="534db664e5e0df1d182eb34811d3f1039506b866cb1dad9980d1ac284e55c5bd" exitCode=0 Dec 04 22:36:24.977258 master-0 kubenswrapper[33572]: I1204 22:36:24.976725 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerDied","Data":"534db664e5e0df1d182eb34811d3f1039506b866cb1dad9980d1ac284e55c5bd"} Dec 04 22:36:24.981806 master-0 kubenswrapper[33572]: I1204 22:36:24.981772 33572 generic.go:334] "Generic (PLEG): container finished" podID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" containerID="49a2aae73c5770bf4ff6df828527d4cb612e447c1e972e69068684b4ed20bda1" exitCode=0 Dec 04 22:36:24.981879 master-0 kubenswrapper[33572]: I1204 22:36:24.981825 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" 
event={"ID":"6320a890-2938-48d8-a0c1-60ec3d24ae6f","Type":"ContainerDied","Data":"49a2aae73c5770bf4ff6df828527d4cb612e447c1e972e69068684b4ed20bda1"} Dec 04 22:36:24.983824 master-0 kubenswrapper[33572]: I1204 22:36:24.983582 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-7675d-default-external-api-0" event={"ID":"ab659283-4ba9-4f71-bdc0-1d19ba9f130a","Type":"ContainerStarted","Data":"8dad5e9c80158cc129465408aab53969a275e1f55d0f6becb0a1172dea03e876"} Dec 04 22:36:25.032286 master-0 kubenswrapper[33572]: I1204 22:36:25.032087 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-7675d-default-external-api-0" podStartSLOduration=7.032070899 podStartE2EDuration="7.032070899s" podCreationTimestamp="2025-12-04 22:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:36:25.028786639 +0000 UTC m=+1048.756312288" watchObservedRunningTime="2025-12-04 22:36:25.032070899 +0000 UTC m=+1048.759596548" Dec 04 22:36:26.885220 master-0 kubenswrapper[33572]: I1204 22:36:26.885044 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k62b9"] Dec 04 22:36:26.887103 master-0 kubenswrapper[33572]: I1204 22:36:26.887056 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:26.891589 master-0 kubenswrapper[33572]: I1204 22:36:26.891524 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 22:36:26.891797 master-0 kubenswrapper[33572]: I1204 22:36:26.891760 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 22:36:26.912164 master-0 kubenswrapper[33572]: I1204 22:36:26.912109 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k62b9"] Dec 04 22:36:27.039280 master-0 kubenswrapper[33572]: I1204 22:36:27.039169 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.039551 master-0 kubenswrapper[33572]: I1204 22:36:27.039302 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-config-data\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.039551 master-0 kubenswrapper[33572]: I1204 22:36:27.039456 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-scripts\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.039551 master-0 kubenswrapper[33572]: I1204 22:36:27.039540 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wfb6\" (UniqueName: 
\"kubernetes.io/projected/f26cb867-ec4f-4c99-8185-d397e453bb90-kube-api-access-5wfb6\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.142128 master-0 kubenswrapper[33572]: I1204 22:36:27.141978 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-scripts\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.142370 master-0 kubenswrapper[33572]: I1204 22:36:27.142135 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wfb6\" (UniqueName: \"kubernetes.io/projected/f26cb867-ec4f-4c99-8185-d397e453bb90-kube-api-access-5wfb6\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.142466 master-0 kubenswrapper[33572]: I1204 22:36:27.142406 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.142466 master-0 kubenswrapper[33572]: I1204 22:36:27.142449 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-config-data\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.146604 master-0 kubenswrapper[33572]: I1204 22:36:27.146350 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-config-data\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.147448 master-0 kubenswrapper[33572]: I1204 22:36:27.147411 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-scripts\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.147612 master-0 kubenswrapper[33572]: I1204 22:36:27.147576 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.156784 master-0 kubenswrapper[33572]: I1204 22:36:27.156752 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wfb6\" (UniqueName: \"kubernetes.io/projected/f26cb867-ec4f-4c99-8185-d397e453bb90-kube-api-access-5wfb6\") pod \"nova-cell0-conductor-db-sync-k62b9\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.219403 master-0 
kubenswrapper[33572]: I1204 22:36:27.219338 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:27.604011 master-0 kubenswrapper[33572]: I1204 22:36:27.603962 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:36:27.785657 master-0 kubenswrapper[33572]: I1204 22:36:27.785426 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-swift-storage-0\") pod \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " Dec 04 22:36:27.785657 master-0 kubenswrapper[33572]: I1204 22:36:27.785556 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-svc\") pod \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " Dec 04 22:36:27.785657 master-0 kubenswrapper[33572]: I1204 22:36:27.785649 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wbzf\" (UniqueName: \"kubernetes.io/projected/6320a890-2938-48d8-a0c1-60ec3d24ae6f-kube-api-access-5wbzf\") pod \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " Dec 04 22:36:27.785901 master-0 kubenswrapper[33572]: I1204 22:36:27.785685 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-config\") pod \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " Dec 04 22:36:27.785901 master-0 kubenswrapper[33572]: I1204 22:36:27.785763 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-sb\") pod \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " Dec 04 22:36:27.785965 master-0 kubenswrapper[33572]: I1204 22:36:27.785935 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-nb\") pod \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\" (UID: \"6320a890-2938-48d8-a0c1-60ec3d24ae6f\") " Dec 04 22:36:27.791397 master-0 kubenswrapper[33572]: I1204 22:36:27.791321 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6320a890-2938-48d8-a0c1-60ec3d24ae6f-kube-api-access-5wbzf" (OuterVolumeSpecName: "kube-api-access-5wbzf") pod "6320a890-2938-48d8-a0c1-60ec3d24ae6f" (UID: "6320a890-2938-48d8-a0c1-60ec3d24ae6f"). InnerVolumeSpecName "kube-api-access-5wbzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:27.827720 master-0 kubenswrapper[33572]: I1204 22:36:27.827597 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k62b9"] Dec 04 22:36:27.846791 master-0 kubenswrapper[33572]: I1204 22:36:27.846733 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6320a890-2938-48d8-a0c1-60ec3d24ae6f" (UID: "6320a890-2938-48d8-a0c1-60ec3d24ae6f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:27.846940 master-0 kubenswrapper[33572]: I1204 22:36:27.846879 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-config" (OuterVolumeSpecName: "config") pod "6320a890-2938-48d8-a0c1-60ec3d24ae6f" (UID: "6320a890-2938-48d8-a0c1-60ec3d24ae6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:27.852212 master-0 kubenswrapper[33572]: I1204 22:36:27.851985 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6320a890-2938-48d8-a0c1-60ec3d24ae6f" (UID: "6320a890-2938-48d8-a0c1-60ec3d24ae6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:27.853065 master-0 kubenswrapper[33572]: I1204 22:36:27.852958 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6320a890-2938-48d8-a0c1-60ec3d24ae6f" (UID: "6320a890-2938-48d8-a0c1-60ec3d24ae6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:27.859396 master-0 kubenswrapper[33572]: I1204 22:36:27.859357 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6320a890-2938-48d8-a0c1-60ec3d24ae6f" (UID: "6320a890-2938-48d8-a0c1-60ec3d24ae6f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:36:27.889234 master-0 kubenswrapper[33572]: I1204 22:36:27.889184 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:27.889234 master-0 kubenswrapper[33572]: I1204 22:36:27.889228 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:27.889662 master-0 kubenswrapper[33572]: I1204 22:36:27.889242 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:27.889662 master-0 kubenswrapper[33572]: I1204 22:36:27.889258 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:27.889662 master-0 kubenswrapper[33572]: I1204 22:36:27.889271 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wbzf\" (UniqueName: \"kubernetes.io/projected/6320a890-2938-48d8-a0c1-60ec3d24ae6f-kube-api-access-5wbzf\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:27.889662 master-0 kubenswrapper[33572]: I1204 22:36:27.889282 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6320a890-2938-48d8-a0c1-60ec3d24ae6f-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:28.047650 master-0 kubenswrapper[33572]: I1204 22:36:28.044617 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerStarted","Data":"c65c37531be4d6ed5a6a43cb940dc1871300aa573191d15966b00cbe8f544812"} Dec 04 22:36:28.047986 master-0 kubenswrapper[33572]: I1204 22:36:28.047942 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k62b9" event={"ID":"f26cb867-ec4f-4c99-8185-d397e453bb90","Type":"ContainerStarted","Data":"913d6771150fba22bcb0302e1d64d4cb2f7162721eed308297ab0fe76399767b"} Dec 04 22:36:28.051872 master-0 kubenswrapper[33572]: I1204 22:36:28.051849 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" Dec 04 22:36:28.051944 master-0 kubenswrapper[33572]: I1204 22:36:28.051850 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6967585dd7-g4s2p" event={"ID":"6320a890-2938-48d8-a0c1-60ec3d24ae6f","Type":"ContainerDied","Data":"2bf0706b2df6f0fe0d13b49487ac2b6279b49ce1d6ec943951e98a0d0c97db61"} Dec 04 22:36:28.051944 master-0 kubenswrapper[33572]: I1204 22:36:28.051920 33572 scope.go:117] "RemoveContainer" containerID="49a2aae73c5770bf4ff6df828527d4cb612e447c1e972e69068684b4ed20bda1" Dec 04 22:36:28.056458 master-0 kubenswrapper[33572]: I1204 22:36:28.056424 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerStarted","Data":"ffd36e5d6e150b002db06f2232e39d6d5924ff66b522113a2b1e78a73a390535"} Dec 04 22:36:28.082911 master-0 kubenswrapper[33572]: I1204 22:36:28.082769 33572 scope.go:117] "RemoveContainer" containerID="fee7c9db32cace22190856c6fa1a721fec110e11436888a08215dab910970945" Dec 04 22:36:28.121331 master-0 kubenswrapper[33572]: I1204 22:36:28.121202 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6967585dd7-g4s2p"] Dec 04 22:36:28.129822 master-0 kubenswrapper[33572]: I1204 22:36:28.129685 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6967585dd7-g4s2p"] Dec 04 22:36:28.548606 master-0 kubenswrapper[33572]: I1204 22:36:28.548472 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" path="/var/lib/kubelet/pods/6320a890-2938-48d8-a0c1-60ec3d24ae6f/volumes" Dec 04 22:36:29.062426 master-0 kubenswrapper[33572]: I1204 22:36:29.062350 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:29.062426 master-0 kubenswrapper[33572]: I1204 22:36:29.062400 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:29.074205 master-0 kubenswrapper[33572]: I1204 22:36:29.073310 33572 generic.go:334] "Generic (PLEG): container finished" podID="8c4ffc03-eb9a-4bc0-90a8-6d84f25426af" containerID="c65c37531be4d6ed5a6a43cb940dc1871300aa573191d15966b00cbe8f544812" exitCode=0 Dec 04 22:36:29.074205 master-0 kubenswrapper[33572]: I1204 22:36:29.073437 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerDied","Data":"c65c37531be4d6ed5a6a43cb940dc1871300aa573191d15966b00cbe8f544812"} Dec 04 22:36:29.117835 master-0 kubenswrapper[33572]: I1204 22:36:29.117614 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:29.154645 master-0 kubenswrapper[33572]: I1204 22:36:29.149253 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:29.765208 master-0 kubenswrapper[33572]: I1204 22:36:29.765100 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:29.765208 master-0 kubenswrapper[33572]: I1204 22:36:29.765149 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-7675d-default-external-api-0" Dec 04 
22:36:29.805775 master-0 kubenswrapper[33572]: I1204 22:36:29.805676 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:29.830917 master-0 kubenswrapper[33572]: I1204 22:36:29.830858 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:30.096914 master-0 kubenswrapper[33572]: I1204 22:36:30.096872 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:30.096914 master-0 kubenswrapper[33572]: I1204 22:36:30.096919 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:30.097740 master-0 kubenswrapper[33572]: I1204 22:36:30.096935 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:30.097740 master-0 kubenswrapper[33572]: I1204 22:36:30.097382 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:32.013850 master-0 kubenswrapper[33572]: I1204 22:36:32.013790 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:32.129582 master-0 kubenswrapper[33572]: I1204 22:36:32.126178 33572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:36:32.129582 master-0 kubenswrapper[33572]: I1204 22:36:32.126144 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerStarted","Data":"ea07e85d6002aa970e621b6b431e185d3c08ef47c87913b629b292014c9ce1bf"} Dec 04 22:36:32.129582 master-0 kubenswrapper[33572]: I1204 22:36:32.126262 33572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:36:32.129582 master-0 kubenswrapper[33572]: I1204 22:36:32.126301 33572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 22:36:32.246385 master-0 kubenswrapper[33572]: I1204 22:36:32.246322 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:32.247989 master-0 kubenswrapper[33572]: I1204 22:36:32.247963 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-7675d-default-internal-api-0" Dec 04 22:36:32.269789 master-0 kubenswrapper[33572]: I1204 22:36:32.269750 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-7675d-default-external-api-0" Dec 04 22:36:34.527795 master-0 kubenswrapper[33572]: I1204 22:36:34.527726 33572 scope.go:117] "RemoveContainer" containerID="aa4a2900c265affe04d8e58233c086e56904b2701c0d5a6d642783cc2d4160e5" Dec 04 22:36:34.528656 master-0 kubenswrapper[33572]: E1204 22:36:34.528024 33572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-57f48bc457-9xrbh_openstack(c0304754-88f4-4abd-b4ff-2daf5fea19a8)\"" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" podUID="c0304754-88f4-4abd-b4ff-2daf5fea19a8" Dec 04 22:36:38.235163 master-0 kubenswrapper[33572]: I1204 22:36:38.235008 33572 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerStarted","Data":"9e54b66385ed6a26745d0492fc24f0b3c122263f9d65fcc69e6ec92df53c8ffa"} Dec 04 22:36:38.239263 master-0 kubenswrapper[33572]: I1204 22:36:38.238528 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k62b9" event={"ID":"f26cb867-ec4f-4c99-8185-d397e453bb90","Type":"ContainerStarted","Data":"8c5dbec18cb7e4add3367a1efac5162eeeaed9f7beb683bd63cda3b8fa6aecf5"} Dec 04 22:36:38.265640 master-0 kubenswrapper[33572]: I1204 22:36:38.264388 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-k62b9" podStartSLOduration=2.309657942 podStartE2EDuration="12.264340786s" podCreationTimestamp="2025-12-04 22:36:26 +0000 UTC" firstStartedPulling="2025-12-04 22:36:27.853759003 +0000 UTC m=+1051.581284652" lastFinishedPulling="2025-12-04 22:36:37.808441837 +0000 UTC m=+1061.535967496" observedRunningTime="2025-12-04 22:36:38.259434601 +0000 UTC m=+1061.986960290" watchObservedRunningTime="2025-12-04 22:36:38.264340786 +0000 UTC m=+1061.991866475" Dec 04 22:36:39.253650 master-0 kubenswrapper[33572]: I1204 22:36:39.253574 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerStarted","Data":"e9ce9fc89109aef14e2299d2bc4610aea875a1136af4de67fb53089f83b0eb18"} Dec 04 22:36:40.269636 master-0 kubenswrapper[33572]: I1204 22:36:40.269566 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerStarted","Data":"abd883d57fb035b346e2920e41bace8d90a617c8cee71ceac355725ccb1b9f30"} Dec 04 22:36:40.269636 master-0 kubenswrapper[33572]: I1204 22:36:40.269628 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"8c4ffc03-eb9a-4bc0-90a8-6d84f25426af","Type":"ContainerStarted","Data":"fc2e2578ffc6a0a476df5d81fe6822c3dd875bcea88a9914d332d5a691a2c8e8"} Dec 04 22:36:40.270872 master-0 kubenswrapper[33572]: I1204 22:36:40.269925 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 04 22:36:40.270872 master-0 kubenswrapper[33572]: I1204 22:36:40.269964 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 04 22:36:40.318881 master-0 kubenswrapper[33572]: I1204 22:36:40.318775 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=15.984498497 podStartE2EDuration="19.318727655s" podCreationTimestamp="2025-12-04 22:36:21 +0000 UTC" firstStartedPulling="2025-12-04 22:36:23.966370353 +0000 UTC m=+1047.693896002" lastFinishedPulling="2025-12-04 22:36:27.300599491 +0000 UTC m=+1051.028125160" observedRunningTime="2025-12-04 22:36:40.310087139 +0000 UTC m=+1064.037612788" watchObservedRunningTime="2025-12-04 22:36:40.318727655 +0000 UTC m=+1064.046253324" Dec 04 22:36:42.863042 master-0 kubenswrapper[33572]: I1204 22:36:42.862970 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Dec 04 22:36:42.863042 master-0 kubenswrapper[33572]: I1204 22:36:42.863031 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Dec 04 
22:36:42.863042 master-0 kubenswrapper[33572]: I1204 22:36:42.863048 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 04 22:36:42.863802 master-0 kubenswrapper[33572]: I1204 22:36:42.863059 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 04 22:36:42.903104 master-0 kubenswrapper[33572]: I1204 22:36:42.903048 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Dec 04 22:36:42.906426 master-0 kubenswrapper[33572]: I1204 22:36:42.906399 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Dec 04 22:36:42.933337 master-0 kubenswrapper[33572]: I1204 22:36:42.933261 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 04 22:36:43.316489 master-0 kubenswrapper[33572]: I1204 22:36:43.314600 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 04 22:36:43.329357 master-0 kubenswrapper[33572]: I1204 22:36:43.329287 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 04 22:36:43.333112 master-0 kubenswrapper[33572]: I1204 22:36:43.333045 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 04 22:36:46.536627 master-0 kubenswrapper[33572]: I1204 22:36:46.536557 33572 scope.go:117] "RemoveContainer" containerID="aa4a2900c265affe04d8e58233c086e56904b2701c0d5a6d642783cc2d4160e5" Dec 04 22:36:47.377993 master-0 kubenswrapper[33572]: I1204 22:36:47.377930 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" event={"ID":"c0304754-88f4-4abd-b4ff-2daf5fea19a8","Type":"ContainerStarted","Data":"1d21dab0394148a7f08b11068824537b26e55db12591488950e71c794083cceb"} Dec 04 22:36:47.378268 master-0 kubenswrapper[33572]: I1204 22:36:47.378156 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:36:53.038694 master-0 kubenswrapper[33572]: I1204 22:36:53.038604 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-57f48bc457-9xrbh" Dec 04 22:36:54.492305 master-0 kubenswrapper[33572]: I1204 22:36:54.492227 33572 generic.go:334] "Generic (PLEG): container finished" podID="f26cb867-ec4f-4c99-8185-d397e453bb90" containerID="8c5dbec18cb7e4add3367a1efac5162eeeaed9f7beb683bd63cda3b8fa6aecf5" exitCode=0 Dec 04 22:36:54.493894 master-0 kubenswrapper[33572]: I1204 22:36:54.492343 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k62b9" event={"ID":"f26cb867-ec4f-4c99-8185-d397e453bb90","Type":"ContainerDied","Data":"8c5dbec18cb7e4add3367a1efac5162eeeaed9f7beb683bd63cda3b8fa6aecf5"} Dec 04 22:36:56.125470 master-0 kubenswrapper[33572]: I1204 22:36:56.125374 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:56.242363 master-0 kubenswrapper[33572]: I1204 22:36:56.241387 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wfb6\" (UniqueName: \"kubernetes.io/projected/f26cb867-ec4f-4c99-8185-d397e453bb90-kube-api-access-5wfb6\") pod \"f26cb867-ec4f-4c99-8185-d397e453bb90\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " Dec 04 22:36:56.242363 master-0 kubenswrapper[33572]: I1204 22:36:56.241541 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-combined-ca-bundle\") pod \"f26cb867-ec4f-4c99-8185-d397e453bb90\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " Dec 04 22:36:56.242363 master-0 kubenswrapper[33572]: I1204 22:36:56.241627 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-config-data\") pod \"f26cb867-ec4f-4c99-8185-d397e453bb90\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " Dec 04 22:36:56.242363 master-0 kubenswrapper[33572]: I1204 22:36:56.241842 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-scripts\") pod \"f26cb867-ec4f-4c99-8185-d397e453bb90\" (UID: \"f26cb867-ec4f-4c99-8185-d397e453bb90\") " Dec 04 22:36:56.256686 master-0 kubenswrapper[33572]: I1204 22:36:56.256529 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-scripts" (OuterVolumeSpecName: "scripts") pod "f26cb867-ec4f-4c99-8185-d397e453bb90" (UID: "f26cb867-ec4f-4c99-8185-d397e453bb90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:56.259182 master-0 kubenswrapper[33572]: I1204 22:36:56.259075 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26cb867-ec4f-4c99-8185-d397e453bb90-kube-api-access-5wfb6" (OuterVolumeSpecName: "kube-api-access-5wfb6") pod "f26cb867-ec4f-4c99-8185-d397e453bb90" (UID: "f26cb867-ec4f-4c99-8185-d397e453bb90"). InnerVolumeSpecName "kube-api-access-5wfb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:36:56.274391 master-0 kubenswrapper[33572]: I1204 22:36:56.274315 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f26cb867-ec4f-4c99-8185-d397e453bb90" (UID: "f26cb867-ec4f-4c99-8185-d397e453bb90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:56.313840 master-0 kubenswrapper[33572]: I1204 22:36:56.313656 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-config-data" (OuterVolumeSpecName: "config-data") pod "f26cb867-ec4f-4c99-8185-d397e453bb90" (UID: "f26cb867-ec4f-4c99-8185-d397e453bb90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:36:56.349186 master-0 kubenswrapper[33572]: I1204 22:36:56.348423 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:56.349186 master-0 kubenswrapper[33572]: I1204 22:36:56.348496 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wfb6\" (UniqueName: \"kubernetes.io/projected/f26cb867-ec4f-4c99-8185-d397e453bb90-kube-api-access-5wfb6\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:56.349186 master-0 kubenswrapper[33572]: I1204 22:36:56.348565 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:56.349186 master-0 kubenswrapper[33572]: I1204 22:36:56.348594 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26cb867-ec4f-4c99-8185-d397e453bb90-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:36:56.558471 master-0 kubenswrapper[33572]: I1204 22:36:56.558401 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k62b9" Dec 04 22:36:56.563063 master-0 kubenswrapper[33572]: I1204 22:36:56.562984 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k62b9" event={"ID":"f26cb867-ec4f-4c99-8185-d397e453bb90","Type":"ContainerDied","Data":"913d6771150fba22bcb0302e1d64d4cb2f7162721eed308297ab0fe76399767b"} Dec 04 22:36:56.563063 master-0 kubenswrapper[33572]: I1204 22:36:56.563058 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="913d6771150fba22bcb0302e1d64d4cb2f7162721eed308297ab0fe76399767b" Dec 04 22:36:56.709224 master-0 kubenswrapper[33572]: I1204 22:36:56.709140 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 22:36:56.709898 master-0 kubenswrapper[33572]: E1204 22:36:56.709681 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26cb867-ec4f-4c99-8185-d397e453bb90" containerName="nova-cell0-conductor-db-sync" Dec 04 22:36:56.709898 master-0 kubenswrapper[33572]: I1204 22:36:56.709701 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26cb867-ec4f-4c99-8185-d397e453bb90" containerName="nova-cell0-conductor-db-sync" Dec 04 22:36:56.709898 master-0 kubenswrapper[33572]: E1204 22:36:56.709712 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" containerName="dnsmasq-dns" Dec 04 22:36:56.709898 master-0 kubenswrapper[33572]: I1204 22:36:56.709718 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" containerName="dnsmasq-dns" Dec 04 22:36:56.709898 master-0 kubenswrapper[33572]: E1204 22:36:56.709731 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" containerName="init" Dec 04 22:36:56.709898 master-0 kubenswrapper[33572]: I1204 22:36:56.709737 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" containerName="init" Dec 04 22:36:56.710357 master-0 kubenswrapper[33572]: I1204 22:36:56.709988 33572 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6320a890-2938-48d8-a0c1-60ec3d24ae6f" containerName="dnsmasq-dns" Dec 04 22:36:56.710357 master-0 kubenswrapper[33572]: I1204 22:36:56.710011 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26cb867-ec4f-4c99-8185-d397e453bb90" containerName="nova-cell0-conductor-db-sync" Dec 04 22:36:56.710875 master-0 kubenswrapper[33572]: I1204 22:36:56.710765 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.714971 master-0 kubenswrapper[33572]: I1204 22:36:56.714922 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 22:36:56.720547 master-0 kubenswrapper[33572]: I1204 22:36:56.720469 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 22:36:56.757589 master-0 kubenswrapper[33572]: I1204 22:36:56.757354 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f5d81bf-ea13-4465-9386-5258cc486507-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.757823 master-0 kubenswrapper[33572]: I1204 22:36:56.757706 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5d81bf-ea13-4465-9386-5258cc486507-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.757823 master-0 kubenswrapper[33572]: I1204 22:36:56.757741 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45c92\" (UniqueName: \"kubernetes.io/projected/6f5d81bf-ea13-4465-9386-5258cc486507-kube-api-access-45c92\") pod \"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.861538 master-0 kubenswrapper[33572]: I1204 22:36:56.861349 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f5d81bf-ea13-4465-9386-5258cc486507-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.861941 master-0 kubenswrapper[33572]: I1204 22:36:56.861893 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5d81bf-ea13-4465-9386-5258cc486507-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.861991 master-0 kubenswrapper[33572]: I1204 22:36:56.861964 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45c92\" (UniqueName: \"kubernetes.io/projected/6f5d81bf-ea13-4465-9386-5258cc486507-kube-api-access-45c92\") pod \"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.866347 master-0 kubenswrapper[33572]: I1204 22:36:56.866287 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5d81bf-ea13-4465-9386-5258cc486507-combined-ca-bundle\") pod 
\"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.874649 master-0 kubenswrapper[33572]: I1204 22:36:56.874605 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f5d81bf-ea13-4465-9386-5258cc486507-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:56.887217 master-0 kubenswrapper[33572]: I1204 22:36:56.887100 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45c92\" (UniqueName: \"kubernetes.io/projected/6f5d81bf-ea13-4465-9386-5258cc486507-kube-api-access-45c92\") pod \"nova-cell0-conductor-0\" (UID: \"6f5d81bf-ea13-4465-9386-5258cc486507\") " pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:57.053033 master-0 kubenswrapper[33572]: I1204 22:36:57.052914 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:57.573587 master-0 kubenswrapper[33572]: W1204 22:36:57.568015 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f5d81bf_ea13_4465_9386_5258cc486507.slice/crio-fd2d28d936acd02d56f8e44bd8f2b838f2f297276862a943d2341fa253d7565b WatchSource:0}: Error finding container fd2d28d936acd02d56f8e44bd8f2b838f2f297276862a943d2341fa253d7565b: Status 404 returned error can't find the container with id fd2d28d936acd02d56f8e44bd8f2b838f2f297276862a943d2341fa253d7565b Dec 04 22:36:57.585591 master-0 kubenswrapper[33572]: I1204 22:36:57.580405 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 22:36:58.585074 master-0 kubenswrapper[33572]: I1204 22:36:58.584912 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f5d81bf-ea13-4465-9386-5258cc486507","Type":"ContainerStarted","Data":"b64434510bce209dabf83fb9b08701f78c4c6c4240f5883a866c2cbce76ce3fc"} Dec 04 22:36:58.585074 master-0 kubenswrapper[33572]: I1204 22:36:58.585060 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6f5d81bf-ea13-4465-9386-5258cc486507","Type":"ContainerStarted","Data":"fd2d28d936acd02d56f8e44bd8f2b838f2f297276862a943d2341fa253d7565b"} Dec 04 22:36:58.586487 master-0 kubenswrapper[33572]: I1204 22:36:58.585215 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 22:36:58.625529 master-0 kubenswrapper[33572]: I1204 22:36:58.625356 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.625325004 podStartE2EDuration="2.625325004s" podCreationTimestamp="2025-12-04 22:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:36:58.615076485 +0000 UTC m=+1082.342602154" watchObservedRunningTime="2025-12-04 22:36:58.625325004 +0000 UTC m=+1082.352850693" Dec 04 22:37:02.123562 master-0 kubenswrapper[33572]: I1204 22:37:02.119983 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 22:37:02.686532 master-0 kubenswrapper[33572]: I1204 22:37:02.683466 33572 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-cell-mapping-96wfh"] Dec 04 22:37:02.686532 master-0 kubenswrapper[33572]: I1204 22:37:02.685326 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.692538 master-0 kubenswrapper[33572]: I1204 22:37:02.689245 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 22:37:02.697433 master-0 kubenswrapper[33572]: I1204 22:37:02.693192 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 22:37:02.725526 master-0 kubenswrapper[33572]: I1204 22:37:02.724250 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-96wfh"] Dec 04 22:37:02.812226 master-0 kubenswrapper[33572]: I1204 22:37:02.810293 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Dec 04 22:37:02.843659 master-0 kubenswrapper[33572]: I1204 22:37:02.815827 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:02.846245 master-0 kubenswrapper[33572]: I1204 22:37:02.846186 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data" Dec 04 22:37:02.873541 master-0 kubenswrapper[33572]: I1204 22:37:02.870531 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Dec 04 22:37:02.892523 master-0 kubenswrapper[33572]: I1204 22:37:02.887698 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-config-data\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.892523 master-0 kubenswrapper[33572]: I1204 22:37:02.887770 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj6n\" (UniqueName: \"kubernetes.io/projected/384308f7-6942-4013-9bcd-e56fde1ab09f-kube-api-access-wtj6n\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.892523 master-0 kubenswrapper[33572]: I1204 22:37:02.887824 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14343bc6-14e1-4b02-9753-ab42edb7d682-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:02.892523 master-0 kubenswrapper[33572]: I1204 22:37:02.887895 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-scripts\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.892523 master-0 kubenswrapper[33572]: I1204 22:37:02.887945 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/14343bc6-14e1-4b02-9753-ab42edb7d682-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:02.892523 master-0 kubenswrapper[33572]: I1204 22:37:02.887982 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.892523 master-0 kubenswrapper[33572]: I1204 22:37:02.888027 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s546r\" (UniqueName: \"kubernetes.io/projected/14343bc6-14e1-4b02-9753-ab42edb7d682-kube-api-access-s546r\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:02.945334 master-0 kubenswrapper[33572]: I1204 22:37:02.945194 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:02.947254 master-0 kubenswrapper[33572]: I1204 22:37:02.947222 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:02.951871 master-0 kubenswrapper[33572]: I1204 22:37:02.951836 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 22:37:02.960722 master-0 kubenswrapper[33572]: I1204 22:37:02.959542 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:02.990099 master-0 kubenswrapper[33572]: I1204 22:37:02.990028 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s546r\" (UniqueName: \"kubernetes.io/projected/14343bc6-14e1-4b02-9753-ab42edb7d682-kube-api-access-s546r\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:02.990099 master-0 kubenswrapper[33572]: I1204 22:37:02.990090 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-config-data\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.990099 master-0 kubenswrapper[33572]: I1204 22:37:02.990118 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ec571-8d99-48b0-a466-78284aab8064-logs\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:02.990099 master-0 kubenswrapper[33572]: I1204 22:37:02.990145 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtj6n\" (UniqueName: \"kubernetes.io/projected/384308f7-6942-4013-9bcd-e56fde1ab09f-kube-api-access-wtj6n\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.990819 master-0 kubenswrapper[33572]: I1204 22:37:02.990182 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/14343bc6-14e1-4b02-9753-ab42edb7d682-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:02.990819 master-0 kubenswrapper[33572]: I1204 22:37:02.990245 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvw8h\" (UniqueName: \"kubernetes.io/projected/c57ec571-8d99-48b0-a466-78284aab8064-kube-api-access-bvw8h\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:02.990819 master-0 kubenswrapper[33572]: I1204 22:37:02.990263 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-scripts\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.990819 master-0 kubenswrapper[33572]: I1204 22:37:02.990308 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-config-data\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:02.990819 master-0 kubenswrapper[33572]: I1204 22:37:02.990326 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:02.990819 master-0 kubenswrapper[33572]: I1204 22:37:02.990346 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14343bc6-14e1-4b02-9753-ab42edb7d682-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:02.990819 master-0 kubenswrapper[33572]: I1204 22:37:02.990378 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:02.996085 master-0 kubenswrapper[33572]: I1204 22:37:02.995044 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-scripts\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:03.007610 master-0 kubenswrapper[33572]: I1204 22:37:02.996403 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14343bc6-14e1-4b02-9753-ab42edb7d682-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:03.007610 master-0 kubenswrapper[33572]: I1204 22:37:02.998312 33572 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14343bc6-14e1-4b02-9753-ab42edb7d682-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:03.007610 master-0 kubenswrapper[33572]: I1204 22:37:03.003263 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:03.010495 master-0 kubenswrapper[33572]: I1204 22:37:03.010449 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtj6n\" (UniqueName: \"kubernetes.io/projected/384308f7-6942-4013-9bcd-e56fde1ab09f-kube-api-access-wtj6n\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:03.021230 master-0 kubenswrapper[33572]: I1204 22:37:03.021180 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s546r\" (UniqueName: \"kubernetes.io/projected/14343bc6-14e1-4b02-9753-ab42edb7d682-kube-api-access-s546r\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"14343bc6-14e1-4b02-9753-ab42edb7d682\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:03.052524 master-0 kubenswrapper[33572]: I1204 22:37:03.040238 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-config-data\") pod \"nova-cell0-cell-mapping-96wfh\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:03.066770 master-0 kubenswrapper[33572]: I1204 22:37:03.055115 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:03.066770 master-0 kubenswrapper[33572]: I1204 22:37:03.062960 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:03.066770 master-0 kubenswrapper[33572]: I1204 22:37:03.064460 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.067533 master-0 kubenswrapper[33572]: I1204 22:37:03.067107 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 22:37:03.093529 master-0 kubenswrapper[33572]: I1204 22:37:03.082163 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:03.097516 master-0 kubenswrapper[33572]: I1204 22:37:03.094167 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ec571-8d99-48b0-a466-78284aab8064-logs\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:03.097516 master-0 kubenswrapper[33572]: I1204 22:37:03.094350 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvw8h\" (UniqueName: \"kubernetes.io/projected/c57ec571-8d99-48b0-a466-78284aab8064-kube-api-access-bvw8h\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:03.097516 master-0 kubenswrapper[33572]: I1204 22:37:03.094381 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.097516 master-0 kubenswrapper[33572]: I1204 22:37:03.094420 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.097516 master-0 kubenswrapper[33572]: I1204 22:37:03.094476 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-config-data\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:03.097516 master-0 kubenswrapper[33572]: I1204 22:37:03.094522 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:03.097516 master-0 kubenswrapper[33572]: I1204 22:37:03.094564 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqsjj\" (UniqueName: \"kubernetes.io/projected/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-kube-api-access-fqsjj\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.097516 master-0 kubenswrapper[33572]: I1204 22:37:03.095232 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ec571-8d99-48b0-a466-78284aab8064-logs\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:03.101299 master-0 kubenswrapper[33572]: I1204 22:37:03.101239 33572 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-config-data\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:03.111680 master-0 kubenswrapper[33572]: I1204 22:37:03.111417 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:03.125530 master-0 kubenswrapper[33572]: I1204 22:37:03.121563 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:03.125530 master-0 kubenswrapper[33572]: I1204 22:37:03.123155 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:37:03.125530 master-0 kubenswrapper[33572]: I1204 22:37:03.124566 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 22:37:03.138007 master-0 kubenswrapper[33572]: I1204 22:37:03.131268 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvw8h\" (UniqueName: \"kubernetes.io/projected/c57ec571-8d99-48b0-a466-78284aab8064-kube-api-access-bvw8h\") pod \"nova-api-0\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " pod="openstack/nova-api-0" Dec 04 22:37:03.145751 master-0 kubenswrapper[33572]: I1204 22:37:03.145689 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:03.171526 master-0 kubenswrapper[33572]: I1204 22:37:03.170904 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:03.204711 master-0 kubenswrapper[33572]: I1204 22:37:03.197797 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqsjj\" (UniqueName: \"kubernetes.io/projected/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-kube-api-access-fqsjj\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.204711 master-0 kubenswrapper[33572]: I1204 22:37:03.198614 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.204711 master-0 kubenswrapper[33572]: I1204 22:37:03.198642 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.207614 master-0 kubenswrapper[33572]: I1204 22:37:03.206402 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.212472 master-0 kubenswrapper[33572]: I1204 22:37:03.209477 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.245222 master-0 kubenswrapper[33572]: I1204 22:37:03.245155 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:03.248532 master-0 kubenswrapper[33572]: I1204 22:37:03.247341 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:03.250222 master-0 kubenswrapper[33572]: I1204 22:37:03.250170 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 22:37:03.266975 master-0 kubenswrapper[33572]: I1204 22:37:03.266694 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqsjj\" (UniqueName: \"kubernetes.io/projected/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-kube-api-access-fqsjj\") pod \"nova-cell1-novncproxy-0\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.277948 master-0 kubenswrapper[33572]: I1204 22:37:03.277892 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:03.286642 master-0 kubenswrapper[33572]: I1204 22:37:03.286281 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:03.303090 master-0 kubenswrapper[33572]: I1204 22:37:03.302980 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.303090 master-0 kubenswrapper[33572]: I1204 22:37:03.303079 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp6xq\" (UniqueName: \"kubernetes.io/projected/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-kube-api-access-gp6xq\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.303352 master-0 kubenswrapper[33572]: I1204 22:37:03.303159 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:03.303352 master-0 kubenswrapper[33572]: I1204 22:37:03.303221 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-config-data\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.303352 master-0 kubenswrapper[33572]: I1204 22:37:03.303286 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-logs\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.303352 master-0 kubenswrapper[33572]: I1204 22:37:03.303331 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-config-data\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:03.303473 master-0 kubenswrapper[33572]: I1204 22:37:03.303420 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnwr\" (UniqueName: \"kubernetes.io/projected/83a5bf22-49cc-4341-926c-62c129088b57-kube-api-access-mwnwr\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:03.344174 master-0 kubenswrapper[33572]: I1204 22:37:03.344132 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79b46df4b9-2z4nn"] Dec 04 22:37:03.346249 master-0 kubenswrapper[33572]: I1204 22:37:03.346223 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.355418 master-0 kubenswrapper[33572]: I1204 22:37:03.354817 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404658 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp6xq\" (UniqueName: \"kubernetes.io/projected/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-kube-api-access-gp6xq\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404698 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404735 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404768 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-config-data\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404802 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-logs\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404831 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-svc\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404848 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-config-data\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404892 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-nb\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.404929 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnwr\" (UniqueName: \"kubernetes.io/projected/83a5bf22-49cc-4341-926c-62c129088b57-kube-api-access-mwnwr\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 
22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.405039 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-sb\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.405076 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6rm\" (UniqueName: \"kubernetes.io/projected/9408932d-d92d-4d1a-bd67-7e90e89b2341-kube-api-access-rn6rm\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.405098 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-swift-storage-0\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.405551 master-0 kubenswrapper[33572]: I1204 22:37:03.405127 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-config\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.408324 master-0 kubenswrapper[33572]: I1204 22:37:03.408248 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-logs\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.413174 master-0 kubenswrapper[33572]: I1204 22:37:03.412435 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:03.413534 master-0 kubenswrapper[33572]: I1204 22:37:03.413444 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.425087 master-0 kubenswrapper[33572]: I1204 22:37:03.425050 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b46df4b9-2z4nn"] Dec 04 22:37:03.465854 master-0 kubenswrapper[33572]: I1204 22:37:03.448215 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp6xq\" (UniqueName: \"kubernetes.io/projected/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-kube-api-access-gp6xq\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.467719 master-0 kubenswrapper[33572]: I1204 22:37:03.467576 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-config-data\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:03.469239 master-0 kubenswrapper[33572]: I1204 22:37:03.469210 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-config-data\") pod \"nova-metadata-0\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " pod="openstack/nova-metadata-0" Dec 04 22:37:03.494520 master-0 kubenswrapper[33572]: I1204 22:37:03.474159 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnwr\" (UniqueName: \"kubernetes.io/projected/83a5bf22-49cc-4341-926c-62c129088b57-kube-api-access-mwnwr\") pod \"nova-scheduler-0\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:03.519678 master-0 kubenswrapper[33572]: I1204 22:37:03.510985 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-nb\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.519678 master-0 kubenswrapper[33572]: I1204 22:37:03.511116 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-sb\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.519678 master-0 kubenswrapper[33572]: I1204 22:37:03.511146 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6rm\" (UniqueName: \"kubernetes.io/projected/9408932d-d92d-4d1a-bd67-7e90e89b2341-kube-api-access-rn6rm\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.519678 master-0 kubenswrapper[33572]: I1204 22:37:03.511169 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-swift-storage-0\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.519678 master-0 kubenswrapper[33572]: I1204 22:37:03.511198 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-config\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.519678 master-0 kubenswrapper[33572]: I1204 22:37:03.511291 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-svc\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.546475 master-0 kubenswrapper[33572]: I1204 22:37:03.526144 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-svc\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.546475 master-0 kubenswrapper[33572]: I1204 22:37:03.526596 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-sb\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.546475 master-0 kubenswrapper[33572]: I1204 22:37:03.526581 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-swift-storage-0\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.546475 master-0 kubenswrapper[33572]: I1204 22:37:03.546415 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-nb\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.548586 master-0 kubenswrapper[33572]: I1204 22:37:03.547449 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-config\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.555371 master-0 kubenswrapper[33572]: I1204 22:37:03.549863 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6rm\" (UniqueName: \"kubernetes.io/projected/9408932d-d92d-4d1a-bd67-7e90e89b2341-kube-api-access-rn6rm\") pod \"dnsmasq-dns-79b46df4b9-2z4nn\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.578597 master-0 kubenswrapper[33572]: I1204 22:37:03.576602 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-96wfh"] Dec 04 22:37:03.698540 master-0 kubenswrapper[33572]: I1204 22:37:03.694904 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-96wfh" event={"ID":"384308f7-6942-4013-9bcd-e56fde1ab09f","Type":"ContainerStarted","Data":"3466a92dc262cf71b24d5565919c7162b749983a013f4412fef97bee14dd0bdf"} Dec 04 22:37:03.713701 master-0 kubenswrapper[33572]: I1204 22:37:03.713023 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:37:03.764588 master-0 kubenswrapper[33572]: I1204 22:37:03.764491 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:03.848222 master-0 kubenswrapper[33572]: I1204 22:37:03.848158 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:03.954030 master-0 kubenswrapper[33572]: I1204 22:37:03.937175 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8rh5h"] Dec 04 22:37:03.954030 master-0 kubenswrapper[33572]: I1204 22:37:03.940175 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:03.954030 master-0 kubenswrapper[33572]: I1204 22:37:03.953184 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 04 22:37:03.954030 master-0 kubenswrapper[33572]: I1204 22:37:03.953397 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 22:37:03.983593 master-0 kubenswrapper[33572]: I1204 22:37:03.983545 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8rh5h"] Dec 04 22:37:04.055710 master-0 kubenswrapper[33572]: I1204 22:37:04.055004 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Dec 04 22:37:04.063750 master-0 kubenswrapper[33572]: I1204 22:37:04.063688 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.063887 master-0 kubenswrapper[33572]: I1204 22:37:04.063838 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-scripts\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.064023 master-0 kubenswrapper[33572]: I1204 22:37:04.064000 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-config-data\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.064109 master-0 kubenswrapper[33572]: I1204 22:37:04.064087 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q6b9\" (UniqueName: \"kubernetes.io/projected/26e68458-b43a-471d-8d30-bfe010b365f3-kube-api-access-4q6b9\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.169410 master-0 kubenswrapper[33572]: I1204 22:37:04.169359 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-config-data\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.171492 master-0 kubenswrapper[33572]: I1204 22:37:04.171469 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q6b9\" (UniqueName: 
\"kubernetes.io/projected/26e68458-b43a-471d-8d30-bfe010b365f3-kube-api-access-4q6b9\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.171693 master-0 kubenswrapper[33572]: I1204 22:37:04.171675 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.171845 master-0 kubenswrapper[33572]: I1204 22:37:04.171830 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-scripts\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.175005 master-0 kubenswrapper[33572]: I1204 22:37:04.174986 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-scripts\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.178760 master-0 kubenswrapper[33572]: I1204 22:37:04.178739 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.180726 master-0 kubenswrapper[33572]: I1204 22:37:04.180685 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-config-data\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.197826 master-0 kubenswrapper[33572]: I1204 22:37:04.197715 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q6b9\" (UniqueName: \"kubernetes.io/projected/26e68458-b43a-471d-8d30-bfe010b365f3-kube-api-access-4q6b9\") pod \"nova-cell1-conductor-db-sync-8rh5h\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.260862 master-0 kubenswrapper[33572]: I1204 22:37:04.260803 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:04.367526 master-0 kubenswrapper[33572]: I1204 22:37:04.366979 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:04.597951 master-0 kubenswrapper[33572]: I1204 22:37:04.597889 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:04.597951 master-0 kubenswrapper[33572]: I1204 22:37:04.597946 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:04.613847 master-0 kubenswrapper[33572]: W1204 22:37:04.613799 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9408932d_d92d_4d1a_bd67_7e90e89b2341.slice/crio-bf58895a768d015404e0c772730c96332b3cdeb3db19686570083728c2687ea1 WatchSource:0}: Error finding container bf58895a768d015404e0c772730c96332b3cdeb3db19686570083728c2687ea1: Status 404 returned error can't find the container with id bf58895a768d015404e0c772730c96332b3cdeb3db19686570083728c2687ea1 Dec 04 22:37:04.624130 master-0 kubenswrapper[33572]: I1204 22:37:04.624075 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b46df4b9-2z4nn"] Dec 04 22:37:04.633384 master-0 kubenswrapper[33572]: I1204 22:37:04.633315 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:04.713293 master-0 kubenswrapper[33572]: I1204 22:37:04.713229 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"83a5bf22-49cc-4341-926c-62c129088b57","Type":"ContainerStarted","Data":"fb6d1aa0cc663471ba2b35792702435e5044c612b3aaabc375246afd90d94f4b"} Dec 04 22:37:04.715042 master-0 kubenswrapper[33572]: I1204 22:37:04.715012 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-96wfh" event={"ID":"384308f7-6942-4013-9bcd-e56fde1ab09f","Type":"ContainerStarted","Data":"5e0a765d17be796298bfbad89487ca763f9600f55ffb5dc9d673752542258cf2"} Dec 04 22:37:04.719072 master-0 kubenswrapper[33572]: I1204 22:37:04.719042 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"14343bc6-14e1-4b02-9753-ab42edb7d682","Type":"ContainerStarted","Data":"632c472f59a4ba9fcbf7947c6ea4daf1911153a94152cecf001c9d506c19b348"} Dec 04 22:37:04.725356 master-0 kubenswrapper[33572]: I1204 22:37:04.725294 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74d1c2d-00eb-4165-b8e9-76b3640bd88d","Type":"ContainerStarted","Data":"af2ac0f3d8ed13b983c0c9c4977a629064e50484771322144afa5dca580e614f"} Dec 04 22:37:04.728160 master-0 kubenswrapper[33572]: I1204 22:37:04.727875 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9904f70c-a3e6-41c4-85dd-078ab3d0cb70","Type":"ContainerStarted","Data":"e9e488ab93f8344a12da7c1005046c93aa13b7caec9c51c3a01d7508403180e5"} Dec 04 22:37:04.732009 master-0 kubenswrapper[33572]: I1204 22:37:04.730803 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c57ec571-8d99-48b0-a466-78284aab8064","Type":"ContainerStarted","Data":"53f7e164cf886b918ece99286f666f1615b7d27073deec6d798906bd1c6095b9"} Dec 04 22:37:04.732009 master-0 kubenswrapper[33572]: I1204 22:37:04.731295 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" 
event={"ID":"9408932d-d92d-4d1a-bd67-7e90e89b2341","Type":"ContainerStarted","Data":"bf58895a768d015404e0c772730c96332b3cdeb3db19686570083728c2687ea1"} Dec 04 22:37:04.945011 master-0 kubenswrapper[33572]: I1204 22:37:04.944916 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-96wfh" podStartSLOduration=2.944896552 podStartE2EDuration="2.944896552s" podCreationTimestamp="2025-12-04 22:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:04.750881419 +0000 UTC m=+1088.478407068" watchObservedRunningTime="2025-12-04 22:37:04.944896552 +0000 UTC m=+1088.672422211" Dec 04 22:37:04.957440 master-0 kubenswrapper[33572]: I1204 22:37:04.957233 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8rh5h"] Dec 04 22:37:05.753466 master-0 kubenswrapper[33572]: I1204 22:37:05.748399 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" event={"ID":"26e68458-b43a-471d-8d30-bfe010b365f3","Type":"ContainerStarted","Data":"342ab6428cfe3389305d6d9a5b6e25caf9b329efe3f47477ce696e2be0645bd3"} Dec 04 22:37:05.753466 master-0 kubenswrapper[33572]: I1204 22:37:05.748459 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" event={"ID":"26e68458-b43a-471d-8d30-bfe010b365f3","Type":"ContainerStarted","Data":"12c1f8fd375eff79ab3dfc43b0b73433099bbd87238c098f1420342966451de1"} Dec 04 22:37:05.753466 master-0 kubenswrapper[33572]: I1204 22:37:05.752230 33572 generic.go:334] "Generic (PLEG): container finished" podID="9408932d-d92d-4d1a-bd67-7e90e89b2341" containerID="4a4a1e62d0ca6213e68359b71811f72710a8ddceda5840fabaf96f7217606cee" exitCode=0 Dec 04 22:37:05.755055 master-0 kubenswrapper[33572]: I1204 22:37:05.754837 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" event={"ID":"9408932d-d92d-4d1a-bd67-7e90e89b2341","Type":"ContainerDied","Data":"4a4a1e62d0ca6213e68359b71811f72710a8ddceda5840fabaf96f7217606cee"} Dec 04 22:37:05.810565 master-0 kubenswrapper[33572]: I1204 22:37:05.810463 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" podStartSLOduration=2.810443427 podStartE2EDuration="2.810443427s" podCreationTimestamp="2025-12-04 22:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:05.773111109 +0000 UTC m=+1089.500636758" watchObservedRunningTime="2025-12-04 22:37:05.810443427 +0000 UTC m=+1089.537969076" Dec 04 22:37:07.377845 master-0 kubenswrapper[33572]: I1204 22:37:07.377600 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:07.428220 master-0 kubenswrapper[33572]: I1204 22:37:07.424603 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:07.799630 master-0 kubenswrapper[33572]: I1204 22:37:07.799520 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74d1c2d-00eb-4165-b8e9-76b3640bd88d","Type":"ContainerStarted","Data":"09444a3ac61d09f5fe2425675b39d15f04458f0d4b58afe434c7b92de38afdaa"} Dec 04 22:37:07.802326 master-0 kubenswrapper[33572]: I1204 22:37:07.802057 33572 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9904f70c-a3e6-41c4-85dd-078ab3d0cb70" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11" gracePeriod=30 Dec 04 22:37:07.808790 master-0 kubenswrapper[33572]: I1204 22:37:07.808707 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c57ec571-8d99-48b0-a466-78284aab8064","Type":"ContainerStarted","Data":"2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818"} Dec 04 22:37:07.833686 master-0 kubenswrapper[33572]: I1204 22:37:07.825857 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.961736052 podStartE2EDuration="4.825835604s" podCreationTimestamp="2025-12-04 22:37:03 +0000 UTC" firstStartedPulling="2025-12-04 22:37:04.513938424 +0000 UTC m=+1088.241464073" lastFinishedPulling="2025-12-04 22:37:07.378037986 +0000 UTC m=+1091.105563625" observedRunningTime="2025-12-04 22:37:07.820398855 +0000 UTC m=+1091.547924514" watchObservedRunningTime="2025-12-04 22:37:07.825835604 +0000 UTC m=+1091.553361253" Dec 04 22:37:07.855197 master-0 kubenswrapper[33572]: I1204 22:37:07.855093 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.002874404 podStartE2EDuration="4.855074021s" podCreationTimestamp="2025-12-04 22:37:03 +0000 UTC" firstStartedPulling="2025-12-04 22:37:04.525907691 +0000 UTC m=+1088.253433340" lastFinishedPulling="2025-12-04 22:37:07.378107308 +0000 UTC m=+1091.105632957" observedRunningTime="2025-12-04 22:37:07.845120079 +0000 UTC m=+1091.572645758" watchObservedRunningTime="2025-12-04 22:37:07.855074021 +0000 UTC m=+1091.582599660" Dec 04 22:37:08.356448 master-0 kubenswrapper[33572]: I1204 22:37:08.356357 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:08.714393 master-0 kubenswrapper[33572]: I1204 22:37:08.714200 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 22:37:08.829001 master-0 kubenswrapper[33572]: I1204 22:37:08.828921 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"83a5bf22-49cc-4341-926c-62c129088b57","Type":"ContainerStarted","Data":"6169c3bf4abef692c40f35f7b8f7af2e834631788ebf68c6323cfd0702aa65ba"} Dec 04 22:37:08.833845 master-0 kubenswrapper[33572]: I1204 22:37:08.833653 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74d1c2d-00eb-4165-b8e9-76b3640bd88d","Type":"ContainerStarted","Data":"7427e6e84e6b35778966b90ec113f783374ae543a4251f264fb712a9cbf33a59"} Dec 04 22:37:08.833845 master-0 kubenswrapper[33572]: I1204 22:37:08.833785 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerName="nova-metadata-log" containerID="cri-o://09444a3ac61d09f5fe2425675b39d15f04458f0d4b58afe434c7b92de38afdaa" gracePeriod=30 Dec 04 22:37:08.833845 master-0 kubenswrapper[33572]: I1204 22:37:08.833825 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerName="nova-metadata-metadata" 
containerID="cri-o://7427e6e84e6b35778966b90ec113f783374ae543a4251f264fb712a9cbf33a59" gracePeriod=30 Dec 04 22:37:08.842336 master-0 kubenswrapper[33572]: I1204 22:37:08.841366 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9904f70c-a3e6-41c4-85dd-078ab3d0cb70","Type":"ContainerStarted","Data":"4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11"} Dec 04 22:37:08.853382 master-0 kubenswrapper[33572]: I1204 22:37:08.853140 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c57ec571-8d99-48b0-a466-78284aab8064","Type":"ContainerStarted","Data":"bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44"} Dec 04 22:37:08.856948 master-0 kubenswrapper[33572]: I1204 22:37:08.856704 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" event={"ID":"9408932d-d92d-4d1a-bd67-7e90e89b2341","Type":"ContainerStarted","Data":"d0b3d68e04df90b1f3a9b46d744450614f696daf560f87b2d61928bad6ef7292"} Dec 04 22:37:08.856948 master-0 kubenswrapper[33572]: I1204 22:37:08.856914 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:09.371590 master-0 kubenswrapper[33572]: I1204 22:37:09.371459 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.613075235 podStartE2EDuration="6.371440742s" podCreationTimestamp="2025-12-04 22:37:03 +0000 UTC" firstStartedPulling="2025-12-04 22:37:04.630304189 +0000 UTC m=+1088.357829838" lastFinishedPulling="2025-12-04 22:37:07.388669686 +0000 UTC m=+1091.116195345" observedRunningTime="2025-12-04 22:37:09.350127681 +0000 UTC m=+1093.077653330" watchObservedRunningTime="2025-12-04 22:37:09.371440742 +0000 UTC m=+1093.098966391" Dec 04 22:37:09.394692 master-0 kubenswrapper[33572]: I1204 22:37:09.393155 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" podStartSLOduration=6.393136674 podStartE2EDuration="6.393136674s" podCreationTimestamp="2025-12-04 22:37:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:09.380341254 +0000 UTC m=+1093.107866913" watchObservedRunningTime="2025-12-04 22:37:09.393136674 +0000 UTC m=+1093.120662323" Dec 04 22:37:09.425539 master-0 kubenswrapper[33572]: I1204 22:37:09.421949 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.295316136 podStartE2EDuration="7.421927119s" podCreationTimestamp="2025-12-04 22:37:02 +0000 UTC" firstStartedPulling="2025-12-04 22:37:04.253139549 +0000 UTC m=+1087.980665198" lastFinishedPulling="2025-12-04 22:37:07.379750532 +0000 UTC m=+1091.107276181" observedRunningTime="2025-12-04 22:37:09.405273525 +0000 UTC m=+1093.132799184" watchObservedRunningTime="2025-12-04 22:37:09.421927119 +0000 UTC m=+1093.149452768" Dec 04 22:37:09.873632 master-0 kubenswrapper[33572]: I1204 22:37:09.873563 33572 generic.go:334] "Generic (PLEG): container finished" podID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerID="7427e6e84e6b35778966b90ec113f783374ae543a4251f264fb712a9cbf33a59" exitCode=0 Dec 04 22:37:09.873632 master-0 kubenswrapper[33572]: I1204 22:37:09.873611 33572 generic.go:334] "Generic (PLEG): container finished" podID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" 
containerID="09444a3ac61d09f5fe2425675b39d15f04458f0d4b58afe434c7b92de38afdaa" exitCode=143 Dec 04 22:37:09.874283 master-0 kubenswrapper[33572]: I1204 22:37:09.873627 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74d1c2d-00eb-4165-b8e9-76b3640bd88d","Type":"ContainerDied","Data":"7427e6e84e6b35778966b90ec113f783374ae543a4251f264fb712a9cbf33a59"} Dec 04 22:37:09.874283 master-0 kubenswrapper[33572]: I1204 22:37:09.873707 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74d1c2d-00eb-4165-b8e9-76b3640bd88d","Type":"ContainerDied","Data":"09444a3ac61d09f5fe2425675b39d15f04458f0d4b58afe434c7b92de38afdaa"} Dec 04 22:37:10.068089 master-0 kubenswrapper[33572]: I1204 22:37:10.068038 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:10.109469 master-0 kubenswrapper[33572]: I1204 22:37:10.109415 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp6xq\" (UniqueName: \"kubernetes.io/projected/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-kube-api-access-gp6xq\") pod \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " Dec 04 22:37:10.109469 master-0 kubenswrapper[33572]: I1204 22:37:10.109481 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-logs\") pod \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " Dec 04 22:37:10.109770 master-0 kubenswrapper[33572]: I1204 22:37:10.109536 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-config-data\") pod \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " Dec 04 22:37:10.109770 master-0 kubenswrapper[33572]: I1204 22:37:10.109713 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-combined-ca-bundle\") pod \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\" (UID: \"f74d1c2d-00eb-4165-b8e9-76b3640bd88d\") " Dec 04 22:37:10.110572 master-0 kubenswrapper[33572]: I1204 22:37:10.110491 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-logs" (OuterVolumeSpecName: "logs") pod "f74d1c2d-00eb-4165-b8e9-76b3640bd88d" (UID: "f74d1c2d-00eb-4165-b8e9-76b3640bd88d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:37:10.111116 master-0 kubenswrapper[33572]: I1204 22:37:10.111088 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:10.113333 master-0 kubenswrapper[33572]: I1204 22:37:10.113272 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-kube-api-access-gp6xq" (OuterVolumeSpecName: "kube-api-access-gp6xq") pod "f74d1c2d-00eb-4165-b8e9-76b3640bd88d" (UID: "f74d1c2d-00eb-4165-b8e9-76b3640bd88d"). InnerVolumeSpecName "kube-api-access-gp6xq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:10.149535 master-0 kubenswrapper[33572]: I1204 22:37:10.149437 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-config-data" (OuterVolumeSpecName: "config-data") pod "f74d1c2d-00eb-4165-b8e9-76b3640bd88d" (UID: "f74d1c2d-00eb-4165-b8e9-76b3640bd88d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:10.158232 master-0 kubenswrapper[33572]: I1204 22:37:10.158161 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f74d1c2d-00eb-4165-b8e9-76b3640bd88d" (UID: "f74d1c2d-00eb-4165-b8e9-76b3640bd88d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:10.214140 master-0 kubenswrapper[33572]: I1204 22:37:10.214049 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:10.214140 master-0 kubenswrapper[33572]: I1204 22:37:10.214103 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp6xq\" (UniqueName: \"kubernetes.io/projected/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-kube-api-access-gp6xq\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:10.214140 master-0 kubenswrapper[33572]: I1204 22:37:10.214120 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74d1c2d-00eb-4165-b8e9-76b3640bd88d-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:10.900929 master-0 kubenswrapper[33572]: I1204 22:37:10.900788 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74d1c2d-00eb-4165-b8e9-76b3640bd88d","Type":"ContainerDied","Data":"af2ac0f3d8ed13b983c0c9c4977a629064e50484771322144afa5dca580e614f"} Dec 04 22:37:10.900929 master-0 kubenswrapper[33572]: I1204 22:37:10.900850 33572 scope.go:117] "RemoveContainer" containerID="7427e6e84e6b35778966b90ec113f783374ae543a4251f264fb712a9cbf33a59" Dec 04 22:37:10.901637 master-0 kubenswrapper[33572]: I1204 22:37:10.900985 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:10.927754 master-0 kubenswrapper[33572]: I1204 22:37:10.927700 33572 scope.go:117] "RemoveContainer" containerID="09444a3ac61d09f5fe2425675b39d15f04458f0d4b58afe434c7b92de38afdaa" Dec 04 22:37:10.996794 master-0 kubenswrapper[33572]: I1204 22:37:10.996736 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:11.039862 master-0 kubenswrapper[33572]: I1204 22:37:11.039810 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:11.056619 master-0 kubenswrapper[33572]: I1204 22:37:11.056544 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:11.057187 master-0 kubenswrapper[33572]: E1204 22:37:11.057164 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerName="nova-metadata-log" Dec 04 22:37:11.057187 master-0 kubenswrapper[33572]: I1204 22:37:11.057185 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerName="nova-metadata-log" Dec 04 22:37:11.057279 master-0 kubenswrapper[33572]: E1204 22:37:11.057220 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerName="nova-metadata-metadata" Dec 04 22:37:11.057279 master-0 kubenswrapper[33572]: I1204 22:37:11.057227 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerName="nova-metadata-metadata" Dec 04 22:37:11.057490 master-0 kubenswrapper[33572]: I1204 22:37:11.057471 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerName="nova-metadata-metadata" Dec 04 22:37:11.057553 master-0 kubenswrapper[33572]: I1204 22:37:11.057533 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" containerName="nova-metadata-log" Dec 04 22:37:11.058865 master-0 kubenswrapper[33572]: I1204 22:37:11.058826 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:11.062296 master-0 kubenswrapper[33572]: I1204 22:37:11.062081 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 22:37:11.062686 master-0 kubenswrapper[33572]: I1204 22:37:11.062232 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 22:37:11.083086 master-0 kubenswrapper[33572]: I1204 22:37:11.083037 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:11.141522 master-0 kubenswrapper[33572]: I1204 22:37:11.141444 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02eeef5-2729-4b19-84b7-a0059c569b5e-logs\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.141522 master-0 kubenswrapper[33572]: I1204 22:37:11.141532 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-config-data\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.141853 master-0 kubenswrapper[33572]: I1204 22:37:11.141637 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zwbl\" (UniqueName: \"kubernetes.io/projected/e02eeef5-2729-4b19-84b7-a0059c569b5e-kube-api-access-7zwbl\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.141853 master-0 kubenswrapper[33572]: I1204 22:37:11.141731 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.142114 master-0 kubenswrapper[33572]: I1204 22:37:11.142024 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.246061 master-0 kubenswrapper[33572]: I1204 22:37:11.245909 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zwbl\" (UniqueName: \"kubernetes.io/projected/e02eeef5-2729-4b19-84b7-a0059c569b5e-kube-api-access-7zwbl\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.246061 master-0 kubenswrapper[33572]: I1204 22:37:11.246027 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.246284 master-0 kubenswrapper[33572]: I1204 22:37:11.246147 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.246555 master-0 kubenswrapper[33572]: I1204 22:37:11.246426 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02eeef5-2729-4b19-84b7-a0059c569b5e-logs\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.246671 master-0 kubenswrapper[33572]: I1204 22:37:11.246635 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-config-data\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.247365 master-0 kubenswrapper[33572]: I1204 22:37:11.247308 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02eeef5-2729-4b19-84b7-a0059c569b5e-logs\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.253955 master-0 kubenswrapper[33572]: I1204 22:37:11.253240 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.258090 master-0 kubenswrapper[33572]: I1204 22:37:11.258003 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.264251 master-0 kubenswrapper[33572]: I1204 22:37:11.264194 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zwbl\" (UniqueName: \"kubernetes.io/projected/e02eeef5-2729-4b19-84b7-a0059c569b5e-kube-api-access-7zwbl\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.275593 master-0 kubenswrapper[33572]: I1204 22:37:11.275464 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-config-data\") pod \"nova-metadata-0\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " pod="openstack/nova-metadata-0" Dec 04 22:37:11.384570 master-0 kubenswrapper[33572]: I1204 22:37:11.384484 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:11.872761 master-0 kubenswrapper[33572]: I1204 22:37:11.867146 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:11.954831 master-0 kubenswrapper[33572]: I1204 22:37:11.954755 33572 generic.go:334] "Generic (PLEG): container finished" podID="384308f7-6942-4013-9bcd-e56fde1ab09f" containerID="5e0a765d17be796298bfbad89487ca763f9600f55ffb5dc9d673752542258cf2" exitCode=0 Dec 04 22:37:11.955482 master-0 kubenswrapper[33572]: I1204 22:37:11.954864 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-96wfh" event={"ID":"384308f7-6942-4013-9bcd-e56fde1ab09f","Type":"ContainerDied","Data":"5e0a765d17be796298bfbad89487ca763f9600f55ffb5dc9d673752542258cf2"} Dec 04 22:37:12.544270 master-0 kubenswrapper[33572]: I1204 22:37:12.544183 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74d1c2d-00eb-4165-b8e9-76b3640bd88d" path="/var/lib/kubelet/pods/f74d1c2d-00eb-4165-b8e9-76b3640bd88d/volumes" Dec 04 22:37:13.279096 master-0 kubenswrapper[33572]: I1204 22:37:13.279044 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 22:37:13.280577 master-0 kubenswrapper[33572]: I1204 22:37:13.280432 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 22:37:13.714227 master-0 kubenswrapper[33572]: I1204 22:37:13.714029 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 22:37:13.763009 master-0 kubenswrapper[33572]: I1204 22:37:13.762772 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 22:37:13.852579 master-0 kubenswrapper[33572]: I1204 22:37:13.850747 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:13.931558 master-0 kubenswrapper[33572]: I1204 22:37:13.930915 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fccc65cc-8m4nz"] Dec 04 22:37:13.931558 master-0 kubenswrapper[33572]: I1204 22:37:13.931158 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" podUID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerName="dnsmasq-dns" containerID="cri-o://3e28f85dcff415799db894d82bc38aa597140573407611014b13faad6628ecaa" gracePeriod=10 Dec 04 22:37:14.013458 master-0 kubenswrapper[33572]: I1204 22:37:14.013396 33572 generic.go:334] "Generic (PLEG): container finished" podID="5c2c1216-db02-4bb8-856c-a56a68e06d23" containerID="ffd36e5d6e150b002db06f2232e39d6d5924ff66b522113a2b1e78a73a390535" exitCode=0 Dec 04 22:37:14.013845 master-0 kubenswrapper[33572]: I1204 22:37:14.013762 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerDied","Data":"ffd36e5d6e150b002db06f2232e39d6d5924ff66b522113a2b1e78a73a390535"} Dec 04 22:37:14.061944 master-0 kubenswrapper[33572]: I1204 22:37:14.061787 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 22:37:14.319798 master-0 kubenswrapper[33572]: I1204 22:37:14.319718 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c57ec571-8d99-48b0-a466-78284aab8064" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.1:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 22:37:14.366561 master-0 kubenswrapper[33572]: I1204 22:37:14.360730 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.1:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 22:37:14.509159 master-0 kubenswrapper[33572]: I1204 22:37:14.509081 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" podUID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.242:5353: connect: connection refused" Dec 04 22:37:15.036864 master-0 kubenswrapper[33572]: I1204 22:37:15.036805 33572 generic.go:334] "Generic (PLEG): container finished" podID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerID="3e28f85dcff415799db894d82bc38aa597140573407611014b13faad6628ecaa" exitCode=0 Dec 04 22:37:15.037725 master-0 kubenswrapper[33572]: I1204 22:37:15.037694 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" event={"ID":"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790","Type":"ContainerDied","Data":"3e28f85dcff415799db894d82bc38aa597140573407611014b13faad6628ecaa"} Dec 04 22:37:17.120524 master-0 kubenswrapper[33572]: I1204 22:37:17.118202 33572 generic.go:334] "Generic (PLEG): container finished" podID="26e68458-b43a-471d-8d30-bfe010b365f3" containerID="342ab6428cfe3389305d6d9a5b6e25caf9b329efe3f47477ce696e2be0645bd3" exitCode=0 Dec 04 22:37:17.120524 master-0 kubenswrapper[33572]: I1204 22:37:17.118270 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" event={"ID":"26e68458-b43a-471d-8d30-bfe010b365f3","Type":"ContainerDied","Data":"342ab6428cfe3389305d6d9a5b6e25caf9b329efe3f47477ce696e2be0645bd3"} Dec 04 22:37:17.630004 master-0 kubenswrapper[33572]: W1204 22:37:17.629850 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode02eeef5_2729_4b19_84b7_a0059c569b5e.slice/crio-e2145bfa0bea0fd7d22699ebe7d4290eb8641d5d8791d12ead916da6e87ed865 WatchSource:0}: Error finding container e2145bfa0bea0fd7d22699ebe7d4290eb8641d5d8791d12ead916da6e87ed865: Status 404 returned error can't find the container with id e2145bfa0bea0fd7d22699ebe7d4290eb8641d5d8791d12ead916da6e87ed865 Dec 04 22:37:17.990060 master-0 kubenswrapper[33572]: I1204 22:37:17.989938 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:18.064778 master-0 kubenswrapper[33572]: I1204 22:37:18.064713 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtj6n\" (UniqueName: \"kubernetes.io/projected/384308f7-6942-4013-9bcd-e56fde1ab09f-kube-api-access-wtj6n\") pod \"384308f7-6942-4013-9bcd-e56fde1ab09f\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " Dec 04 22:37:18.065037 master-0 kubenswrapper[33572]: I1204 22:37:18.065003 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-scripts\") pod \"384308f7-6942-4013-9bcd-e56fde1ab09f\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " Dec 04 22:37:18.065272 master-0 kubenswrapper[33572]: I1204 22:37:18.065236 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-config-data\") pod \"384308f7-6942-4013-9bcd-e56fde1ab09f\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " Dec 04 22:37:18.066105 master-0 kubenswrapper[33572]: I1204 22:37:18.065411 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-combined-ca-bundle\") pod \"384308f7-6942-4013-9bcd-e56fde1ab09f\" (UID: \"384308f7-6942-4013-9bcd-e56fde1ab09f\") " Dec 04 22:37:18.072812 master-0 kubenswrapper[33572]: I1204 22:37:18.072734 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-scripts" (OuterVolumeSpecName: "scripts") pod "384308f7-6942-4013-9bcd-e56fde1ab09f" (UID: "384308f7-6942-4013-9bcd-e56fde1ab09f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:18.072812 master-0 kubenswrapper[33572]: I1204 22:37:18.072758 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384308f7-6942-4013-9bcd-e56fde1ab09f-kube-api-access-wtj6n" (OuterVolumeSpecName: "kube-api-access-wtj6n") pod "384308f7-6942-4013-9bcd-e56fde1ab09f" (UID: "384308f7-6942-4013-9bcd-e56fde1ab09f"). InnerVolumeSpecName "kube-api-access-wtj6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:18.073522 master-0 kubenswrapper[33572]: I1204 22:37:18.073489 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:37:18.164001 master-0 kubenswrapper[33572]: I1204 22:37:18.162757 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" event={"ID":"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790","Type":"ContainerDied","Data":"9137bbadee76dbf3cbc50c1db0ebe0f7b1bf71eed8b5d837e9734452b38f831d"} Dec 04 22:37:18.164001 master-0 kubenswrapper[33572]: I1204 22:37:18.162812 33572 scope.go:117] "RemoveContainer" containerID="3e28f85dcff415799db894d82bc38aa597140573407611014b13faad6628ecaa" Dec 04 22:37:18.164001 master-0 kubenswrapper[33572]: I1204 22:37:18.162916 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fccc65cc-8m4nz" Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.167599 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-config\") pod \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.167767 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-svc\") pod \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.167912 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-nb\") pod \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.167954 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-swift-storage-0\") pod \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.168004 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-sb\") pod \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.168048 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fmcj\" (UniqueName: \"kubernetes.io/projected/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-kube-api-access-7fmcj\") pod \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\" (UID: \"a0fb2afe-6c27-4f4e-bf1e-27d253b4c790\") " Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.168718 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtj6n\" (UniqueName: \"kubernetes.io/projected/384308f7-6942-4013-9bcd-e56fde1ab09f-kube-api-access-wtj6n\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.168740 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.170795 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-96wfh" event={"ID":"384308f7-6942-4013-9bcd-e56fde1ab09f","Type":"ContainerDied","Data":"3466a92dc262cf71b24d5565919c7162b749983a013f4412fef97bee14dd0bdf"} Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.170900 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3466a92dc262cf71b24d5565919c7162b749983a013f4412fef97bee14dd0bdf" Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.170904 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-96wfh" Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.172921 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-config-data" (OuterVolumeSpecName: "config-data") pod "384308f7-6942-4013-9bcd-e56fde1ab09f" (UID: "384308f7-6942-4013-9bcd-e56fde1ab09f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.174861 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-kube-api-access-7fmcj" (OuterVolumeSpecName: "kube-api-access-7fmcj") pod "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" (UID: "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790"). InnerVolumeSpecName "kube-api-access-7fmcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:18.179889 master-0 kubenswrapper[33572]: I1204 22:37:18.179058 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e02eeef5-2729-4b19-84b7-a0059c569b5e","Type":"ContainerStarted","Data":"e2145bfa0bea0fd7d22699ebe7d4290eb8641d5d8791d12ead916da6e87ed865"} Dec 04 22:37:18.180303 master-0 kubenswrapper[33572]: I1204 22:37:18.180140 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "384308f7-6942-4013-9bcd-e56fde1ab09f" (UID: "384308f7-6942-4013-9bcd-e56fde1ab09f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:18.203211 master-0 kubenswrapper[33572]: I1204 22:37:18.203126 33572 scope.go:117] "RemoveContainer" containerID="095dcd17a8fc7ecc1902e798a974640c9ba3121574eea44963691e538ce86ce5" Dec 04 22:37:18.259653 master-0 kubenswrapper[33572]: I1204 22:37:18.259233 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-config" (OuterVolumeSpecName: "config") pod "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" (UID: "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:18.270867 master-0 kubenswrapper[33572]: I1204 22:37:18.270802 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fmcj\" (UniqueName: \"kubernetes.io/projected/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-kube-api-access-7fmcj\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.270867 master-0 kubenswrapper[33572]: I1204 22:37:18.270863 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.270867 master-0 kubenswrapper[33572]: I1204 22:37:18.270876 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384308f7-6942-4013-9bcd-e56fde1ab09f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.271091 master-0 kubenswrapper[33572]: I1204 22:37:18.270889 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.274237 master-0 kubenswrapper[33572]: I1204 22:37:18.274183 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" (UID: "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:18.276217 master-0 kubenswrapper[33572]: I1204 22:37:18.276155 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" (UID: "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:18.302433 master-0 kubenswrapper[33572]: I1204 22:37:18.302336 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" (UID: "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:18.304355 master-0 kubenswrapper[33572]: I1204 22:37:18.304311 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" (UID: "a0fb2afe-6c27-4f4e-bf1e-27d253b4c790"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:18.373115 master-0 kubenswrapper[33572]: I1204 22:37:18.373059 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.373115 master-0 kubenswrapper[33572]: I1204 22:37:18.373107 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.373238 master-0 kubenswrapper[33572]: I1204 22:37:18.373119 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.373238 master-0 kubenswrapper[33572]: I1204 22:37:18.373131 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.521610 master-0 kubenswrapper[33572]: I1204 22:37:18.517238 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fccc65cc-8m4nz"] Dec 04 22:37:18.553977 master-0 kubenswrapper[33572]: I1204 22:37:18.553907 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9fccc65cc-8m4nz"] Dec 04 22:37:18.600379 master-0 kubenswrapper[33572]: I1204 22:37:18.600321 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:18.694014 master-0 kubenswrapper[33572]: I1204 22:37:18.684452 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-scripts\") pod \"26e68458-b43a-471d-8d30-bfe010b365f3\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " Dec 04 22:37:18.694014 master-0 kubenswrapper[33572]: I1204 22:37:18.684593 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q6b9\" (UniqueName: \"kubernetes.io/projected/26e68458-b43a-471d-8d30-bfe010b365f3-kube-api-access-4q6b9\") pod \"26e68458-b43a-471d-8d30-bfe010b365f3\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " Dec 04 22:37:18.694014 master-0 kubenswrapper[33572]: I1204 22:37:18.684742 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-combined-ca-bundle\") pod \"26e68458-b43a-471d-8d30-bfe010b365f3\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " Dec 04 22:37:18.694014 master-0 kubenswrapper[33572]: I1204 22:37:18.684787 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-config-data\") pod \"26e68458-b43a-471d-8d30-bfe010b365f3\" (UID: \"26e68458-b43a-471d-8d30-bfe010b365f3\") " Dec 04 22:37:18.694014 master-0 kubenswrapper[33572]: I1204 22:37:18.689271 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e68458-b43a-471d-8d30-bfe010b365f3-kube-api-access-4q6b9" (OuterVolumeSpecName: "kube-api-access-4q6b9") pod 
"26e68458-b43a-471d-8d30-bfe010b365f3" (UID: "26e68458-b43a-471d-8d30-bfe010b365f3"). InnerVolumeSpecName "kube-api-access-4q6b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:18.694014 master-0 kubenswrapper[33572]: I1204 22:37:18.693848 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-scripts" (OuterVolumeSpecName: "scripts") pod "26e68458-b43a-471d-8d30-bfe010b365f3" (UID: "26e68458-b43a-471d-8d30-bfe010b365f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:18.724104 master-0 kubenswrapper[33572]: I1204 22:37:18.724021 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26e68458-b43a-471d-8d30-bfe010b365f3" (UID: "26e68458-b43a-471d-8d30-bfe010b365f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:18.730343 master-0 kubenswrapper[33572]: I1204 22:37:18.730252 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-config-data" (OuterVolumeSpecName: "config-data") pod "26e68458-b43a-471d-8d30-bfe010b365f3" (UID: "26e68458-b43a-471d-8d30-bfe010b365f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:18.788549 master-0 kubenswrapper[33572]: I1204 22:37:18.788389 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.788549 master-0 kubenswrapper[33572]: I1204 22:37:18.788448 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.788549 master-0 kubenswrapper[33572]: I1204 22:37:18.788461 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e68458-b43a-471d-8d30-bfe010b365f3-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:18.788549 master-0 kubenswrapper[33572]: I1204 22:37:18.788473 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q6b9\" (UniqueName: \"kubernetes.io/projected/26e68458-b43a-471d-8d30-bfe010b365f3-kube-api-access-4q6b9\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:19.226534 master-0 kubenswrapper[33572]: I1204 22:37:19.220707 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e02eeef5-2729-4b19-84b7-a0059c569b5e","Type":"ContainerStarted","Data":"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9"} Dec 04 22:37:19.226534 master-0 kubenswrapper[33572]: I1204 22:37:19.220758 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e02eeef5-2729-4b19-84b7-a0059c569b5e","Type":"ContainerStarted","Data":"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9"} Dec 04 22:37:19.232526 master-0 kubenswrapper[33572]: I1204 22:37:19.231592 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:19.232526 master-0 kubenswrapper[33572]: I1204 22:37:19.231952 33572 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-log" containerID="cri-o://2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818" gracePeriod=30 Dec 04 22:37:19.232526 master-0 kubenswrapper[33572]: I1204 22:37:19.232099 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-api" containerID="cri-o://bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44" gracePeriod=30 Dec 04 22:37:19.252620 master-0 kubenswrapper[33572]: I1204 22:37:19.250421 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerStarted","Data":"45c7352128daf3d05578dd53eaef312dfbc3b3476838d046518899a06d5016c5"} Dec 04 22:37:19.252620 master-0 kubenswrapper[33572]: I1204 22:37:19.250489 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerStarted","Data":"f203c416d10878b6e514eb725fb4b177f7c879ce0da0a3bc956ad47f92df890e"} Dec 04 22:37:19.265589 master-0 kubenswrapper[33572]: I1204 22:37:19.260392 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:19.265589 master-0 kubenswrapper[33572]: I1204 22:37:19.260728 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="83a5bf22-49cc-4341-926c-62c129088b57" containerName="nova-scheduler-scheduler" containerID="cri-o://6169c3bf4abef692c40f35f7b8f7af2e834631788ebf68c6323cfd0702aa65ba" gracePeriod=30 Dec 04 22:37:19.272522 master-0 kubenswrapper[33572]: I1204 22:37:19.271025 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" event={"ID":"26e68458-b43a-471d-8d30-bfe010b365f3","Type":"ContainerDied","Data":"12c1f8fd375eff79ab3dfc43b0b73433099bbd87238c098f1420342966451de1"} Dec 04 22:37:19.272522 master-0 kubenswrapper[33572]: I1204 22:37:19.271085 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12c1f8fd375eff79ab3dfc43b0b73433099bbd87238c098f1420342966451de1" Dec 04 22:37:19.272522 master-0 kubenswrapper[33572]: I1204 22:37:19.271153 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8rh5h" Dec 04 22:37:19.286755 master-0 kubenswrapper[33572]: I1204 22:37:19.285496 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"14343bc6-14e1-4b02-9753-ab42edb7d682","Type":"ContainerStarted","Data":"a9e7a6a6d18fb63ae5352fa40442bf75ccc90d7d0505edcab8ab2b6e79ccfe61"} Dec 04 22:37:19.286755 master-0 kubenswrapper[33572]: I1204 22:37:19.285989 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:19.286755 master-0 kubenswrapper[33572]: I1204 22:37:19.286662 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:19.362523 master-0 kubenswrapper[33572]: I1204 22:37:19.358183 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 04 22:37:19.379530 master-0 kubenswrapper[33572]: I1204 22:37:19.376402 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=9.376378748 podStartE2EDuration="9.376378748s" podCreationTimestamp="2025-12-04 22:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:19.254853332 +0000 UTC m=+1102.982378981" watchObservedRunningTime="2025-12-04 22:37:19.376378748 +0000 UTC m=+1103.103904397" Dec 04 22:37:19.405530 master-0 kubenswrapper[33572]: I1204 22:37:19.402673 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=3.731717429 podStartE2EDuration="17.402651894s" podCreationTimestamp="2025-12-04 22:37:02 +0000 UTC" firstStartedPulling="2025-12-04 22:37:04.076978182 +0000 UTC m=+1087.804503831" lastFinishedPulling="2025-12-04 22:37:17.747912607 +0000 UTC m=+1101.475438296" observedRunningTime="2025-12-04 22:37:19.316721899 +0000 UTC m=+1103.044247548" watchObservedRunningTime="2025-12-04 22:37:19.402651894 +0000 UTC m=+1103.130177543" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.416578 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: E1204 22:37:19.417317 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384308f7-6942-4013-9bcd-e56fde1ab09f" containerName="nova-manage" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.417339 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="384308f7-6942-4013-9bcd-e56fde1ab09f" containerName="nova-manage" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: E1204 22:37:19.417361 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerName="init" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.417369 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerName="init" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: E1204 22:37:19.417396 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e68458-b43a-471d-8d30-bfe010b365f3" containerName="nova-cell1-conductor-db-sync" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.417406 33572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="26e68458-b43a-471d-8d30-bfe010b365f3" containerName="nova-cell1-conductor-db-sync" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: E1204 22:37:19.417427 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerName="dnsmasq-dns" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.417435 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerName="dnsmasq-dns" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.417736 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" containerName="dnsmasq-dns" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.417762 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e68458-b43a-471d-8d30-bfe010b365f3" containerName="nova-cell1-conductor-db-sync" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.417776 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="384308f7-6942-4013-9bcd-e56fde1ab09f" containerName="nova-manage" Dec 04 22:37:19.421531 master-0 kubenswrapper[33572]: I1204 22:37:19.418657 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.430540 master-0 kubenswrapper[33572]: I1204 22:37:19.425625 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 22:37:19.430540 master-0 kubenswrapper[33572]: I1204 22:37:19.425720 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 22:37:19.527711 master-0 kubenswrapper[33572]: I1204 22:37:19.527142 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdql2\" (UniqueName: \"kubernetes.io/projected/12658633-b57d-4bfb-a4ae-63cddf8517c3-kube-api-access-zdql2\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.527711 master-0 kubenswrapper[33572]: I1204 22:37:19.527665 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12658633-b57d-4bfb-a4ae-63cddf8517c3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.528121 master-0 kubenswrapper[33572]: I1204 22:37:19.527791 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12658633-b57d-4bfb-a4ae-63cddf8517c3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.633587 master-0 kubenswrapper[33572]: I1204 22:37:19.631327 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdql2\" (UniqueName: \"kubernetes.io/projected/12658633-b57d-4bfb-a4ae-63cddf8517c3-kube-api-access-zdql2\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.633587 master-0 kubenswrapper[33572]: I1204 22:37:19.631595 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/12658633-b57d-4bfb-a4ae-63cddf8517c3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.633587 master-0 kubenswrapper[33572]: I1204 22:37:19.631706 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12658633-b57d-4bfb-a4ae-63cddf8517c3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.636884 master-0 kubenswrapper[33572]: I1204 22:37:19.636823 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12658633-b57d-4bfb-a4ae-63cddf8517c3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.637549 master-0 kubenswrapper[33572]: I1204 22:37:19.637466 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12658633-b57d-4bfb-a4ae-63cddf8517c3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.654387 master-0 kubenswrapper[33572]: I1204 22:37:19.654313 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdql2\" (UniqueName: \"kubernetes.io/projected/12658633-b57d-4bfb-a4ae-63cddf8517c3-kube-api-access-zdql2\") pod \"nova-cell1-conductor-0\" (UID: \"12658633-b57d-4bfb-a4ae-63cddf8517c3\") " pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:19.758479 master-0 kubenswrapper[33572]: I1204 22:37:19.758365 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:20.258976 master-0 kubenswrapper[33572]: I1204 22:37:20.258517 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 22:37:20.305311 master-0 kubenswrapper[33572]: I1204 22:37:20.305248 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"5c2c1216-db02-4bb8-856c-a56a68e06d23","Type":"ContainerStarted","Data":"a8ed84e90e7cb5a2f60c207f2bf76ab9a94b60d819dbb1a67891a301d13038d6"} Dec 04 22:37:20.305630 master-0 kubenswrapper[33572]: I1204 22:37:20.305582 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Dec 04 22:37:20.305965 master-0 kubenswrapper[33572]: I1204 22:37:20.305936 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Dec 04 22:37:20.305965 master-0 kubenswrapper[33572]: I1204 22:37:20.305959 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Dec 04 22:37:20.309488 master-0 kubenswrapper[33572]: I1204 22:37:20.309450 33572 generic.go:334] "Generic (PLEG): container finished" podID="c57ec571-8d99-48b0-a466-78284aab8064" containerID="2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818" exitCode=143 Dec 04 22:37:20.309750 master-0 kubenswrapper[33572]: I1204 22:37:20.309527 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c57ec571-8d99-48b0-a466-78284aab8064","Type":"ContainerDied","Data":"2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818"} Dec 04 22:37:20.312755 master-0 kubenswrapper[33572]: I1204 22:37:20.312096 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"12658633-b57d-4bfb-a4ae-63cddf8517c3","Type":"ContainerStarted","Data":"5aad3708e2aaca8c72ba9ed9e3260fd574d79a246a5f39042450bdb1f75b1306"} Dec 04 22:37:20.634235 master-0 kubenswrapper[33572]: I1204 22:37:20.634137 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0fb2afe-6c27-4f4e-bf1e-27d253b4c790" path="/var/lib/kubelet/pods/a0fb2afe-6c27-4f4e-bf1e-27d253b4c790/volumes" Dec 04 22:37:20.641007 master-0 kubenswrapper[33572]: I1204 22:37:20.639884 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=63.895447159 podStartE2EDuration="1m43.639858959s" podCreationTimestamp="2025-12-04 22:35:37 +0000 UTC" firstStartedPulling="2025-12-04 22:35:47.549526709 +0000 UTC m=+1011.277052358" lastFinishedPulling="2025-12-04 22:36:27.293938509 +0000 UTC m=+1051.021464158" observedRunningTime="2025-12-04 22:37:20.613095079 +0000 UTC m=+1104.340620788" watchObservedRunningTime="2025-12-04 22:37:20.639858959 +0000 UTC m=+1104.367384608" Dec 04 22:37:21.328590 master-0 kubenswrapper[33572]: I1204 22:37:21.328480 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"12658633-b57d-4bfb-a4ae-63cddf8517c3","Type":"ContainerStarted","Data":"4a0acdef7bb47849066f201346e52c45ddf8fca59bfeb0ca0570d3b2ac235ddf"} Dec 04 22:37:21.329693 master-0 kubenswrapper[33572]: I1204 22:37:21.328745 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerName="nova-metadata-log" 
containerID="cri-o://bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9" gracePeriod=30 Dec 04 22:37:21.329693 master-0 kubenswrapper[33572]: I1204 22:37:21.328817 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerName="nova-metadata-metadata" containerID="cri-o://f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9" gracePeriod=30 Dec 04 22:37:21.379275 master-0 kubenswrapper[33572]: I1204 22:37:21.379169 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.379147459 podStartE2EDuration="2.379147459s" podCreationTimestamp="2025-12-04 22:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:21.368270413 +0000 UTC m=+1105.095796062" watchObservedRunningTime="2025-12-04 22:37:21.379147459 +0000 UTC m=+1105.106673108" Dec 04 22:37:21.385361 master-0 kubenswrapper[33572]: I1204 22:37:21.385280 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 22:37:21.385361 master-0 kubenswrapper[33572]: I1204 22:37:21.385361 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 22:37:22.018257 master-0 kubenswrapper[33572]: I1204 22:37:22.018195 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:22.179636 master-0 kubenswrapper[33572]: I1204 22:37:22.179435 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-nova-metadata-tls-certs\") pod \"e02eeef5-2729-4b19-84b7-a0059c569b5e\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " Dec 04 22:37:22.179636 master-0 kubenswrapper[33572]: I1204 22:37:22.179619 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zwbl\" (UniqueName: \"kubernetes.io/projected/e02eeef5-2729-4b19-84b7-a0059c569b5e-kube-api-access-7zwbl\") pod \"e02eeef5-2729-4b19-84b7-a0059c569b5e\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " Dec 04 22:37:22.179876 master-0 kubenswrapper[33572]: I1204 22:37:22.179711 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-combined-ca-bundle\") pod \"e02eeef5-2729-4b19-84b7-a0059c569b5e\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " Dec 04 22:37:22.180028 master-0 kubenswrapper[33572]: I1204 22:37:22.179882 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-config-data\") pod \"e02eeef5-2729-4b19-84b7-a0059c569b5e\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " Dec 04 22:37:22.180418 master-0 kubenswrapper[33572]: I1204 22:37:22.180380 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02eeef5-2729-4b19-84b7-a0059c569b5e-logs\") pod \"e02eeef5-2729-4b19-84b7-a0059c569b5e\" (UID: \"e02eeef5-2729-4b19-84b7-a0059c569b5e\") " Dec 04 22:37:22.182055 master-0 kubenswrapper[33572]: I1204 22:37:22.181999 33572 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e02eeef5-2729-4b19-84b7-a0059c569b5e-logs" (OuterVolumeSpecName: "logs") pod "e02eeef5-2729-4b19-84b7-a0059c569b5e" (UID: "e02eeef5-2729-4b19-84b7-a0059c569b5e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:37:22.184929 master-0 kubenswrapper[33572]: I1204 22:37:22.184875 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02eeef5-2729-4b19-84b7-a0059c569b5e-kube-api-access-7zwbl" (OuterVolumeSpecName: "kube-api-access-7zwbl") pod "e02eeef5-2729-4b19-84b7-a0059c569b5e" (UID: "e02eeef5-2729-4b19-84b7-a0059c569b5e"). InnerVolumeSpecName "kube-api-access-7zwbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:22.214273 master-0 kubenswrapper[33572]: I1204 22:37:22.213716 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-config-data" (OuterVolumeSpecName: "config-data") pod "e02eeef5-2729-4b19-84b7-a0059c569b5e" (UID: "e02eeef5-2729-4b19-84b7-a0059c569b5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:22.223779 master-0 kubenswrapper[33572]: I1204 22:37:22.223679 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e02eeef5-2729-4b19-84b7-a0059c569b5e" (UID: "e02eeef5-2729-4b19-84b7-a0059c569b5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:22.236881 master-0 kubenswrapper[33572]: I1204 22:37:22.236789 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e02eeef5-2729-4b19-84b7-a0059c569b5e" (UID: "e02eeef5-2729-4b19-84b7-a0059c569b5e"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:22.283839 master-0 kubenswrapper[33572]: I1204 22:37:22.283579 33572 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:22.283839 master-0 kubenswrapper[33572]: I1204 22:37:22.283627 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zwbl\" (UniqueName: \"kubernetes.io/projected/e02eeef5-2729-4b19-84b7-a0059c569b5e-kube-api-access-7zwbl\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:22.283839 master-0 kubenswrapper[33572]: I1204 22:37:22.283644 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:22.283839 master-0 kubenswrapper[33572]: I1204 22:37:22.283657 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02eeef5-2729-4b19-84b7-a0059c569b5e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:22.283839 master-0 kubenswrapper[33572]: I1204 22:37:22.283669 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02eeef5-2729-4b19-84b7-a0059c569b5e-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:22.344932 master-0 kubenswrapper[33572]: I1204 22:37:22.344709 33572 generic.go:334] "Generic (PLEG): container finished" podID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerID="f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9" exitCode=0 Dec 04 22:37:22.344932 master-0 kubenswrapper[33572]: I1204 22:37:22.344762 33572 generic.go:334] "Generic (PLEG): container finished" podID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerID="bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9" exitCode=143 Dec 04 22:37:22.344932 master-0 kubenswrapper[33572]: I1204 22:37:22.344775 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:22.344932 master-0 kubenswrapper[33572]: I1204 22:37:22.344782 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e02eeef5-2729-4b19-84b7-a0059c569b5e","Type":"ContainerDied","Data":"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9"} Dec 04 22:37:22.350158 master-0 kubenswrapper[33572]: I1204 22:37:22.345588 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e02eeef5-2729-4b19-84b7-a0059c569b5e","Type":"ContainerDied","Data":"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9"} Dec 04 22:37:22.350158 master-0 kubenswrapper[33572]: I1204 22:37:22.345635 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e02eeef5-2729-4b19-84b7-a0059c569b5e","Type":"ContainerDied","Data":"e2145bfa0bea0fd7d22699ebe7d4290eb8641d5d8791d12ead916da6e87ed865"} Dec 04 22:37:22.350158 master-0 kubenswrapper[33572]: I1204 22:37:22.345666 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:22.350158 master-0 kubenswrapper[33572]: I1204 22:37:22.345665 33572 scope.go:117] "RemoveContainer" containerID="f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9" Dec 04 22:37:22.412127 master-0 kubenswrapper[33572]: I1204 22:37:22.409362 33572 scope.go:117] "RemoveContainer" containerID="bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9" Dec 04 22:37:22.453577 master-0 kubenswrapper[33572]: I1204 22:37:22.452845 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:22.479469 master-0 kubenswrapper[33572]: I1204 22:37:22.479406 33572 scope.go:117] "RemoveContainer" containerID="f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9" Dec 04 22:37:22.479977 master-0 kubenswrapper[33572]: E1204 22:37:22.479940 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9\": container with ID starting with f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9 not found: ID does not exist" containerID="f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9" Dec 04 22:37:22.480131 master-0 kubenswrapper[33572]: I1204 22:37:22.480084 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9"} err="failed to get container status \"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9\": rpc error: code = NotFound desc = could not find container \"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9\": container with ID starting with f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9 not found: ID does not exist" Dec 04 22:37:22.480221 master-0 kubenswrapper[33572]: I1204 22:37:22.480208 33572 scope.go:117] "RemoveContainer" containerID="bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9" Dec 04 22:37:22.480694 master-0 kubenswrapper[33572]: E1204 22:37:22.480647 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9\": container with ID starting with 
bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9 not found: ID does not exist" containerID="bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9" Dec 04 22:37:22.480764 master-0 kubenswrapper[33572]: I1204 22:37:22.480694 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9"} err="failed to get container status \"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9\": rpc error: code = NotFound desc = could not find container \"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9\": container with ID starting with bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9 not found: ID does not exist" Dec 04 22:37:22.480764 master-0 kubenswrapper[33572]: I1204 22:37:22.480720 33572 scope.go:117] "RemoveContainer" containerID="f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9" Dec 04 22:37:22.481095 master-0 kubenswrapper[33572]: I1204 22:37:22.481059 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9"} err="failed to get container status \"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9\": rpc error: code = NotFound desc = could not find container \"f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9\": container with ID starting with f23a946001c70232c3326eb2005d300954c7cebbb1c8fb360486b758194ee9d9 not found: ID does not exist" Dec 04 22:37:22.481095 master-0 kubenswrapper[33572]: I1204 22:37:22.481086 33572 scope.go:117] "RemoveContainer" containerID="bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9" Dec 04 22:37:22.481359 master-0 kubenswrapper[33572]: I1204 22:37:22.481327 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9"} err="failed to get container status \"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9\": rpc error: code = NotFound desc = could not find container \"bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9\": container with ID starting with bde357d880933290623e77f30a52f16894967b77f82bf8c4a27ec41993074fa9 not found: ID does not exist" Dec 04 22:37:22.482900 master-0 kubenswrapper[33572]: I1204 22:37:22.482864 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:22.492723 master-0 kubenswrapper[33572]: I1204 22:37:22.492539 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:22.493072 master-0 kubenswrapper[33572]: E1204 22:37:22.493025 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerName="nova-metadata-metadata" Dec 04 22:37:22.493072 master-0 kubenswrapper[33572]: I1204 22:37:22.493072 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerName="nova-metadata-metadata" Dec 04 22:37:22.493161 master-0 kubenswrapper[33572]: E1204 22:37:22.493123 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerName="nova-metadata-log" Dec 04 22:37:22.493161 master-0 kubenswrapper[33572]: I1204 22:37:22.493131 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" 
containerName="nova-metadata-log" Dec 04 22:37:22.493381 master-0 kubenswrapper[33572]: I1204 22:37:22.493352 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerName="nova-metadata-metadata" Dec 04 22:37:22.493438 master-0 kubenswrapper[33572]: I1204 22:37:22.493393 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" containerName="nova-metadata-log" Dec 04 22:37:22.494583 master-0 kubenswrapper[33572]: I1204 22:37:22.494551 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:22.496489 master-0 kubenswrapper[33572]: I1204 22:37:22.496455 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 22:37:22.496702 master-0 kubenswrapper[33572]: I1204 22:37:22.496656 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 22:37:22.506611 master-0 kubenswrapper[33572]: I1204 22:37:22.506542 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:22.538345 master-0 kubenswrapper[33572]: I1204 22:37:22.538268 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02eeef5-2729-4b19-84b7-a0059c569b5e" path="/var/lib/kubelet/pods/e02eeef5-2729-4b19-84b7-a0059c569b5e/volumes" Dec 04 22:37:22.594554 master-0 kubenswrapper[33572]: I1204 22:37:22.594366 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-config-data\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.594554 master-0 kubenswrapper[33572]: I1204 22:37:22.594484 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szzsd\" (UniqueName: \"kubernetes.io/projected/71c4600b-ad67-4111-bfae-0bc1eaad685c-kube-api-access-szzsd\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.594871 master-0 kubenswrapper[33572]: I1204 22:37:22.594778 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.594925 master-0 kubenswrapper[33572]: I1204 22:37:22.594892 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.595104 master-0 kubenswrapper[33572]: I1204 22:37:22.595034 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c4600b-ad67-4111-bfae-0bc1eaad685c-logs\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.696697 master-0 kubenswrapper[33572]: I1204 22:37:22.696626 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szzsd\" (UniqueName: \"kubernetes.io/projected/71c4600b-ad67-4111-bfae-0bc1eaad685c-kube-api-access-szzsd\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.696923 master-0 kubenswrapper[33572]: I1204 22:37:22.696744 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.696923 master-0 kubenswrapper[33572]: I1204 22:37:22.696778 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.696923 master-0 kubenswrapper[33572]: I1204 22:37:22.696825 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c4600b-ad67-4111-bfae-0bc1eaad685c-logs\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.696923 master-0 kubenswrapper[33572]: I1204 22:37:22.696922 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-config-data\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.697657 master-0 kubenswrapper[33572]: I1204 22:37:22.697625 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c4600b-ad67-4111-bfae-0bc1eaad685c-logs\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.700098 master-0 kubenswrapper[33572]: I1204 22:37:22.700066 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-config-data\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.700635 master-0 kubenswrapper[33572]: I1204 22:37:22.700596 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.700843 master-0 kubenswrapper[33572]: I1204 22:37:22.700796 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.729617 master-0 kubenswrapper[33572]: I1204 22:37:22.728299 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szzsd\" (UniqueName: \"kubernetes.io/projected/71c4600b-ad67-4111-bfae-0bc1eaad685c-kube-api-access-szzsd\") 
pod \"nova-metadata-0\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " pod="openstack/nova-metadata-0" Dec 04 22:37:22.904203 master-0 kubenswrapper[33572]: I1204 22:37:22.903943 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:37:23.060552 master-0 kubenswrapper[33572]: I1204 22:37:23.060462 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:23.207690 master-0 kubenswrapper[33572]: I1204 22:37:23.207605 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-combined-ca-bundle\") pod \"c57ec571-8d99-48b0-a466-78284aab8064\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " Dec 04 22:37:23.207913 master-0 kubenswrapper[33572]: I1204 22:37:23.207718 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvw8h\" (UniqueName: \"kubernetes.io/projected/c57ec571-8d99-48b0-a466-78284aab8064-kube-api-access-bvw8h\") pod \"c57ec571-8d99-48b0-a466-78284aab8064\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " Dec 04 22:37:23.207913 master-0 kubenswrapper[33572]: I1204 22:37:23.207779 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ec571-8d99-48b0-a466-78284aab8064-logs\") pod \"c57ec571-8d99-48b0-a466-78284aab8064\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " Dec 04 22:37:23.207913 master-0 kubenswrapper[33572]: I1204 22:37:23.207850 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-config-data\") pod \"c57ec571-8d99-48b0-a466-78284aab8064\" (UID: \"c57ec571-8d99-48b0-a466-78284aab8064\") " Dec 04 22:37:23.228448 master-0 kubenswrapper[33572]: I1204 22:37:23.228193 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c57ec571-8d99-48b0-a466-78284aab8064-logs" (OuterVolumeSpecName: "logs") pod "c57ec571-8d99-48b0-a466-78284aab8064" (UID: "c57ec571-8d99-48b0-a466-78284aab8064"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:37:23.229451 master-0 kubenswrapper[33572]: I1204 22:37:23.229390 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57ec571-8d99-48b0-a466-78284aab8064-kube-api-access-bvw8h" (OuterVolumeSpecName: "kube-api-access-bvw8h") pod "c57ec571-8d99-48b0-a466-78284aab8064" (UID: "c57ec571-8d99-48b0-a466-78284aab8064"). InnerVolumeSpecName "kube-api-access-bvw8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:23.246331 master-0 kubenswrapper[33572]: I1204 22:37:23.246272 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-config-data" (OuterVolumeSpecName: "config-data") pod "c57ec571-8d99-48b0-a466-78284aab8064" (UID: "c57ec571-8d99-48b0-a466-78284aab8064"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:23.253033 master-0 kubenswrapper[33572]: I1204 22:37:23.252944 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c57ec571-8d99-48b0-a466-78284aab8064" (UID: "c57ec571-8d99-48b0-a466-78284aab8064"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:23.312703 master-0 kubenswrapper[33572]: I1204 22:37:23.312604 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:23.312703 master-0 kubenswrapper[33572]: I1204 22:37:23.312696 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvw8h\" (UniqueName: \"kubernetes.io/projected/c57ec571-8d99-48b0-a466-78284aab8064-kube-api-access-bvw8h\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:23.312931 master-0 kubenswrapper[33572]: I1204 22:37:23.312729 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c57ec571-8d99-48b0-a466-78284aab8064-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:23.312931 master-0 kubenswrapper[33572]: I1204 22:37:23.312755 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c57ec571-8d99-48b0-a466-78284aab8064-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:23.381527 master-0 kubenswrapper[33572]: I1204 22:37:23.364735 33572 generic.go:334] "Generic (PLEG): container finished" podID="c57ec571-8d99-48b0-a466-78284aab8064" containerID="bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44" exitCode=0 Dec 04 22:37:23.381527 master-0 kubenswrapper[33572]: I1204 22:37:23.364793 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c57ec571-8d99-48b0-a466-78284aab8064","Type":"ContainerDied","Data":"bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44"} Dec 04 22:37:23.381527 master-0 kubenswrapper[33572]: I1204 22:37:23.364842 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:23.381527 master-0 kubenswrapper[33572]: I1204 22:37:23.364856 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c57ec571-8d99-48b0-a466-78284aab8064","Type":"ContainerDied","Data":"53f7e164cf886b918ece99286f666f1615b7d27073deec6d798906bd1c6095b9"} Dec 04 22:37:23.381527 master-0 kubenswrapper[33572]: I1204 22:37:23.364948 33572 scope.go:117] "RemoveContainer" containerID="bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44" Dec 04 22:37:23.381527 master-0 kubenswrapper[33572]: I1204 22:37:23.374518 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:23.381527 master-0 kubenswrapper[33572]: W1204 22:37:23.377546 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71c4600b_ad67_4111_bfae_0bc1eaad685c.slice/crio-af1d88af79948d5a5a27f891e7cf246ba072d0a915a982170ec74b977d30bb6d WatchSource:0}: Error finding container af1d88af79948d5a5a27f891e7cf246ba072d0a915a982170ec74b977d30bb6d: Status 404 returned error can't find the container with id af1d88af79948d5a5a27f891e7cf246ba072d0a915a982170ec74b977d30bb6d Dec 04 22:37:23.443303 master-0 kubenswrapper[33572]: I1204 22:37:23.443250 33572 scope.go:117] "RemoveContainer" containerID="2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818" Dec 04 22:37:23.461923 master-0 kubenswrapper[33572]: I1204 22:37:23.461817 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:23.480057 master-0 kubenswrapper[33572]: I1204 22:37:23.479973 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:23.499369 master-0 kubenswrapper[33572]: I1204 22:37:23.499124 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:23.500347 master-0 kubenswrapper[33572]: E1204 22:37:23.500292 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-log" Dec 04 22:37:23.500400 master-0 kubenswrapper[33572]: I1204 22:37:23.500348 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-log" Dec 04 22:37:23.500400 master-0 kubenswrapper[33572]: E1204 22:37:23.500379 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-api" Dec 04 22:37:23.500400 master-0 kubenswrapper[33572]: I1204 22:37:23.500398 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-api" Dec 04 22:37:23.501021 master-0 kubenswrapper[33572]: I1204 22:37:23.500973 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-log" Dec 04 22:37:23.501145 master-0 kubenswrapper[33572]: I1204 22:37:23.501113 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57ec571-8d99-48b0-a466-78284aab8064" containerName="nova-api-api" Dec 04 22:37:23.501443 master-0 kubenswrapper[33572]: I1204 22:37:23.501401 33572 scope.go:117] "RemoveContainer" containerID="bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44" Dec 04 22:37:23.503473 master-0 kubenswrapper[33572]: E1204 22:37:23.503439 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44\": container with ID starting with bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44 not found: ID does not exist" containerID="bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44" Dec 04 22:37:23.503554 master-0 kubenswrapper[33572]: I1204 22:37:23.503475 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44"} err="failed to get container status \"bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44\": rpc error: code = NotFound desc = could not find container \"bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44\": container with ID starting with bb57f89c46b7096721714c5cbede1672c8faf4ebe2257ec89a6ce666412e3a44 not found: ID does not exist" Dec 04 22:37:23.503554 master-0 kubenswrapper[33572]: I1204 22:37:23.503517 33572 scope.go:117] "RemoveContainer" containerID="2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818" Dec 04 22:37:23.504924 master-0 kubenswrapper[33572]: E1204 22:37:23.504647 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818\": container with ID starting with 2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818 not found: ID does not exist" containerID="2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818" Dec 04 22:37:23.504924 master-0 kubenswrapper[33572]: I1204 22:37:23.504672 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818"} err="failed to get container status \"2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818\": rpc error: code = NotFound desc = could not find container \"2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818\": container with ID starting with 2a1da397c2fca2222dd602f77a5f998e28f32fb0c988dee54c657ad163ef1818 not found: ID does not exist" Dec 04 22:37:23.506352 master-0 kubenswrapper[33572]: I1204 22:37:23.506203 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:23.520559 master-0 kubenswrapper[33572]: I1204 22:37:23.518228 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 22:37:23.520559 master-0 kubenswrapper[33572]: I1204 22:37:23.518337 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:23.623901 master-0 kubenswrapper[33572]: I1204 22:37:23.623673 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.624248 master-0 kubenswrapper[33572]: I1204 22:37:23.624120 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b9302a-dbf7-4625-872a-65098e149e69-logs\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.624248 master-0 kubenswrapper[33572]: I1204 22:37:23.624158 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfgw7\" (UniqueName: \"kubernetes.io/projected/b5b9302a-dbf7-4625-872a-65098e149e69-kube-api-access-pfgw7\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.624657 master-0 kubenswrapper[33572]: I1204 22:37:23.624630 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-config-data\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.716518 master-0 kubenswrapper[33572]: E1204 22:37:23.716377 33572 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6169c3bf4abef692c40f35f7b8f7af2e834631788ebf68c6323cfd0702aa65ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 22:37:23.719126 master-0 kubenswrapper[33572]: E1204 22:37:23.718947 33572 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6169c3bf4abef692c40f35f7b8f7af2e834631788ebf68c6323cfd0702aa65ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 22:37:23.721060 master-0 kubenswrapper[33572]: E1204 22:37:23.720975 33572 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6169c3bf4abef692c40f35f7b8f7af2e834631788ebf68c6323cfd0702aa65ba" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 22:37:23.721115 master-0 kubenswrapper[33572]: E1204 22:37:23.721089 33572 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="83a5bf22-49cc-4341-926c-62c129088b57" containerName="nova-scheduler-scheduler" Dec 04 
22:37:23.726692 master-0 kubenswrapper[33572]: I1204 22:37:23.726646 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b9302a-dbf7-4625-872a-65098e149e69-logs\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.726881 master-0 kubenswrapper[33572]: I1204 22:37:23.726708 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfgw7\" (UniqueName: \"kubernetes.io/projected/b5b9302a-dbf7-4625-872a-65098e149e69-kube-api-access-pfgw7\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.726881 master-0 kubenswrapper[33572]: I1204 22:37:23.726792 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-config-data\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.726881 master-0 kubenswrapper[33572]: I1204 22:37:23.726869 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.727313 master-0 kubenswrapper[33572]: I1204 22:37:23.727261 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b9302a-dbf7-4625-872a-65098e149e69-logs\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.732511 master-0 kubenswrapper[33572]: I1204 22:37:23.732458 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-config-data\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.733947 master-0 kubenswrapper[33572]: I1204 22:37:23.733543 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.747492 master-0 kubenswrapper[33572]: I1204 22:37:23.747451 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfgw7\" (UniqueName: \"kubernetes.io/projected/b5b9302a-dbf7-4625-872a-65098e149e69-kube-api-access-pfgw7\") pod \"nova-api-0\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " pod="openstack/nova-api-0" Dec 04 22:37:23.859309 master-0 kubenswrapper[33572]: I1204 22:37:23.859178 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:24.364588 master-0 kubenswrapper[33572]: I1204 22:37:24.364115 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:24.409046 master-0 kubenswrapper[33572]: I1204 22:37:24.408969 33572 generic.go:334] "Generic (PLEG): container finished" podID="83a5bf22-49cc-4341-926c-62c129088b57" containerID="6169c3bf4abef692c40f35f7b8f7af2e834631788ebf68c6323cfd0702aa65ba" exitCode=0 Dec 04 22:37:24.410230 master-0 kubenswrapper[33572]: I1204 22:37:24.409095 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"83a5bf22-49cc-4341-926c-62c129088b57","Type":"ContainerDied","Data":"6169c3bf4abef692c40f35f7b8f7af2e834631788ebf68c6323cfd0702aa65ba"} Dec 04 22:37:24.417856 master-0 kubenswrapper[33572]: I1204 22:37:24.414205 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71c4600b-ad67-4111-bfae-0bc1eaad685c","Type":"ContainerStarted","Data":"723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6"} Dec 04 22:37:24.417856 master-0 kubenswrapper[33572]: I1204 22:37:24.414249 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71c4600b-ad67-4111-bfae-0bc1eaad685c","Type":"ContainerStarted","Data":"847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a"} Dec 04 22:37:24.417856 master-0 kubenswrapper[33572]: I1204 22:37:24.414268 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71c4600b-ad67-4111-bfae-0bc1eaad685c","Type":"ContainerStarted","Data":"af1d88af79948d5a5a27f891e7cf246ba072d0a915a982170ec74b977d30bb6d"} Dec 04 22:37:24.471605 master-0 kubenswrapper[33572]: I1204 22:37:24.471153 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.471113467 podStartE2EDuration="2.471113467s" podCreationTimestamp="2025-12-04 22:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:24.462970065 +0000 UTC m=+1108.190495724" watchObservedRunningTime="2025-12-04 22:37:24.471113467 +0000 UTC m=+1108.198639146" Dec 04 22:37:24.542231 master-0 kubenswrapper[33572]: I1204 22:37:24.542165 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57ec571-8d99-48b0-a466-78284aab8064" path="/var/lib/kubelet/pods/c57ec571-8d99-48b0-a466-78284aab8064/volumes" Dec 04 22:37:24.579721 master-0 kubenswrapper[33572]: I1204 22:37:24.579684 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:37:24.759303 master-0 kubenswrapper[33572]: I1204 22:37:24.759254 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-config-data\") pod \"83a5bf22-49cc-4341-926c-62c129088b57\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " Dec 04 22:37:24.759427 master-0 kubenswrapper[33572]: I1204 22:37:24.759415 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-combined-ca-bundle\") pod \"83a5bf22-49cc-4341-926c-62c129088b57\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " Dec 04 22:37:24.759599 master-0 kubenswrapper[33572]: I1204 22:37:24.759579 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwnwr\" (UniqueName: \"kubernetes.io/projected/83a5bf22-49cc-4341-926c-62c129088b57-kube-api-access-mwnwr\") pod \"83a5bf22-49cc-4341-926c-62c129088b57\" (UID: \"83a5bf22-49cc-4341-926c-62c129088b57\") " Dec 04 22:37:24.766683 master-0 kubenswrapper[33572]: I1204 22:37:24.766620 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83a5bf22-49cc-4341-926c-62c129088b57-kube-api-access-mwnwr" (OuterVolumeSpecName: "kube-api-access-mwnwr") pod "83a5bf22-49cc-4341-926c-62c129088b57" (UID: "83a5bf22-49cc-4341-926c-62c129088b57"). InnerVolumeSpecName "kube-api-access-mwnwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:24.805000 master-0 kubenswrapper[33572]: I1204 22:37:24.804929 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83a5bf22-49cc-4341-926c-62c129088b57" (UID: "83a5bf22-49cc-4341-926c-62c129088b57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:24.810083 master-0 kubenswrapper[33572]: I1204 22:37:24.810026 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-config-data" (OuterVolumeSpecName: "config-data") pod "83a5bf22-49cc-4341-926c-62c129088b57" (UID: "83a5bf22-49cc-4341-926c-62c129088b57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:24.862298 master-0 kubenswrapper[33572]: I1204 22:37:24.862258 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:24.862298 master-0 kubenswrapper[33572]: I1204 22:37:24.862295 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwnwr\" (UniqueName: \"kubernetes.io/projected/83a5bf22-49cc-4341-926c-62c129088b57-kube-api-access-mwnwr\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:24.862298 master-0 kubenswrapper[33572]: I1204 22:37:24.862307 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83a5bf22-49cc-4341-926c-62c129088b57-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:25.428654 master-0 kubenswrapper[33572]: I1204 22:37:25.428522 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5b9302a-dbf7-4625-872a-65098e149e69","Type":"ContainerStarted","Data":"d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3"} Dec 04 22:37:25.428654 master-0 kubenswrapper[33572]: I1204 22:37:25.428617 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5b9302a-dbf7-4625-872a-65098e149e69","Type":"ContainerStarted","Data":"8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b"} Dec 04 22:37:25.428654 master-0 kubenswrapper[33572]: I1204 22:37:25.428633 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5b9302a-dbf7-4625-872a-65098e149e69","Type":"ContainerStarted","Data":"231f3b007a6057ddd5f8b09f31884fda95c684667f429536d50a29391da0364e"} Dec 04 22:37:25.431080 master-0 kubenswrapper[33572]: I1204 22:37:25.431027 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"83a5bf22-49cc-4341-926c-62c129088b57","Type":"ContainerDied","Data":"fb6d1aa0cc663471ba2b35792702435e5044c612b3aaabc375246afd90d94f4b"} Dec 04 22:37:25.431203 master-0 kubenswrapper[33572]: I1204 22:37:25.431104 33572 scope.go:117] "RemoveContainer" containerID="6169c3bf4abef692c40f35f7b8f7af2e834631788ebf68c6323cfd0702aa65ba" Dec 04 22:37:25.431203 master-0 kubenswrapper[33572]: I1204 22:37:25.431170 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:37:25.462626 master-0 kubenswrapper[33572]: I1204 22:37:25.462234 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.462213868 podStartE2EDuration="2.462213868s" podCreationTimestamp="2025-12-04 22:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:25.452680518 +0000 UTC m=+1109.180206187" watchObservedRunningTime="2025-12-04 22:37:25.462213868 +0000 UTC m=+1109.189739527" Dec 04 22:37:25.489080 master-0 kubenswrapper[33572]: I1204 22:37:25.489011 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:25.513811 master-0 kubenswrapper[33572]: I1204 22:37:25.513715 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:25.534764 master-0 kubenswrapper[33572]: I1204 22:37:25.534650 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:25.535351 master-0 kubenswrapper[33572]: E1204 22:37:25.535303 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83a5bf22-49cc-4341-926c-62c129088b57" containerName="nova-scheduler-scheduler" Dec 04 22:37:25.535351 master-0 kubenswrapper[33572]: I1204 22:37:25.535331 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a5bf22-49cc-4341-926c-62c129088b57" containerName="nova-scheduler-scheduler" Dec 04 22:37:25.535735 master-0 kubenswrapper[33572]: I1204 22:37:25.535687 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="83a5bf22-49cc-4341-926c-62c129088b57" containerName="nova-scheduler-scheduler" Dec 04 22:37:25.536725 master-0 kubenswrapper[33572]: I1204 22:37:25.536677 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:37:25.540055 master-0 kubenswrapper[33572]: I1204 22:37:25.539983 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 22:37:25.574640 master-0 kubenswrapper[33572]: I1204 22:37:25.574555 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:25.684667 master-0 kubenswrapper[33572]: I1204 22:37:25.684528 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.685155 master-0 kubenswrapper[33572]: I1204 22:37:25.685091 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfpxz\" (UniqueName: \"kubernetes.io/projected/bc905e0f-4124-457f-b7e8-9f22528f6b2e-kube-api-access-pfpxz\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.685951 master-0 kubenswrapper[33572]: I1204 22:37:25.685654 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.788216 master-0 kubenswrapper[33572]: I1204 22:37:25.788131 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.788528 master-0 kubenswrapper[33572]: I1204 22:37:25.788263 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.788528 master-0 kubenswrapper[33572]: I1204 22:37:25.788340 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfpxz\" (UniqueName: \"kubernetes.io/projected/bc905e0f-4124-457f-b7e8-9f22528f6b2e-kube-api-access-pfpxz\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.792123 master-0 kubenswrapper[33572]: I1204 22:37:25.792068 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-config-data\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.793205 master-0 kubenswrapper[33572]: I1204 22:37:25.793166 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.805519 master-0 kubenswrapper[33572]: 
I1204 22:37:25.805461 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfpxz\" (UniqueName: \"kubernetes.io/projected/bc905e0f-4124-457f-b7e8-9f22528f6b2e-kube-api-access-pfpxz\") pod \"nova-scheduler-0\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " pod="openstack/nova-scheduler-0" Dec 04 22:37:25.858443 master-0 kubenswrapper[33572]: I1204 22:37:25.858377 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:37:26.397942 master-0 kubenswrapper[33572]: W1204 22:37:26.397859 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc905e0f_4124_457f_b7e8_9f22528f6b2e.slice/crio-ae95a556cc81f8433c7a56393fb94bbb30d5db0333192506d8253fffdd187901 WatchSource:0}: Error finding container ae95a556cc81f8433c7a56393fb94bbb30d5db0333192506d8253fffdd187901: Status 404 returned error can't find the container with id ae95a556cc81f8433c7a56393fb94bbb30d5db0333192506d8253fffdd187901 Dec 04 22:37:26.409820 master-0 kubenswrapper[33572]: I1204 22:37:26.408494 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:26.453004 master-0 kubenswrapper[33572]: I1204 22:37:26.452914 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc905e0f-4124-457f-b7e8-9f22528f6b2e","Type":"ContainerStarted","Data":"ae95a556cc81f8433c7a56393fb94bbb30d5db0333192506d8253fffdd187901"} Dec 04 22:37:26.548641 master-0 kubenswrapper[33572]: I1204 22:37:26.548573 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83a5bf22-49cc-4341-926c-62c129088b57" path="/var/lib/kubelet/pods/83a5bf22-49cc-4341-926c-62c129088b57/volumes" Dec 04 22:37:27.481650 master-0 kubenswrapper[33572]: I1204 22:37:27.481480 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc905e0f-4124-457f-b7e8-9f22528f6b2e","Type":"ContainerStarted","Data":"d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b"} Dec 04 22:37:27.540346 master-0 kubenswrapper[33572]: I1204 22:37:27.540179 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.54014208 podStartE2EDuration="2.54014208s" podCreationTimestamp="2025-12-04 22:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:27.509067762 +0000 UTC m=+1111.236593411" watchObservedRunningTime="2025-12-04 22:37:27.54014208 +0000 UTC m=+1111.267667779" Dec 04 22:37:27.904439 master-0 kubenswrapper[33572]: I1204 22:37:27.904359 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 22:37:27.904710 master-0 kubenswrapper[33572]: I1204 22:37:27.904461 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 22:37:29.825094 master-0 kubenswrapper[33572]: I1204 22:37:29.824626 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 22:37:30.859117 master-0 kubenswrapper[33572]: I1204 22:37:30.859025 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 22:37:32.904414 master-0 kubenswrapper[33572]: I1204 22:37:32.904309 33572 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 22:37:32.905575 master-0 kubenswrapper[33572]: I1204 22:37:32.904429 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 22:37:33.860724 master-0 kubenswrapper[33572]: I1204 22:37:33.860656 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 22:37:33.860724 master-0 kubenswrapper[33572]: I1204 22:37:33.860730 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 22:37:33.920849 master-0 kubenswrapper[33572]: I1204 22:37:33.920765 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.9:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:37:33.921646 master-0 kubenswrapper[33572]: I1204 22:37:33.920924 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.9:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:37:34.944535 master-0 kubenswrapper[33572]: I1204 22:37:34.943749 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.10:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 22:37:34.944535 master-0 kubenswrapper[33572]: I1204 22:37:34.943743 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.10:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 22:37:35.859985 master-0 kubenswrapper[33572]: I1204 22:37:35.858975 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 22:37:35.897219 master-0 kubenswrapper[33572]: I1204 22:37:35.897172 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 22:37:36.730173 master-0 kubenswrapper[33572]: I1204 22:37:36.730070 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 22:37:38.361972 master-0 kubenswrapper[33572]: I1204 22:37:38.361921 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:38.483273 master-0 kubenswrapper[33572]: I1204 22:37:38.483179 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjj\" (UniqueName: \"kubernetes.io/projected/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-kube-api-access-fqsjj\") pod \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " Dec 04 22:37:38.483598 master-0 kubenswrapper[33572]: I1204 22:37:38.483563 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-combined-ca-bundle\") pod \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " Dec 04 22:37:38.483713 master-0 kubenswrapper[33572]: I1204 22:37:38.483646 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-config-data\") pod \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\" (UID: \"9904f70c-a3e6-41c4-85dd-078ab3d0cb70\") " Dec 04 22:37:38.490890 master-0 kubenswrapper[33572]: I1204 22:37:38.490808 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-kube-api-access-fqsjj" (OuterVolumeSpecName: "kube-api-access-fqsjj") pod "9904f70c-a3e6-41c4-85dd-078ab3d0cb70" (UID: "9904f70c-a3e6-41c4-85dd-078ab3d0cb70"). InnerVolumeSpecName "kube-api-access-fqsjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:38.542319 master-0 kubenswrapper[33572]: I1204 22:37:38.542232 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-config-data" (OuterVolumeSpecName: "config-data") pod "9904f70c-a3e6-41c4-85dd-078ab3d0cb70" (UID: "9904f70c-a3e6-41c4-85dd-078ab3d0cb70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:38.550777 master-0 kubenswrapper[33572]: I1204 22:37:38.550194 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9904f70c-a3e6-41c4-85dd-078ab3d0cb70" (UID: "9904f70c-a3e6-41c4-85dd-078ab3d0cb70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:38.586920 master-0 kubenswrapper[33572]: I1204 22:37:38.586837 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:38.586920 master-0 kubenswrapper[33572]: I1204 22:37:38.586920 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:38.587079 master-0 kubenswrapper[33572]: I1204 22:37:38.586942 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjj\" (UniqueName: \"kubernetes.io/projected/9904f70c-a3e6-41c4-85dd-078ab3d0cb70-kube-api-access-fqsjj\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:38.706484 master-0 kubenswrapper[33572]: I1204 22:37:38.706419 33572 generic.go:334] "Generic (PLEG): container finished" podID="9904f70c-a3e6-41c4-85dd-078ab3d0cb70" containerID="4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11" exitCode=137 Dec 04 22:37:38.706783 master-0 kubenswrapper[33572]: I1204 22:37:38.706484 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9904f70c-a3e6-41c4-85dd-078ab3d0cb70","Type":"ContainerDied","Data":"4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11"} Dec 04 22:37:38.706783 master-0 kubenswrapper[33572]: I1204 22:37:38.706578 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9904f70c-a3e6-41c4-85dd-078ab3d0cb70","Type":"ContainerDied","Data":"e9e488ab93f8344a12da7c1005046c93aa13b7caec9c51c3a01d7508403180e5"} Dec 04 22:37:38.706783 master-0 kubenswrapper[33572]: I1204 22:37:38.706609 33572 scope.go:117] "RemoveContainer" containerID="4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11" Dec 04 22:37:38.707012 master-0 kubenswrapper[33572]: I1204 22:37:38.706988 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:38.755665 master-0 kubenswrapper[33572]: I1204 22:37:38.755566 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:38.756824 master-0 kubenswrapper[33572]: I1204 22:37:38.756622 33572 scope.go:117] "RemoveContainer" containerID="4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11" Dec 04 22:37:38.757863 master-0 kubenswrapper[33572]: E1204 22:37:38.757796 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11\": container with ID starting with 4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11 not found: ID does not exist" containerID="4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11" Dec 04 22:37:38.757963 master-0 kubenswrapper[33572]: I1204 22:37:38.757881 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11"} err="failed to get container status \"4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11\": rpc error: code = NotFound desc = could not find container \"4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11\": container with ID starting with 4841e5411964a5b7c15c3619bfab23310fadcd9ea73bf3471574c5c5aacdac11 not found: ID does not exist" Dec 04 22:37:38.770245 master-0 kubenswrapper[33572]: I1204 22:37:38.770107 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:38.804203 master-0 kubenswrapper[33572]: I1204 22:37:38.804019 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:38.804633 master-0 kubenswrapper[33572]: E1204 22:37:38.804598 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9904f70c-a3e6-41c4-85dd-078ab3d0cb70" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 22:37:38.804633 master-0 kubenswrapper[33572]: I1204 22:37:38.804625 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9904f70c-a3e6-41c4-85dd-078ab3d0cb70" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 22:37:38.805006 master-0 kubenswrapper[33572]: I1204 22:37:38.804974 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9904f70c-a3e6-41c4-85dd-078ab3d0cb70" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 22:37:38.806073 master-0 kubenswrapper[33572]: I1204 22:37:38.806031 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:38.808396 master-0 kubenswrapper[33572]: I1204 22:37:38.808353 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 22:37:38.809628 master-0 kubenswrapper[33572]: I1204 22:37:38.809596 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 04 22:37:38.810251 master-0 kubenswrapper[33572]: I1204 22:37:38.810211 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 04 22:37:38.823196 master-0 kubenswrapper[33572]: I1204 22:37:38.823096 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:38.997443 master-0 kubenswrapper[33572]: I1204 22:37:38.997362 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:38.997443 master-0 kubenswrapper[33572]: I1204 22:37:38.997459 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:38.997816 master-0 kubenswrapper[33572]: I1204 22:37:38.997610 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:38.997816 master-0 kubenswrapper[33572]: I1204 22:37:38.997673 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76prk\" (UniqueName: \"kubernetes.io/projected/f844d5cd-b1d3-49a9-9de1-15d959de193d-kube-api-access-76prk\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:38.997816 master-0 kubenswrapper[33572]: I1204 22:37:38.997768 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.101467 master-0 kubenswrapper[33572]: I1204 22:37:39.100778 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.101467 master-0 kubenswrapper[33572]: I1204 22:37:39.100889 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.101467 master-0 kubenswrapper[33572]: I1204 22:37:39.100955 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.101467 master-0 kubenswrapper[33572]: I1204 22:37:39.100997 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76prk\" (UniqueName: \"kubernetes.io/projected/f844d5cd-b1d3-49a9-9de1-15d959de193d-kube-api-access-76prk\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.101467 master-0 kubenswrapper[33572]: I1204 22:37:39.101077 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.106715 master-0 kubenswrapper[33572]: I1204 22:37:39.106651 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.107114 master-0 kubenswrapper[33572]: I1204 22:37:39.107039 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.108447 master-0 kubenswrapper[33572]: I1204 22:37:39.108386 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.111632 master-0 kubenswrapper[33572]: I1204 22:37:39.110765 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f844d5cd-b1d3-49a9-9de1-15d959de193d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.122429 master-0 kubenswrapper[33572]: I1204 22:37:39.122290 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76prk\" (UniqueName: \"kubernetes.io/projected/f844d5cd-b1d3-49a9-9de1-15d959de193d-kube-api-access-76prk\") pod \"nova-cell1-novncproxy-0\" (UID: \"f844d5cd-b1d3-49a9-9de1-15d959de193d\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.141571 master-0 kubenswrapper[33572]: I1204 22:37:39.141406 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:39.722677 master-0 kubenswrapper[33572]: W1204 22:37:39.722552 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf844d5cd_b1d3_49a9_9de1_15d959de193d.slice/crio-76f6bb5c8a184076c05fdfa2a89fe28c6b10c0c948abcf03a33609eb3651ee8a WatchSource:0}: Error finding container 76f6bb5c8a184076c05fdfa2a89fe28c6b10c0c948abcf03a33609eb3651ee8a: Status 404 returned error can't find the container with id 76f6bb5c8a184076c05fdfa2a89fe28c6b10c0c948abcf03a33609eb3651ee8a Dec 04 22:37:39.744277 master-0 kubenswrapper[33572]: I1204 22:37:39.744217 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 22:37:40.551916 master-0 kubenswrapper[33572]: I1204 22:37:40.551635 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9904f70c-a3e6-41c4-85dd-078ab3d0cb70" path="/var/lib/kubelet/pods/9904f70c-a3e6-41c4-85dd-078ab3d0cb70/volumes" Dec 04 22:37:40.771793 master-0 kubenswrapper[33572]: I1204 22:37:40.771653 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f844d5cd-b1d3-49a9-9de1-15d959de193d","Type":"ContainerStarted","Data":"bf930b4d3cf39beadbbf3671a9d8a1627c303e9928be6a496a205c67964165fc"} Dec 04 22:37:40.771793 master-0 kubenswrapper[33572]: I1204 22:37:40.771784 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f844d5cd-b1d3-49a9-9de1-15d959de193d","Type":"ContainerStarted","Data":"76f6bb5c8a184076c05fdfa2a89fe28c6b10c0c948abcf03a33609eb3651ee8a"} Dec 04 22:37:40.817719 master-0 kubenswrapper[33572]: I1204 22:37:40.817443 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.817409014 podStartE2EDuration="2.817409014s" podCreationTimestamp="2025-12-04 22:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:40.80588102 +0000 UTC m=+1124.533406749" watchObservedRunningTime="2025-12-04 22:37:40.817409014 +0000 UTC m=+1124.544934693" Dec 04 22:37:41.791885 master-0 kubenswrapper[33572]: I1204 22:37:41.791195 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Dec 04 22:37:41.826669 master-0 kubenswrapper[33572]: I1204 22:37:41.826587 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Dec 04 22:37:41.834563 master-0 kubenswrapper[33572]: I1204 22:37:41.834395 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Dec 04 22:37:42.911378 master-0 kubenswrapper[33572]: I1204 22:37:42.909876 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 22:37:42.913938 master-0 kubenswrapper[33572]: I1204 22:37:42.913888 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 22:37:42.920872 master-0 kubenswrapper[33572]: I1204 22:37:42.920835 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 22:37:43.831179 master-0 kubenswrapper[33572]: I1204 22:37:43.831083 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Dec 04 22:37:43.866131 master-0 kubenswrapper[33572]: I1204 22:37:43.866044 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 22:37:43.866440 master-0 kubenswrapper[33572]: I1204 22:37:43.866343 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 22:37:43.867633 master-0 kubenswrapper[33572]: I1204 22:37:43.867486 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 22:37:43.867633 master-0 kubenswrapper[33572]: I1204 22:37:43.867600 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 22:37:43.874949 master-0 kubenswrapper[33572]: I1204 22:37:43.874886 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 22:37:43.881560 master-0 kubenswrapper[33572]: I1204 22:37:43.880008 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 22:37:44.144907 master-0 kubenswrapper[33572]: I1204 22:37:44.142462 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:44.191150 master-0 kubenswrapper[33572]: I1204 22:37:44.190278 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-866669bb59-4m7ck"] Dec 04 22:37:44.195718 master-0 kubenswrapper[33572]: I1204 22:37:44.194365 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.206075 master-0 kubenswrapper[33572]: I1204 22:37:44.206009 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-866669bb59-4m7ck"] Dec 04 22:37:44.245649 master-0 kubenswrapper[33572]: I1204 22:37:44.245489 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-ovsdbserver-sb\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.245917 master-0 kubenswrapper[33572]: I1204 22:37:44.245871 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-ovsdbserver-nb\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.246097 master-0 kubenswrapper[33572]: I1204 22:37:44.246051 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-dns-swift-storage-0\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.246230 master-0 kubenswrapper[33572]: I1204 22:37:44.246205 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxxw\" (UniqueName: \"kubernetes.io/projected/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-kube-api-access-pzxxw\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 
22:37:44.246297 master-0 kubenswrapper[33572]: I1204 22:37:44.246281 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-dns-svc\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.246378 master-0 kubenswrapper[33572]: I1204 22:37:44.246360 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-config\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.348244 master-0 kubenswrapper[33572]: I1204 22:37:44.348180 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-ovsdbserver-nb\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.348640 master-0 kubenswrapper[33572]: I1204 22:37:44.348284 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-dns-swift-storage-0\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.348640 master-0 kubenswrapper[33572]: I1204 22:37:44.348338 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxxw\" (UniqueName: \"kubernetes.io/projected/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-kube-api-access-pzxxw\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.348640 master-0 kubenswrapper[33572]: I1204 22:37:44.348373 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-dns-svc\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.348640 master-0 kubenswrapper[33572]: I1204 22:37:44.348400 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-config\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.348640 master-0 kubenswrapper[33572]: I1204 22:37:44.348455 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-ovsdbserver-sb\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.349403 master-0 kubenswrapper[33572]: I1204 22:37:44.349352 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-ovsdbserver-sb\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: 
\"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.350273 master-0 kubenswrapper[33572]: I1204 22:37:44.350222 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-ovsdbserver-nb\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.351247 master-0 kubenswrapper[33572]: I1204 22:37:44.351201 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-dns-swift-storage-0\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.352212 master-0 kubenswrapper[33572]: I1204 22:37:44.352182 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-dns-svc\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.352812 master-0 kubenswrapper[33572]: I1204 22:37:44.352751 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-config\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.369482 master-0 kubenswrapper[33572]: I1204 22:37:44.369404 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxxw\" (UniqueName: \"kubernetes.io/projected/0f7f6956-2a97-48e4-80df-e4ce247cc4ad-kube-api-access-pzxxw\") pod \"dnsmasq-dns-866669bb59-4m7ck\" (UID: \"0f7f6956-2a97-48e4-80df-e4ce247cc4ad\") " pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:44.530574 master-0 kubenswrapper[33572]: I1204 22:37:44.530460 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:45.129287 master-0 kubenswrapper[33572]: I1204 22:37:45.128572 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-866669bb59-4m7ck"] Dec 04 22:37:45.861424 master-0 kubenswrapper[33572]: I1204 22:37:45.861338 33572 generic.go:334] "Generic (PLEG): container finished" podID="0f7f6956-2a97-48e4-80df-e4ce247cc4ad" containerID="3bfe54d4a262b61c7b690567637051bb4f408c59ce014616160294779a219545" exitCode=0 Dec 04 22:37:45.865309 master-0 kubenswrapper[33572]: I1204 22:37:45.861406 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-866669bb59-4m7ck" event={"ID":"0f7f6956-2a97-48e4-80df-e4ce247cc4ad","Type":"ContainerDied","Data":"3bfe54d4a262b61c7b690567637051bb4f408c59ce014616160294779a219545"} Dec 04 22:37:45.865309 master-0 kubenswrapper[33572]: I1204 22:37:45.861532 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-866669bb59-4m7ck" event={"ID":"0f7f6956-2a97-48e4-80df-e4ce247cc4ad","Type":"ContainerStarted","Data":"f7e1a7674df038ed9c3947beb472532f76ca6fa4621a17bf4984f39a8b52be92"} Dec 04 22:37:46.409342 master-0 kubenswrapper[33572]: I1204 22:37:46.409002 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:46.881365 master-0 kubenswrapper[33572]: I1204 22:37:46.881298 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-866669bb59-4m7ck" event={"ID":"0f7f6956-2a97-48e4-80df-e4ce247cc4ad","Type":"ContainerStarted","Data":"e3b330c50cd617adfbd5369ecd98482763a840712c688c309e006bace570710c"} Dec 04 22:37:46.881999 master-0 kubenswrapper[33572]: I1204 22:37:46.881535 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-log" containerID="cri-o://8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b" gracePeriod=30 Dec 04 22:37:46.881999 master-0 kubenswrapper[33572]: I1204 22:37:46.881659 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-api" containerID="cri-o://d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3" gracePeriod=30 Dec 04 22:37:46.936114 master-0 kubenswrapper[33572]: I1204 22:37:46.931344 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-866669bb59-4m7ck" podStartSLOduration=2.931322291 podStartE2EDuration="2.931322291s" podCreationTimestamp="2025-12-04 22:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:46.908794367 +0000 UTC m=+1130.636320056" watchObservedRunningTime="2025-12-04 22:37:46.931322291 +0000 UTC m=+1130.658847940" Dec 04 22:37:47.902651 master-0 kubenswrapper[33572]: I1204 22:37:47.902573 33572 generic.go:334] "Generic (PLEG): container finished" podID="b5b9302a-dbf7-4625-872a-65098e149e69" containerID="8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b" exitCode=143 Dec 04 22:37:47.903583 master-0 kubenswrapper[33572]: I1204 22:37:47.902713 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5b9302a-dbf7-4625-872a-65098e149e69","Type":"ContainerDied","Data":"8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b"} Dec 04 
22:37:47.903583 master-0 kubenswrapper[33572]: I1204 22:37:47.903038 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:49.142413 master-0 kubenswrapper[33572]: I1204 22:37:49.142183 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:49.178075 master-0 kubenswrapper[33572]: I1204 22:37:49.178001 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:49.963195 master-0 kubenswrapper[33572]: I1204 22:37:49.963111 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 22:37:50.212281 master-0 kubenswrapper[33572]: I1204 22:37:50.212209 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ks44t"] Dec 04 22:37:50.229113 master-0 kubenswrapper[33572]: I1204 22:37:50.229049 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.234332 master-0 kubenswrapper[33572]: I1204 22:37:50.234288 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 22:37:50.234857 master-0 kubenswrapper[33572]: I1204 22:37:50.234800 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-j8m8x"] Dec 04 22:37:50.235095 master-0 kubenswrapper[33572]: I1204 22:37:50.235070 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 22:37:50.236461 master-0 kubenswrapper[33572]: I1204 22:37:50.236422 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.254868 master-0 kubenswrapper[33572]: I1204 22:37:50.254795 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ks44t"] Dec 04 22:37:50.272647 master-0 kubenswrapper[33572]: I1204 22:37:50.272587 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-j8m8x"] Dec 04 22:37:50.360802 master-0 kubenswrapper[33572]: I1204 22:37:50.360735 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rc6z\" (UniqueName: \"kubernetes.io/projected/6fe4a826-dcad-4726-9545-a2f208036313-kube-api-access-6rc6z\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.361160 master-0 kubenswrapper[33572]: I1204 22:37:50.360830 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbz2r\" (UniqueName: \"kubernetes.io/projected/e36abc43-b5d9-41be-b4f3-c90c72b43065-kube-api-access-wbz2r\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.361160 master-0 kubenswrapper[33572]: I1204 22:37:50.360962 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-config-data\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.361160 master-0 kubenswrapper[33572]: I1204 22:37:50.361121 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.361690 master-0 kubenswrapper[33572]: I1204 22:37:50.361191 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-combined-ca-bundle\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.361690 master-0 kubenswrapper[33572]: I1204 22:37:50.361225 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-config-data\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.361690 master-0 kubenswrapper[33572]: I1204 22:37:50.361257 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-scripts\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.361690 master-0 kubenswrapper[33572]: I1204 22:37:50.361303 33572 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-scripts\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.465231 master-0 kubenswrapper[33572]: I1204 22:37:50.465104 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-config-data\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.469188 master-0 kubenswrapper[33572]: I1204 22:37:50.468401 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.469188 master-0 kubenswrapper[33572]: I1204 22:37:50.468549 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-config-data\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.469188 master-0 kubenswrapper[33572]: I1204 22:37:50.468586 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-combined-ca-bundle\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.469188 master-0 kubenswrapper[33572]: I1204 22:37:50.468613 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-scripts\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.469188 master-0 kubenswrapper[33572]: I1204 22:37:50.468659 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-scripts\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.469188 master-0 kubenswrapper[33572]: I1204 22:37:50.468804 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rc6z\" (UniqueName: \"kubernetes.io/projected/6fe4a826-dcad-4726-9545-a2f208036313-kube-api-access-6rc6z\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.469816 master-0 kubenswrapper[33572]: I1204 22:37:50.468873 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbz2r\" (UniqueName: \"kubernetes.io/projected/e36abc43-b5d9-41be-b4f3-c90c72b43065-kube-api-access-wbz2r\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 
22:37:50.472637 master-0 kubenswrapper[33572]: I1204 22:37:50.472334 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-config-data\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.472637 master-0 kubenswrapper[33572]: I1204 22:37:50.472580 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-combined-ca-bundle\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.474894 master-0 kubenswrapper[33572]: I1204 22:37:50.474800 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.475785 master-0 kubenswrapper[33572]: I1204 22:37:50.475732 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-scripts\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.476658 master-0 kubenswrapper[33572]: I1204 22:37:50.476621 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-config-data\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.476751 master-0 kubenswrapper[33572]: I1204 22:37:50.476706 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-scripts\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.486379 master-0 kubenswrapper[33572]: I1204 22:37:50.486280 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbz2r\" (UniqueName: \"kubernetes.io/projected/e36abc43-b5d9-41be-b4f3-c90c72b43065-kube-api-access-wbz2r\") pod \"nova-cell1-host-discover-j8m8x\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.489005 master-0 kubenswrapper[33572]: I1204 22:37:50.488917 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rc6z\" (UniqueName: \"kubernetes.io/projected/6fe4a826-dcad-4726-9545-a2f208036313-kube-api-access-6rc6z\") pod \"nova-cell1-cell-mapping-ks44t\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.568109 master-0 kubenswrapper[33572]: I1204 22:37:50.567216 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:50.573731 master-0 kubenswrapper[33572]: I1204 22:37:50.573686 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:50.595145 master-0 kubenswrapper[33572]: I1204 22:37:50.595086 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:50.776356 master-0 kubenswrapper[33572]: I1204 22:37:50.776308 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-combined-ca-bundle\") pod \"b5b9302a-dbf7-4625-872a-65098e149e69\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " Dec 04 22:37:50.776472 master-0 kubenswrapper[33572]: I1204 22:37:50.776454 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b9302a-dbf7-4625-872a-65098e149e69-logs\") pod \"b5b9302a-dbf7-4625-872a-65098e149e69\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " Dec 04 22:37:50.776532 master-0 kubenswrapper[33572]: I1204 22:37:50.776487 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfgw7\" (UniqueName: \"kubernetes.io/projected/b5b9302a-dbf7-4625-872a-65098e149e69-kube-api-access-pfgw7\") pod \"b5b9302a-dbf7-4625-872a-65098e149e69\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " Dec 04 22:37:50.776532 master-0 kubenswrapper[33572]: I1204 22:37:50.776521 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-config-data\") pod \"b5b9302a-dbf7-4625-872a-65098e149e69\" (UID: \"b5b9302a-dbf7-4625-872a-65098e149e69\") " Dec 04 22:37:50.777828 master-0 kubenswrapper[33572]: I1204 22:37:50.777789 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b9302a-dbf7-4625-872a-65098e149e69-logs" (OuterVolumeSpecName: "logs") pod "b5b9302a-dbf7-4625-872a-65098e149e69" (UID: "b5b9302a-dbf7-4625-872a-65098e149e69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:37:50.780820 master-0 kubenswrapper[33572]: I1204 22:37:50.780768 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b9302a-dbf7-4625-872a-65098e149e69-kube-api-access-pfgw7" (OuterVolumeSpecName: "kube-api-access-pfgw7") pod "b5b9302a-dbf7-4625-872a-65098e149e69" (UID: "b5b9302a-dbf7-4625-872a-65098e149e69"). InnerVolumeSpecName "kube-api-access-pfgw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:50.820226 master-0 kubenswrapper[33572]: I1204 22:37:50.820172 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-config-data" (OuterVolumeSpecName: "config-data") pod "b5b9302a-dbf7-4625-872a-65098e149e69" (UID: "b5b9302a-dbf7-4625-872a-65098e149e69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:50.858929 master-0 kubenswrapper[33572]: I1204 22:37:50.858868 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5b9302a-dbf7-4625-872a-65098e149e69" (UID: "b5b9302a-dbf7-4625-872a-65098e149e69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:50.880537 master-0 kubenswrapper[33572]: I1204 22:37:50.880142 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:50.880537 master-0 kubenswrapper[33572]: I1204 22:37:50.880188 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b9302a-dbf7-4625-872a-65098e149e69-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:50.880537 master-0 kubenswrapper[33572]: I1204 22:37:50.880198 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfgw7\" (UniqueName: \"kubernetes.io/projected/b5b9302a-dbf7-4625-872a-65098e149e69-kube-api-access-pfgw7\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:50.880537 master-0 kubenswrapper[33572]: I1204 22:37:50.880210 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b9302a-dbf7-4625-872a-65098e149e69-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:50.953236 master-0 kubenswrapper[33572]: I1204 22:37:50.953158 33572 generic.go:334] "Generic (PLEG): container finished" podID="b5b9302a-dbf7-4625-872a-65098e149e69" containerID="d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3" exitCode=0 Dec 04 22:37:50.953433 master-0 kubenswrapper[33572]: I1204 22:37:50.953347 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5b9302a-dbf7-4625-872a-65098e149e69","Type":"ContainerDied","Data":"d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3"} Dec 04 22:37:50.953433 master-0 kubenswrapper[33572]: I1204 22:37:50.953397 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5b9302a-dbf7-4625-872a-65098e149e69","Type":"ContainerDied","Data":"231f3b007a6057ddd5f8b09f31884fda95c684667f429536d50a29391da0364e"} Dec 04 22:37:50.953433 master-0 kubenswrapper[33572]: I1204 22:37:50.953414 33572 scope.go:117] "RemoveContainer" containerID="d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3" Dec 04 22:37:50.953609 master-0 kubenswrapper[33572]: I1204 22:37:50.953371 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:50.982397 master-0 kubenswrapper[33572]: I1204 22:37:50.982123 33572 scope.go:117] "RemoveContainer" containerID="8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b" Dec 04 22:37:51.033530 master-0 kubenswrapper[33572]: I1204 22:37:51.033312 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:51.037556 master-0 kubenswrapper[33572]: I1204 22:37:51.037032 33572 scope.go:117] "RemoveContainer" containerID="d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3" Dec 04 22:37:51.038142 master-0 kubenswrapper[33572]: E1204 22:37:51.038071 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3\": container with ID starting with d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3 not found: ID does not exist" containerID="d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3" Dec 04 22:37:51.038142 master-0 kubenswrapper[33572]: I1204 22:37:51.038130 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3"} err="failed to get container status \"d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3\": rpc error: code = NotFound desc = could not find container \"d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3\": container with ID starting with d1331e4cdbe441bcfd4077196812ecde7e503ec3cfe932c629e58c1bf33e3ca3 not found: ID does not exist" Dec 04 22:37:51.038309 master-0 kubenswrapper[33572]: I1204 22:37:51.038151 33572 scope.go:117] "RemoveContainer" containerID="8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b" Dec 04 22:37:51.052436 master-0 kubenswrapper[33572]: E1204 22:37:51.041720 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b\": container with ID starting with 8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b not found: ID does not exist" containerID="8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b" Dec 04 22:37:51.052436 master-0 kubenswrapper[33572]: I1204 22:37:51.041788 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b"} err="failed to get container status \"8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b\": rpc error: code = NotFound desc = could not find container \"8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b\": container with ID starting with 8d10899dd7ab6e280d8cbb165efbdb3d6a96661805937a634df6f03d7293c39b not found: ID does not exist" Dec 04 22:37:51.068288 master-0 kubenswrapper[33572]: I1204 22:37:51.068092 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:51.070949 master-0 kubenswrapper[33572]: I1204 22:37:51.070885 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:51.071991 master-0 kubenswrapper[33572]: E1204 22:37:51.071923 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-api" Dec 04 22:37:51.071991 master-0 kubenswrapper[33572]: 
I1204 22:37:51.071975 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-api" Dec 04 22:37:51.072191 master-0 kubenswrapper[33572]: E1204 22:37:51.072095 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-log" Dec 04 22:37:51.072191 master-0 kubenswrapper[33572]: I1204 22:37:51.072149 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-log" Dec 04 22:37:51.072475 master-0 kubenswrapper[33572]: I1204 22:37:51.072433 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-log" Dec 04 22:37:51.072741 master-0 kubenswrapper[33572]: I1204 22:37:51.072515 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" containerName="nova-api-api" Dec 04 22:37:51.077078 master-0 kubenswrapper[33572]: I1204 22:37:51.077027 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:51.079492 master-0 kubenswrapper[33572]: I1204 22:37:51.079440 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 22:37:51.079874 master-0 kubenswrapper[33572]: I1204 22:37:51.079839 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 22:37:51.080079 master-0 kubenswrapper[33572]: I1204 22:37:51.080046 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 22:37:51.088193 master-0 kubenswrapper[33572]: I1204 22:37:51.088144 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:51.121537 master-0 kubenswrapper[33572]: I1204 22:37:51.118851 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ks44t"] Dec 04 22:37:51.190064 master-0 kubenswrapper[33572]: I1204 22:37:51.190018 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.190162 master-0 kubenswrapper[33572]: I1204 22:37:51.190141 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.190310 master-0 kubenswrapper[33572]: I1204 22:37:51.190283 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6xg9\" (UniqueName: \"kubernetes.io/projected/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-kube-api-access-n6xg9\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.190432 master-0 kubenswrapper[33572]: I1204 22:37:51.190409 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-config-data\") pod \"nova-api-0\" (UID: 
\"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.190473 master-0 kubenswrapper[33572]: I1204 22:37:51.190435 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-logs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.190535 master-0 kubenswrapper[33572]: I1204 22:37:51.190472 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-public-tls-certs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.236419 master-0 kubenswrapper[33572]: I1204 22:37:51.236355 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-j8m8x"] Dec 04 22:37:51.292372 master-0 kubenswrapper[33572]: I1204 22:37:51.292156 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.292372 master-0 kubenswrapper[33572]: I1204 22:37:51.292269 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.292372 master-0 kubenswrapper[33572]: I1204 22:37:51.292322 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6xg9\" (UniqueName: \"kubernetes.io/projected/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-kube-api-access-n6xg9\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.292624 master-0 kubenswrapper[33572]: I1204 22:37:51.292543 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-config-data\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.292624 master-0 kubenswrapper[33572]: I1204 22:37:51.292567 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-logs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.292624 master-0 kubenswrapper[33572]: I1204 22:37:51.292609 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-public-tls-certs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.293924 master-0 kubenswrapper[33572]: I1204 22:37:51.293868 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-logs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 
22:37:51.295926 master-0 kubenswrapper[33572]: I1204 22:37:51.295892 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-public-tls-certs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.296435 master-0 kubenswrapper[33572]: I1204 22:37:51.296402 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.297887 master-0 kubenswrapper[33572]: I1204 22:37:51.297834 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.298458 master-0 kubenswrapper[33572]: I1204 22:37:51.298439 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-config-data\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.307623 master-0 kubenswrapper[33572]: I1204 22:37:51.307586 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6xg9\" (UniqueName: \"kubernetes.io/projected/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-kube-api-access-n6xg9\") pod \"nova-api-0\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " pod="openstack/nova-api-0" Dec 04 22:37:51.444657 master-0 kubenswrapper[33572]: I1204 22:37:51.444556 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:37:51.952219 master-0 kubenswrapper[33572]: I1204 22:37:51.952130 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:52.005703 master-0 kubenswrapper[33572]: I1204 22:37:52.005612 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ks44t" event={"ID":"6fe4a826-dcad-4726-9545-a2f208036313","Type":"ContainerStarted","Data":"e1e47bf8f98f63f3437bfb5a44c506f051d1792be8fc6c07e683ee32ef14498f"} Dec 04 22:37:52.005703 master-0 kubenswrapper[33572]: I1204 22:37:52.005699 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ks44t" event={"ID":"6fe4a826-dcad-4726-9545-a2f208036313","Type":"ContainerStarted","Data":"36c73a49088308d881bf257e1b0f327e084d604a9878b0b4ca248ca8fc0dd4ed"} Dec 04 22:37:52.012148 master-0 kubenswrapper[33572]: I1204 22:37:52.012013 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-j8m8x" event={"ID":"e36abc43-b5d9-41be-b4f3-c90c72b43065","Type":"ContainerStarted","Data":"edc92f41f7b8ffc4163eafadb32c5be365030d626372bdf7dd6922f06d933a18"} Dec 04 22:37:52.012148 master-0 kubenswrapper[33572]: I1204 22:37:52.012071 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-j8m8x" event={"ID":"e36abc43-b5d9-41be-b4f3-c90c72b43065","Type":"ContainerStarted","Data":"871e2f9a787aceeab0a3068afc35f6d12bd094376ab79a1ec2fb441e094a1143"} Dec 04 22:37:52.041440 master-0 kubenswrapper[33572]: I1204 22:37:52.041346 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ks44t" podStartSLOduration=2.041321308 podStartE2EDuration="2.041321308s" podCreationTimestamp="2025-12-04 22:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:52.02712364 +0000 UTC m=+1135.754649279" watchObservedRunningTime="2025-12-04 22:37:52.041321308 +0000 UTC m=+1135.768846957" Dec 04 22:37:52.062661 master-0 kubenswrapper[33572]: I1204 22:37:52.062559 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-j8m8x" podStartSLOduration=2.062536767 podStartE2EDuration="2.062536767s" podCreationTimestamp="2025-12-04 22:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:52.049017918 +0000 UTC m=+1135.776543577" watchObservedRunningTime="2025-12-04 22:37:52.062536767 +0000 UTC m=+1135.790062426" Dec 04 22:37:52.549151 master-0 kubenswrapper[33572]: I1204 22:37:52.549074 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b9302a-dbf7-4625-872a-65098e149e69" path="/var/lib/kubelet/pods/b5b9302a-dbf7-4625-872a-65098e149e69/volumes" Dec 04 22:37:53.029607 master-0 kubenswrapper[33572]: I1204 22:37:53.029479 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a6824c-42b7-4b24-b2dc-97e2bd56e59b","Type":"ContainerStarted","Data":"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588"} Dec 04 22:37:53.029607 master-0 kubenswrapper[33572]: I1204 22:37:53.029595 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"10a6824c-42b7-4b24-b2dc-97e2bd56e59b","Type":"ContainerStarted","Data":"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda"} Dec 04 22:37:53.029960 master-0 kubenswrapper[33572]: I1204 22:37:53.029620 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a6824c-42b7-4b24-b2dc-97e2bd56e59b","Type":"ContainerStarted","Data":"8f6e3459bc5a5e582531f221a8ea101bc46ef7b73946699467649a3c97a34dd3"} Dec 04 22:37:53.065150 master-0 kubenswrapper[33572]: I1204 22:37:53.065009 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.064974216 podStartE2EDuration="2.064974216s" podCreationTimestamp="2025-12-04 22:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:37:53.060968577 +0000 UTC m=+1136.788494266" watchObservedRunningTime="2025-12-04 22:37:53.064974216 +0000 UTC m=+1136.792499925" Dec 04 22:37:54.047149 master-0 kubenswrapper[33572]: I1204 22:37:54.047064 33572 generic.go:334] "Generic (PLEG): container finished" podID="e36abc43-b5d9-41be-b4f3-c90c72b43065" containerID="edc92f41f7b8ffc4163eafadb32c5be365030d626372bdf7dd6922f06d933a18" exitCode=0 Dec 04 22:37:54.047892 master-0 kubenswrapper[33572]: I1204 22:37:54.047159 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-j8m8x" event={"ID":"e36abc43-b5d9-41be-b4f3-c90c72b43065","Type":"ContainerDied","Data":"edc92f41f7b8ffc4163eafadb32c5be365030d626372bdf7dd6922f06d933a18"} Dec 04 22:37:54.577398 master-0 kubenswrapper[33572]: I1204 22:37:54.577318 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-866669bb59-4m7ck" Dec 04 22:37:54.702539 master-0 kubenswrapper[33572]: I1204 22:37:54.702446 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b46df4b9-2z4nn"] Dec 04 22:37:54.702913 master-0 kubenswrapper[33572]: I1204 22:37:54.702800 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" podUID="9408932d-d92d-4d1a-bd67-7e90e89b2341" containerName="dnsmasq-dns" containerID="cri-o://d0b3d68e04df90b1f3a9b46d744450614f696daf560f87b2d61928bad6ef7292" gracePeriod=10 Dec 04 22:37:55.090840 master-0 kubenswrapper[33572]: I1204 22:37:55.090621 33572 generic.go:334] "Generic (PLEG): container finished" podID="9408932d-d92d-4d1a-bd67-7e90e89b2341" containerID="d0b3d68e04df90b1f3a9b46d744450614f696daf560f87b2d61928bad6ef7292" exitCode=0 Dec 04 22:37:55.090840 master-0 kubenswrapper[33572]: I1204 22:37:55.090831 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" event={"ID":"9408932d-d92d-4d1a-bd67-7e90e89b2341","Type":"ContainerDied","Data":"d0b3d68e04df90b1f3a9b46d744450614f696daf560f87b2d61928bad6ef7292"} Dec 04 22:37:55.352367 master-0 kubenswrapper[33572]: I1204 22:37:55.352265 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:55.417157 master-0 kubenswrapper[33572]: I1204 22:37:55.417051 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-nb\") pod \"9408932d-d92d-4d1a-bd67-7e90e89b2341\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " Dec 04 22:37:55.417534 master-0 kubenswrapper[33572]: I1204 22:37:55.417205 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-sb\") pod \"9408932d-d92d-4d1a-bd67-7e90e89b2341\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " Dec 04 22:37:55.417534 master-0 kubenswrapper[33572]: I1204 22:37:55.417337 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn6rm\" (UniqueName: \"kubernetes.io/projected/9408932d-d92d-4d1a-bd67-7e90e89b2341-kube-api-access-rn6rm\") pod \"9408932d-d92d-4d1a-bd67-7e90e89b2341\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " Dec 04 22:37:55.417736 master-0 kubenswrapper[33572]: I1204 22:37:55.417556 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-config\") pod \"9408932d-d92d-4d1a-bd67-7e90e89b2341\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " Dec 04 22:37:55.417736 master-0 kubenswrapper[33572]: I1204 22:37:55.417618 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-swift-storage-0\") pod \"9408932d-d92d-4d1a-bd67-7e90e89b2341\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " Dec 04 22:37:55.417736 master-0 kubenswrapper[33572]: I1204 22:37:55.417669 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-svc\") pod \"9408932d-d92d-4d1a-bd67-7e90e89b2341\" (UID: \"9408932d-d92d-4d1a-bd67-7e90e89b2341\") " Dec 04 22:37:55.439479 master-0 kubenswrapper[33572]: I1204 22:37:55.438715 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9408932d-d92d-4d1a-bd67-7e90e89b2341-kube-api-access-rn6rm" (OuterVolumeSpecName: "kube-api-access-rn6rm") pod "9408932d-d92d-4d1a-bd67-7e90e89b2341" (UID: "9408932d-d92d-4d1a-bd67-7e90e89b2341"). InnerVolumeSpecName "kube-api-access-rn6rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:55.479227 master-0 kubenswrapper[33572]: I1204 22:37:55.479168 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-config" (OuterVolumeSpecName: "config") pod "9408932d-d92d-4d1a-bd67-7e90e89b2341" (UID: "9408932d-d92d-4d1a-bd67-7e90e89b2341"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:55.481600 master-0 kubenswrapper[33572]: I1204 22:37:55.481536 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9408932d-d92d-4d1a-bd67-7e90e89b2341" (UID: "9408932d-d92d-4d1a-bd67-7e90e89b2341"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:55.492711 master-0 kubenswrapper[33572]: I1204 22:37:55.492149 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9408932d-d92d-4d1a-bd67-7e90e89b2341" (UID: "9408932d-d92d-4d1a-bd67-7e90e89b2341"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:55.500793 master-0 kubenswrapper[33572]: I1204 22:37:55.500748 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9408932d-d92d-4d1a-bd67-7e90e89b2341" (UID: "9408932d-d92d-4d1a-bd67-7e90e89b2341"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:55.501682 master-0 kubenswrapper[33572]: I1204 22:37:55.501630 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9408932d-d92d-4d1a-bd67-7e90e89b2341" (UID: "9408932d-d92d-4d1a-bd67-7e90e89b2341"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:37:55.520379 master-0 kubenswrapper[33572]: I1204 22:37:55.520312 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn6rm\" (UniqueName: \"kubernetes.io/projected/9408932d-d92d-4d1a-bd67-7e90e89b2341-kube-api-access-rn6rm\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.520379 master-0 kubenswrapper[33572]: I1204 22:37:55.520358 33572 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.520379 master-0 kubenswrapper[33572]: I1204 22:37:55.520368 33572 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.520379 master-0 kubenswrapper[33572]: I1204 22:37:55.520378 33572 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.520379 master-0 kubenswrapper[33572]: I1204 22:37:55.520387 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.520379 master-0 kubenswrapper[33572]: I1204 22:37:55.520395 33572 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9408932d-d92d-4d1a-bd67-7e90e89b2341-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.558298 master-0 kubenswrapper[33572]: I1204 22:37:55.558257 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:55.622412 master-0 kubenswrapper[33572]: I1204 22:37:55.622264 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbz2r\" (UniqueName: \"kubernetes.io/projected/e36abc43-b5d9-41be-b4f3-c90c72b43065-kube-api-access-wbz2r\") pod \"e36abc43-b5d9-41be-b4f3-c90c72b43065\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " Dec 04 22:37:55.622412 master-0 kubenswrapper[33572]: I1204 22:37:55.622388 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-combined-ca-bundle\") pod \"e36abc43-b5d9-41be-b4f3-c90c72b43065\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " Dec 04 22:37:55.622679 master-0 kubenswrapper[33572]: I1204 22:37:55.622444 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-config-data\") pod \"e36abc43-b5d9-41be-b4f3-c90c72b43065\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " Dec 04 22:37:55.622679 master-0 kubenswrapper[33572]: I1204 22:37:55.622581 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-scripts\") pod \"e36abc43-b5d9-41be-b4f3-c90c72b43065\" (UID: \"e36abc43-b5d9-41be-b4f3-c90c72b43065\") " Dec 04 22:37:55.626805 master-0 kubenswrapper[33572]: I1204 22:37:55.626715 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36abc43-b5d9-41be-b4f3-c90c72b43065-kube-api-access-wbz2r" (OuterVolumeSpecName: "kube-api-access-wbz2r") pod "e36abc43-b5d9-41be-b4f3-c90c72b43065" (UID: "e36abc43-b5d9-41be-b4f3-c90c72b43065"). InnerVolumeSpecName "kube-api-access-wbz2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:55.631368 master-0 kubenswrapper[33572]: I1204 22:37:55.631305 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-scripts" (OuterVolumeSpecName: "scripts") pod "e36abc43-b5d9-41be-b4f3-c90c72b43065" (UID: "e36abc43-b5d9-41be-b4f3-c90c72b43065"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:55.670944 master-0 kubenswrapper[33572]: I1204 22:37:55.669673 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-config-data" (OuterVolumeSpecName: "config-data") pod "e36abc43-b5d9-41be-b4f3-c90c72b43065" (UID: "e36abc43-b5d9-41be-b4f3-c90c72b43065"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:55.675132 master-0 kubenswrapper[33572]: I1204 22:37:55.675041 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e36abc43-b5d9-41be-b4f3-c90c72b43065" (UID: "e36abc43-b5d9-41be-b4f3-c90c72b43065"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:55.725850 master-0 kubenswrapper[33572]: I1204 22:37:55.725775 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.725850 master-0 kubenswrapper[33572]: I1204 22:37:55.725837 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.725850 master-0 kubenswrapper[33572]: I1204 22:37:55.725856 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbz2r\" (UniqueName: \"kubernetes.io/projected/e36abc43-b5d9-41be-b4f3-c90c72b43065-kube-api-access-wbz2r\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:55.725850 master-0 kubenswrapper[33572]: I1204 22:37:55.725872 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36abc43-b5d9-41be-b4f3-c90c72b43065-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:56.105571 master-0 kubenswrapper[33572]: I1204 22:37:56.105475 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-j8m8x" event={"ID":"e36abc43-b5d9-41be-b4f3-c90c72b43065","Type":"ContainerDied","Data":"871e2f9a787aceeab0a3068afc35f6d12bd094376ab79a1ec2fb441e094a1143"} Dec 04 22:37:56.105571 master-0 kubenswrapper[33572]: I1204 22:37:56.105554 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="871e2f9a787aceeab0a3068afc35f6d12bd094376ab79a1ec2fb441e094a1143" Dec 04 22:37:56.105571 master-0 kubenswrapper[33572]: I1204 22:37:56.105567 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-j8m8x" Dec 04 22:37:56.109579 master-0 kubenswrapper[33572]: I1204 22:37:56.109203 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" event={"ID":"9408932d-d92d-4d1a-bd67-7e90e89b2341","Type":"ContainerDied","Data":"bf58895a768d015404e0c772730c96332b3cdeb3db19686570083728c2687ea1"} Dec 04 22:37:56.109579 master-0 kubenswrapper[33572]: I1204 22:37:56.109261 33572 scope.go:117] "RemoveContainer" containerID="d0b3d68e04df90b1f3a9b46d744450614f696daf560f87b2d61928bad6ef7292" Dec 04 22:37:56.109579 master-0 kubenswrapper[33572]: I1204 22:37:56.109395 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b46df4b9-2z4nn" Dec 04 22:37:56.152074 master-0 kubenswrapper[33572]: I1204 22:37:56.151982 33572 scope.go:117] "RemoveContainer" containerID="4a4a1e62d0ca6213e68359b71811f72710a8ddceda5840fabaf96f7217606cee" Dec 04 22:37:56.191996 master-0 kubenswrapper[33572]: I1204 22:37:56.191936 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b46df4b9-2z4nn"] Dec 04 22:37:56.206395 master-0 kubenswrapper[33572]: I1204 22:37:56.206330 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79b46df4b9-2z4nn"] Dec 04 22:37:56.537972 master-0 kubenswrapper[33572]: I1204 22:37:56.537916 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9408932d-d92d-4d1a-bd67-7e90e89b2341" path="/var/lib/kubelet/pods/9408932d-d92d-4d1a-bd67-7e90e89b2341/volumes" Dec 04 22:37:57.124238 master-0 kubenswrapper[33572]: I1204 22:37:57.124158 33572 generic.go:334] "Generic (PLEG): container finished" podID="6fe4a826-dcad-4726-9545-a2f208036313" containerID="e1e47bf8f98f63f3437bfb5a44c506f051d1792be8fc6c07e683ee32ef14498f" exitCode=0 Dec 04 22:37:57.125062 master-0 kubenswrapper[33572]: I1204 22:37:57.124590 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ks44t" event={"ID":"6fe4a826-dcad-4726-9545-a2f208036313","Type":"ContainerDied","Data":"e1e47bf8f98f63f3437bfb5a44c506f051d1792be8fc6c07e683ee32ef14498f"} Dec 04 22:37:58.603637 master-0 kubenswrapper[33572]: I1204 22:37:58.603426 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:58.710289 master-0 kubenswrapper[33572]: I1204 22:37:58.710224 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-config-data\") pod \"6fe4a826-dcad-4726-9545-a2f208036313\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " Dec 04 22:37:58.710590 master-0 kubenswrapper[33572]: I1204 22:37:58.710569 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-combined-ca-bundle\") pod \"6fe4a826-dcad-4726-9545-a2f208036313\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " Dec 04 22:37:58.710670 master-0 kubenswrapper[33572]: I1204 22:37:58.710651 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rc6z\" (UniqueName: \"kubernetes.io/projected/6fe4a826-dcad-4726-9545-a2f208036313-kube-api-access-6rc6z\") pod \"6fe4a826-dcad-4726-9545-a2f208036313\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " Dec 04 22:37:58.710804 master-0 kubenswrapper[33572]: I1204 22:37:58.710756 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-scripts\") pod \"6fe4a826-dcad-4726-9545-a2f208036313\" (UID: \"6fe4a826-dcad-4726-9545-a2f208036313\") " Dec 04 22:37:58.714417 master-0 kubenswrapper[33572]: I1204 22:37:58.714333 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe4a826-dcad-4726-9545-a2f208036313-kube-api-access-6rc6z" (OuterVolumeSpecName: "kube-api-access-6rc6z") pod "6fe4a826-dcad-4726-9545-a2f208036313" (UID: "6fe4a826-dcad-4726-9545-a2f208036313"). 
InnerVolumeSpecName "kube-api-access-6rc6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:37:58.715394 master-0 kubenswrapper[33572]: I1204 22:37:58.715347 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-scripts" (OuterVolumeSpecName: "scripts") pod "6fe4a826-dcad-4726-9545-a2f208036313" (UID: "6fe4a826-dcad-4726-9545-a2f208036313"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:58.742419 master-0 kubenswrapper[33572]: I1204 22:37:58.742377 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fe4a826-dcad-4726-9545-a2f208036313" (UID: "6fe4a826-dcad-4726-9545-a2f208036313"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:58.757777 master-0 kubenswrapper[33572]: I1204 22:37:58.757677 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-config-data" (OuterVolumeSpecName: "config-data") pod "6fe4a826-dcad-4726-9545-a2f208036313" (UID: "6fe4a826-dcad-4726-9545-a2f208036313"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:37:58.814736 master-0 kubenswrapper[33572]: I1204 22:37:58.814675 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:58.814736 master-0 kubenswrapper[33572]: I1204 22:37:58.814731 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rc6z\" (UniqueName: \"kubernetes.io/projected/6fe4a826-dcad-4726-9545-a2f208036313-kube-api-access-6rc6z\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:58.814994 master-0 kubenswrapper[33572]: I1204 22:37:58.814750 33572 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-scripts\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:58.814994 master-0 kubenswrapper[33572]: I1204 22:37:58.814762 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe4a826-dcad-4726-9545-a2f208036313-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:37:59.166316 master-0 kubenswrapper[33572]: I1204 22:37:59.166257 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ks44t" event={"ID":"6fe4a826-dcad-4726-9545-a2f208036313","Type":"ContainerDied","Data":"36c73a49088308d881bf257e1b0f327e084d604a9878b0b4ca248ca8fc0dd4ed"} Dec 04 22:37:59.166316 master-0 kubenswrapper[33572]: I1204 22:37:59.166314 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36c73a49088308d881bf257e1b0f327e084d604a9878b0b4ca248ca8fc0dd4ed" Dec 04 22:37:59.166606 master-0 kubenswrapper[33572]: I1204 22:37:59.166406 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ks44t" Dec 04 22:37:59.375296 master-0 kubenswrapper[33572]: I1204 22:37:59.375227 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:37:59.375605 master-0 kubenswrapper[33572]: I1204 22:37:59.375565 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerName="nova-api-log" containerID="cri-o://f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda" gracePeriod=30 Dec 04 22:37:59.375671 master-0 kubenswrapper[33572]: I1204 22:37:59.375609 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerName="nova-api-api" containerID="cri-o://0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588" gracePeriod=30 Dec 04 22:37:59.397655 master-0 kubenswrapper[33572]: I1204 22:37:59.397588 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:37:59.397897 master-0 kubenswrapper[33572]: I1204 22:37:59.397849 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bc905e0f-4124-457f-b7e8-9f22528f6b2e" containerName="nova-scheduler-scheduler" containerID="cri-o://d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b" gracePeriod=30 Dec 04 22:37:59.438727 master-0 kubenswrapper[33572]: I1204 22:37:59.438564 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:37:59.438959 master-0 kubenswrapper[33572]: I1204 22:37:59.438882 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-log" containerID="cri-o://847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a" gracePeriod=30 Dec 04 22:37:59.439087 master-0 kubenswrapper[33572]: I1204 22:37:59.439040 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-metadata" containerID="cri-o://723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6" gracePeriod=30 Dec 04 22:38:00.079086 master-0 kubenswrapper[33572]: I1204 22:38:00.078111 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:38:00.148129 master-0 kubenswrapper[33572]: I1204 22:38:00.148075 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-logs\") pod \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " Dec 04 22:38:00.148328 master-0 kubenswrapper[33572]: I1204 22:38:00.148177 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-public-tls-certs\") pod \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " Dec 04 22:38:00.148328 master-0 kubenswrapper[33572]: I1204 22:38:00.148319 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-config-data\") pod \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " Dec 04 22:38:00.148398 master-0 kubenswrapper[33572]: I1204 22:38:00.148347 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-combined-ca-bundle\") pod \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " Dec 04 22:38:00.148472 master-0 kubenswrapper[33572]: I1204 22:38:00.148446 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6xg9\" (UniqueName: \"kubernetes.io/projected/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-kube-api-access-n6xg9\") pod \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " Dec 04 22:38:00.148586 master-0 kubenswrapper[33572]: I1204 22:38:00.148561 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-internal-tls-certs\") pod \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\" (UID: \"10a6824c-42b7-4b24-b2dc-97e2bd56e59b\") " Dec 04 22:38:00.148802 master-0 kubenswrapper[33572]: I1204 22:38:00.148713 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-logs" (OuterVolumeSpecName: "logs") pod "10a6824c-42b7-4b24-b2dc-97e2bd56e59b" (UID: "10a6824c-42b7-4b24-b2dc-97e2bd56e59b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:38:00.149259 master-0 kubenswrapper[33572]: I1204 22:38:00.149229 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:00.158998 master-0 kubenswrapper[33572]: I1204 22:38:00.158942 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-kube-api-access-n6xg9" (OuterVolumeSpecName: "kube-api-access-n6xg9") pod "10a6824c-42b7-4b24-b2dc-97e2bd56e59b" (UID: "10a6824c-42b7-4b24-b2dc-97e2bd56e59b"). InnerVolumeSpecName "kube-api-access-n6xg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:38:00.182923 master-0 kubenswrapper[33572]: I1204 22:38:00.182859 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10a6824c-42b7-4b24-b2dc-97e2bd56e59b" (UID: "10a6824c-42b7-4b24-b2dc-97e2bd56e59b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:00.194172 master-0 kubenswrapper[33572]: I1204 22:38:00.193411 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-config-data" (OuterVolumeSpecName: "config-data") pod "10a6824c-42b7-4b24-b2dc-97e2bd56e59b" (UID: "10a6824c-42b7-4b24-b2dc-97e2bd56e59b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:00.196467 master-0 kubenswrapper[33572]: I1204 22:38:00.196425 33572 generic.go:334] "Generic (PLEG): container finished" podID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerID="0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588" exitCode=0 Dec 04 22:38:00.196552 master-0 kubenswrapper[33572]: I1204 22:38:00.196468 33572 generic.go:334] "Generic (PLEG): container finished" podID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerID="f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda" exitCode=143 Dec 04 22:38:00.196552 master-0 kubenswrapper[33572]: I1204 22:38:00.196485 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:38:00.196950 master-0 kubenswrapper[33572]: I1204 22:38:00.196613 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a6824c-42b7-4b24-b2dc-97e2bd56e59b","Type":"ContainerDied","Data":"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588"} Dec 04 22:38:00.196950 master-0 kubenswrapper[33572]: I1204 22:38:00.196695 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a6824c-42b7-4b24-b2dc-97e2bd56e59b","Type":"ContainerDied","Data":"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda"} Dec 04 22:38:00.196950 master-0 kubenswrapper[33572]: I1204 22:38:00.196710 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10a6824c-42b7-4b24-b2dc-97e2bd56e59b","Type":"ContainerDied","Data":"8f6e3459bc5a5e582531f221a8ea101bc46ef7b73946699467649a3c97a34dd3"} Dec 04 22:38:00.196950 master-0 kubenswrapper[33572]: I1204 22:38:00.196736 33572 scope.go:117] "RemoveContainer" containerID="0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588" Dec 04 22:38:00.198660 master-0 kubenswrapper[33572]: I1204 22:38:00.198637 33572 generic.go:334] "Generic (PLEG): container finished" podID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerID="847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a" exitCode=143 Dec 04 22:38:00.198787 master-0 kubenswrapper[33572]: I1204 22:38:00.198670 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71c4600b-ad67-4111-bfae-0bc1eaad685c","Type":"ContainerDied","Data":"847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a"} Dec 04 22:38:00.218040 master-0 kubenswrapper[33572]: I1204 22:38:00.217981 33572 scope.go:117] "RemoveContainer" 
containerID="f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda" Dec 04 22:38:00.234563 master-0 kubenswrapper[33572]: I1204 22:38:00.234522 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "10a6824c-42b7-4b24-b2dc-97e2bd56e59b" (UID: "10a6824c-42b7-4b24-b2dc-97e2bd56e59b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:00.251217 master-0 kubenswrapper[33572]: I1204 22:38:00.251165 33572 scope.go:117] "RemoveContainer" containerID="0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588" Dec 04 22:38:00.251717 master-0 kubenswrapper[33572]: E1204 22:38:00.251674 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588\": container with ID starting with 0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588 not found: ID does not exist" containerID="0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588" Dec 04 22:38:00.251777 master-0 kubenswrapper[33572]: I1204 22:38:00.251722 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588"} err="failed to get container status \"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588\": rpc error: code = NotFound desc = could not find container \"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588\": container with ID starting with 0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588 not found: ID does not exist" Dec 04 22:38:00.251777 master-0 kubenswrapper[33572]: I1204 22:38:00.251750 33572 scope.go:117] "RemoveContainer" containerID="f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda" Dec 04 22:38:00.252436 master-0 kubenswrapper[33572]: I1204 22:38:00.252404 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6xg9\" (UniqueName: \"kubernetes.io/projected/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-kube-api-access-n6xg9\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:00.252563 master-0 kubenswrapper[33572]: I1204 22:38:00.252444 33572 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:00.252604 master-0 kubenswrapper[33572]: I1204 22:38:00.252572 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:00.252604 master-0 kubenswrapper[33572]: I1204 22:38:00.252592 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:00.259989 master-0 kubenswrapper[33572]: E1204 22:38:00.259950 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda\": container with ID starting with f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda not 
found: ID does not exist" containerID="f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda" Dec 04 22:38:00.260058 master-0 kubenswrapper[33572]: I1204 22:38:00.259996 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda"} err="failed to get container status \"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda\": rpc error: code = NotFound desc = could not find container \"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda\": container with ID starting with f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda not found: ID does not exist" Dec 04 22:38:00.260058 master-0 kubenswrapper[33572]: I1204 22:38:00.260026 33572 scope.go:117] "RemoveContainer" containerID="0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588" Dec 04 22:38:00.260545 master-0 kubenswrapper[33572]: I1204 22:38:00.260479 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588"} err="failed to get container status \"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588\": rpc error: code = NotFound desc = could not find container \"0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588\": container with ID starting with 0f26ab3ff202095443d6cd174ff3dc77912c6dd70423b0f5be2b29ee9e0f9588 not found: ID does not exist" Dec 04 22:38:00.260742 master-0 kubenswrapper[33572]: I1204 22:38:00.260552 33572 scope.go:117] "RemoveContainer" containerID="f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda" Dec 04 22:38:00.261071 master-0 kubenswrapper[33572]: I1204 22:38:00.261027 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda"} err="failed to get container status \"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda\": rpc error: code = NotFound desc = could not find container \"f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda\": container with ID starting with f169d1509de3437d0b621b50c15151b7099839d23beb6d7b202874e4036b4dda not found: ID does not exist" Dec 04 22:38:00.261389 master-0 kubenswrapper[33572]: I1204 22:38:00.261336 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "10a6824c-42b7-4b24-b2dc-97e2bd56e59b" (UID: "10a6824c-42b7-4b24-b2dc-97e2bd56e59b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:00.355640 master-0 kubenswrapper[33572]: I1204 22:38:00.355556 33572 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10a6824c-42b7-4b24-b2dc-97e2bd56e59b-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:00.553232 master-0 kubenswrapper[33572]: I1204 22:38:00.553120 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:38:00.567490 master-0 kubenswrapper[33572]: I1204 22:38:00.567414 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:38:00.608286 master-0 kubenswrapper[33572]: I1204 22:38:00.608190 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 22:38:00.608874 master-0 kubenswrapper[33572]: E1204 22:38:00.608812 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36abc43-b5d9-41be-b4f3-c90c72b43065" containerName="nova-manage" Dec 04 22:38:00.608874 master-0 kubenswrapper[33572]: I1204 22:38:00.608847 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36abc43-b5d9-41be-b4f3-c90c72b43065" containerName="nova-manage" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: E1204 22:38:00.608888 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerName="nova-api-log" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.608899 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerName="nova-api-log" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: E1204 22:38:00.608932 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9408932d-d92d-4d1a-bd67-7e90e89b2341" containerName="dnsmasq-dns" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.608942 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9408932d-d92d-4d1a-bd67-7e90e89b2341" containerName="dnsmasq-dns" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: E1204 22:38:00.608964 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9408932d-d92d-4d1a-bd67-7e90e89b2341" containerName="init" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.608973 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9408932d-d92d-4d1a-bd67-7e90e89b2341" containerName="init" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: E1204 22:38:00.609003 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerName="nova-api-api" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.609012 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerName="nova-api-api" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: E1204 22:38:00.609041 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe4a826-dcad-4726-9545-a2f208036313" containerName="nova-manage" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.609049 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe4a826-dcad-4726-9545-a2f208036313" containerName="nova-manage" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.609402 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerName="nova-api-api" Dec 04 22:38:00.609850 
master-0 kubenswrapper[33572]: I1204 22:38:00.609437 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36abc43-b5d9-41be-b4f3-c90c72b43065" containerName="nova-manage" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.609466 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" containerName="nova-api-log" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.609488 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe4a826-dcad-4726-9545-a2f208036313" containerName="nova-manage" Dec 04 22:38:00.609850 master-0 kubenswrapper[33572]: I1204 22:38:00.609529 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9408932d-d92d-4d1a-bd67-7e90e89b2341" containerName="dnsmasq-dns" Dec 04 22:38:00.611546 master-0 kubenswrapper[33572]: I1204 22:38:00.611207 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:38:00.616347 master-0 kubenswrapper[33572]: I1204 22:38:00.614109 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 22:38:00.616347 master-0 kubenswrapper[33572]: I1204 22:38:00.614565 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 22:38:00.616347 master-0 kubenswrapper[33572]: I1204 22:38:00.615078 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 22:38:00.625130 master-0 kubenswrapper[33572]: I1204 22:38:00.625050 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:38:00.773205 master-0 kubenswrapper[33572]: I1204 22:38:00.773142 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-public-tls-certs\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.773781 master-0 kubenswrapper[33572]: I1204 22:38:00.773707 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382c705e-e524-4ac9-bdc0-860d7b670518-logs\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.773855 master-0 kubenswrapper[33572]: I1204 22:38:00.773794 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57z8k\" (UniqueName: \"kubernetes.io/projected/382c705e-e524-4ac9-bdc0-860d7b670518-kube-api-access-57z8k\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.774002 master-0 kubenswrapper[33572]: I1204 22:38:00.773965 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.774089 master-0 kubenswrapper[33572]: I1204 22:38:00.774054 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.774278 master-0 kubenswrapper[33572]: I1204 22:38:00.774246 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-config-data\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.863215 master-0 kubenswrapper[33572]: E1204 22:38:00.863131 33572 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 22:38:00.864891 master-0 kubenswrapper[33572]: E1204 22:38:00.864805 33572 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 22:38:00.866169 master-0 kubenswrapper[33572]: E1204 22:38:00.866124 33572 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 22:38:00.866169 master-0 kubenswrapper[33572]: E1204 22:38:00.866162 33572 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bc905e0f-4124-457f-b7e8-9f22528f6b2e" containerName="nova-scheduler-scheduler" Dec 04 22:38:00.877168 master-0 kubenswrapper[33572]: I1204 22:38:00.877086 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-config-data\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.877348 master-0 kubenswrapper[33572]: I1204 22:38:00.877247 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-public-tls-certs\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.877479 master-0 kubenswrapper[33572]: I1204 22:38:00.877439 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382c705e-e524-4ac9-bdc0-860d7b670518-logs\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.877557 master-0 kubenswrapper[33572]: I1204 22:38:00.877486 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57z8k\" (UniqueName: \"kubernetes.io/projected/382c705e-e524-4ac9-bdc0-860d7b670518-kube-api-access-57z8k\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 
22:38:00.877652 master-0 kubenswrapper[33572]: I1204 22:38:00.877616 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.877705 master-0 kubenswrapper[33572]: I1204 22:38:00.877682 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-internal-tls-certs\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.878169 master-0 kubenswrapper[33572]: I1204 22:38:00.878125 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/382c705e-e524-4ac9-bdc0-860d7b670518-logs\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.881373 master-0 kubenswrapper[33572]: I1204 22:38:00.881331 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-internal-tls-certs\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.882708 master-0 kubenswrapper[33572]: I1204 22:38:00.882654 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.882771 master-0 kubenswrapper[33572]: I1204 22:38:00.882708 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-public-tls-certs\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.883336 master-0 kubenswrapper[33572]: I1204 22:38:00.883291 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/382c705e-e524-4ac9-bdc0-860d7b670518-config-data\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.899785 master-0 kubenswrapper[33572]: I1204 22:38:00.899745 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57z8k\" (UniqueName: \"kubernetes.io/projected/382c705e-e524-4ac9-bdc0-860d7b670518-kube-api-access-57z8k\") pod \"nova-api-0\" (UID: \"382c705e-e524-4ac9-bdc0-860d7b670518\") " pod="openstack/nova-api-0" Dec 04 22:38:00.959519 master-0 kubenswrapper[33572]: I1204 22:38:00.959438 33572 util.go:30] "No sandbox for pod can be found. 
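
The three "ExecSync cmd from runtime service failed ... cannot register an exec PID: container is stopping" errors above are the kubelet driving an exec readiness probe against the old nova-scheduler container while CRI-O is already tearing it down, so no new exec session can be registered; the attempt is then surfaced as "Probe errored" and the pod goes on to be deleted normally. Below is a minimal sketch of what such an exec probe looks like on the pod-spec side: only the pgrep command is taken from the log, the timing fields are illustrative assumptions, and this is not the operator's actual manifest (it assumes k8s.io/api is available on the module path).

// probe_sketch.go — hedged sketch of an exec readiness probe matching the
// ExecSync command logged above. The timing values are assumptions.
package main

import (
	"encoding/json"
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	readiness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				// Command copied from the ExecSync log lines: look for a live
				// nova-scheduler process.
				Command: []string{"/usr/bin/pgrep", "-r", "DRST", "nova-scheduler"},
			},
		},
		PeriodSeconds:    5, // assumption
		TimeoutSeconds:   5, // assumption
		FailureThreshold: 3, // assumption
	}
	out, _ := json.MarshalIndent(readiness, "", "  ")
	fmt.Println(string(out))
}
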
Need to start a new one" pod="openstack/nova-api-0" Dec 04 22:38:01.544555 master-0 kubenswrapper[33572]: W1204 22:38:01.541910 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod382c705e_e524_4ac9_bdc0_860d7b670518.slice/crio-e265ec5e04f78cc64ba3dd072bc9a5362d0e534b4c5965d647c7cc8744711349 WatchSource:0}: Error finding container e265ec5e04f78cc64ba3dd072bc9a5362d0e534b4c5965d647c7cc8744711349: Status 404 returned error can't find the container with id e265ec5e04f78cc64ba3dd072bc9a5362d0e534b4c5965d647c7cc8744711349 Dec 04 22:38:01.544555 master-0 kubenswrapper[33572]: I1204 22:38:01.544225 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 22:38:02.238636 master-0 kubenswrapper[33572]: I1204 22:38:02.238566 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382c705e-e524-4ac9-bdc0-860d7b670518","Type":"ContainerStarted","Data":"733f07781f35f0eb647f905dd924c471d99e9ddd88a554c929668770443660ad"} Dec 04 22:38:02.238636 master-0 kubenswrapper[33572]: I1204 22:38:02.238632 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382c705e-e524-4ac9-bdc0-860d7b670518","Type":"ContainerStarted","Data":"17aae3cdbcb1f3a85a69414d10690ae293bcc262db5e0b79dbbb9941b57e67cf"} Dec 04 22:38:02.238916 master-0 kubenswrapper[33572]: I1204 22:38:02.238650 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"382c705e-e524-4ac9-bdc0-860d7b670518","Type":"ContainerStarted","Data":"e265ec5e04f78cc64ba3dd072bc9a5362d0e534b4c5965d647c7cc8744711349"} Dec 04 22:38:02.259778 master-0 kubenswrapper[33572]: I1204 22:38:02.259698 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.259679236 podStartE2EDuration="2.259679236s" podCreationTimestamp="2025-12-04 22:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:38:02.256620332 +0000 UTC m=+1145.984145981" watchObservedRunningTime="2025-12-04 22:38:02.259679236 +0000 UTC m=+1145.987204885" Dec 04 22:38:02.547079 master-0 kubenswrapper[33572]: I1204 22:38:02.547011 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a6824c-42b7-4b24-b2dc-97e2bd56e59b" path="/var/lib/kubelet/pods/10a6824c-42b7-4b24-b2dc-97e2bd56e59b/volumes" Dec 04 22:38:03.251237 master-0 kubenswrapper[33572]: I1204 22:38:03.251173 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:38:03.262362 master-0 kubenswrapper[33572]: I1204 22:38:03.262308 33572 generic.go:334] "Generic (PLEG): container finished" podID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerID="723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6" exitCode=0 Dec 04 22:38:03.263224 master-0 kubenswrapper[33572]: I1204 22:38:03.262364 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:38:03.263224 master-0 kubenswrapper[33572]: I1204 22:38:03.262411 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71c4600b-ad67-4111-bfae-0bc1eaad685c","Type":"ContainerDied","Data":"723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6"} Dec 04 22:38:03.263224 master-0 kubenswrapper[33572]: I1204 22:38:03.262478 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71c4600b-ad67-4111-bfae-0bc1eaad685c","Type":"ContainerDied","Data":"af1d88af79948d5a5a27f891e7cf246ba072d0a915a982170ec74b977d30bb6d"} Dec 04 22:38:03.263224 master-0 kubenswrapper[33572]: I1204 22:38:03.262524 33572 scope.go:117] "RemoveContainer" containerID="723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6" Dec 04 22:38:03.355676 master-0 kubenswrapper[33572]: I1204 22:38:03.353856 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-combined-ca-bundle\") pod \"71c4600b-ad67-4111-bfae-0bc1eaad685c\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " Dec 04 22:38:03.355676 master-0 kubenswrapper[33572]: I1204 22:38:03.353998 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szzsd\" (UniqueName: \"kubernetes.io/projected/71c4600b-ad67-4111-bfae-0bc1eaad685c-kube-api-access-szzsd\") pod \"71c4600b-ad67-4111-bfae-0bc1eaad685c\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " Dec 04 22:38:03.355676 master-0 kubenswrapper[33572]: I1204 22:38:03.354032 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c4600b-ad67-4111-bfae-0bc1eaad685c-logs\") pod \"71c4600b-ad67-4111-bfae-0bc1eaad685c\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " Dec 04 22:38:03.355676 master-0 kubenswrapper[33572]: I1204 22:38:03.354229 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-nova-metadata-tls-certs\") pod \"71c4600b-ad67-4111-bfae-0bc1eaad685c\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " Dec 04 22:38:03.355676 master-0 kubenswrapper[33572]: I1204 22:38:03.354285 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-config-data\") pod \"71c4600b-ad67-4111-bfae-0bc1eaad685c\" (UID: \"71c4600b-ad67-4111-bfae-0bc1eaad685c\") " Dec 04 22:38:03.355676 master-0 kubenswrapper[33572]: I1204 22:38:03.354803 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c4600b-ad67-4111-bfae-0bc1eaad685c-logs" (OuterVolumeSpecName: "logs") pod "71c4600b-ad67-4111-bfae-0bc1eaad685c" (UID: "71c4600b-ad67-4111-bfae-0bc1eaad685c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:38:03.356958 master-0 kubenswrapper[33572]: I1204 22:38:03.356121 33572 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71c4600b-ad67-4111-bfae-0bc1eaad685c-logs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:03.358374 master-0 kubenswrapper[33572]: I1204 22:38:03.358318 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c4600b-ad67-4111-bfae-0bc1eaad685c-kube-api-access-szzsd" (OuterVolumeSpecName: "kube-api-access-szzsd") pod "71c4600b-ad67-4111-bfae-0bc1eaad685c" (UID: "71c4600b-ad67-4111-bfae-0bc1eaad685c"). InnerVolumeSpecName "kube-api-access-szzsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:38:03.362874 master-0 kubenswrapper[33572]: I1204 22:38:03.362814 33572 scope.go:117] "RemoveContainer" containerID="847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a" Dec 04 22:38:03.386187 master-0 kubenswrapper[33572]: I1204 22:38:03.386047 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-config-data" (OuterVolumeSpecName: "config-data") pod "71c4600b-ad67-4111-bfae-0bc1eaad685c" (UID: "71c4600b-ad67-4111-bfae-0bc1eaad685c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:03.402254 master-0 kubenswrapper[33572]: I1204 22:38:03.402167 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71c4600b-ad67-4111-bfae-0bc1eaad685c" (UID: "71c4600b-ad67-4111-bfae-0bc1eaad685c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:03.465123 master-0 kubenswrapper[33572]: I1204 22:38:03.465047 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:03.465123 master-0 kubenswrapper[33572]: I1204 22:38:03.465113 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:03.465376 master-0 kubenswrapper[33572]: I1204 22:38:03.465133 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szzsd\" (UniqueName: \"kubernetes.io/projected/71c4600b-ad67-4111-bfae-0bc1eaad685c-kube-api-access-szzsd\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:03.470730 master-0 kubenswrapper[33572]: I1204 22:38:03.470690 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "71c4600b-ad67-4111-bfae-0bc1eaad685c" (UID: "71c4600b-ad67-4111-bfae-0bc1eaad685c"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:03.481636 master-0 kubenswrapper[33572]: I1204 22:38:03.481590 33572 scope.go:117] "RemoveContainer" containerID="723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6" Dec 04 22:38:03.483692 master-0 kubenswrapper[33572]: E1204 22:38:03.483650 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6\": container with ID starting with 723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6 not found: ID does not exist" containerID="723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6" Dec 04 22:38:03.483749 master-0 kubenswrapper[33572]: I1204 22:38:03.483699 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6"} err="failed to get container status \"723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6\": rpc error: code = NotFound desc = could not find container \"723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6\": container with ID starting with 723843c3151c36dbcc7698d234e8fbc3fd5f4072f4de2ef684c1c68ddebbeda6 not found: ID does not exist" Dec 04 22:38:03.483749 master-0 kubenswrapper[33572]: I1204 22:38:03.483726 33572 scope.go:117] "RemoveContainer" containerID="847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a" Dec 04 22:38:03.484100 master-0 kubenswrapper[33572]: E1204 22:38:03.484045 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a\": container with ID starting with 847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a not found: ID does not exist" containerID="847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a" Dec 04 22:38:03.484148 master-0 kubenswrapper[33572]: I1204 22:38:03.484111 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a"} err="failed to get container status \"847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a\": rpc error: code = NotFound desc = could not find container \"847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a\": container with ID starting with 847c62c0b88304366b7e8698340242bb445959c3c34d49878ef5385bae05472a not found: ID does not exist" Dec 04 22:38:03.566999 master-0 kubenswrapper[33572]: I1204 22:38:03.566945 33572 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71c4600b-ad67-4111-bfae-0bc1eaad685c-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:03.611818 master-0 kubenswrapper[33572]: I1204 22:38:03.611676 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:38:03.640798 master-0 kubenswrapper[33572]: I1204 22:38:03.640635 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:38:03.687363 master-0 kubenswrapper[33572]: I1204 22:38:03.686404 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:38:03.690869 master-0 kubenswrapper[33572]: E1204 22:38:03.687641 33572 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-metadata" Dec 04 22:38:03.690869 master-0 kubenswrapper[33572]: I1204 22:38:03.687676 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-metadata" Dec 04 22:38:03.690869 master-0 kubenswrapper[33572]: E1204 22:38:03.687724 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-log" Dec 04 22:38:03.690869 master-0 kubenswrapper[33572]: I1204 22:38:03.687738 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-log" Dec 04 22:38:03.690869 master-0 kubenswrapper[33572]: I1204 22:38:03.688279 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-metadata" Dec 04 22:38:03.690869 master-0 kubenswrapper[33572]: I1204 22:38:03.688335 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-log" Dec 04 22:38:03.690869 master-0 kubenswrapper[33572]: I1204 22:38:03.690727 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:38:03.698118 master-0 kubenswrapper[33572]: I1204 22:38:03.698084 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:38:03.700727 master-0 kubenswrapper[33572]: I1204 22:38:03.700616 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 22:38:03.700833 master-0 kubenswrapper[33572]: I1204 22:38:03.700813 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 22:38:03.774455 master-0 kubenswrapper[33572]: I1204 22:38:03.774396 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-config-data\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.774455 master-0 kubenswrapper[33572]: I1204 22:38:03.774447 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.774734 master-0 kubenswrapper[33572]: I1204 22:38:03.774476 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-logs\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.774734 master-0 kubenswrapper[33572]: I1204 22:38:03.774565 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjq4h\" (UniqueName: \"kubernetes.io/projected/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-kube-api-access-xjq4h\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.774734 master-0 kubenswrapper[33572]: I1204 22:38:03.774592 33572 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.879446 master-0 kubenswrapper[33572]: I1204 22:38:03.879392 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-config-data\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.879446 master-0 kubenswrapper[33572]: I1204 22:38:03.879451 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.879826 master-0 kubenswrapper[33572]: I1204 22:38:03.879775 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-logs\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.880206 master-0 kubenswrapper[33572]: I1204 22:38:03.880139 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjq4h\" (UniqueName: \"kubernetes.io/projected/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-kube-api-access-xjq4h\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.880290 master-0 kubenswrapper[33572]: I1204 22:38:03.880266 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.880340 master-0 kubenswrapper[33572]: I1204 22:38:03.880319 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-logs\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.883442 master-0 kubenswrapper[33572]: I1204 22:38:03.883411 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.884090 master-0 kubenswrapper[33572]: I1204 22:38:03.884045 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-config-data\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.884319 master-0 kubenswrapper[33572]: I1204 22:38:03.884259 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:03.896666 master-0 kubenswrapper[33572]: I1204 22:38:03.896562 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjq4h\" (UniqueName: \"kubernetes.io/projected/de28bdd4-fe9d-42ab-9255-b178f2c8abd9-kube-api-access-xjq4h\") pod \"nova-metadata-0\" (UID: \"de28bdd4-fe9d-42ab-9255-b178f2c8abd9\") " pod="openstack/nova-metadata-0" Dec 04 22:38:04.060944 master-0 kubenswrapper[33572]: I1204 22:38:04.060880 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 22:38:04.601483 master-0 kubenswrapper[33572]: I1204 22:38:04.601423 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" path="/var/lib/kubelet/pods/71c4600b-ad67-4111-bfae-0bc1eaad685c/volumes" Dec 04 22:38:04.681872 master-0 kubenswrapper[33572]: I1204 22:38:04.681788 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 22:38:04.682493 master-0 kubenswrapper[33572]: W1204 22:38:04.682442 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde28bdd4_fe9d_42ab_9255_b178f2c8abd9.slice/crio-c1c05d32ee69fa47ab2525f01b7014c50eccbd5d192487cce07f29d54e0f81d3 WatchSource:0}: Error finding container c1c05d32ee69fa47ab2525f01b7014c50eccbd5d192487cce07f29d54e0f81d3: Status 404 returned error can't find the container with id c1c05d32ee69fa47ab2525f01b7014c50eccbd5d192487cce07f29d54e0f81d3 Dec 04 22:38:05.117313 master-0 kubenswrapper[33572]: I1204 22:38:05.116476 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:38:05.221872 master-0 kubenswrapper[33572]: I1204 22:38:05.221459 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfpxz\" (UniqueName: \"kubernetes.io/projected/bc905e0f-4124-457f-b7e8-9f22528f6b2e-kube-api-access-pfpxz\") pod \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " Dec 04 22:38:05.221872 master-0 kubenswrapper[33572]: I1204 22:38:05.221751 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-combined-ca-bundle\") pod \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " Dec 04 22:38:05.222143 master-0 kubenswrapper[33572]: I1204 22:38:05.221994 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-config-data\") pod \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\" (UID: \"bc905e0f-4124-457f-b7e8-9f22528f6b2e\") " Dec 04 22:38:05.227558 master-0 kubenswrapper[33572]: I1204 22:38:05.227482 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc905e0f-4124-457f-b7e8-9f22528f6b2e-kube-api-access-pfpxz" (OuterVolumeSpecName: "kube-api-access-pfpxz") pod "bc905e0f-4124-457f-b7e8-9f22528f6b2e" (UID: "bc905e0f-4124-457f-b7e8-9f22528f6b2e"). InnerVolumeSpecName "kube-api-access-pfpxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:38:05.270720 master-0 kubenswrapper[33572]: I1204 22:38:05.270585 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-config-data" (OuterVolumeSpecName: "config-data") pod "bc905e0f-4124-457f-b7e8-9f22528f6b2e" (UID: "bc905e0f-4124-457f-b7e8-9f22528f6b2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:05.292587 master-0 kubenswrapper[33572]: I1204 22:38:05.292460 33572 generic.go:334] "Generic (PLEG): container finished" podID="bc905e0f-4124-457f-b7e8-9f22528f6b2e" containerID="d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b" exitCode=0 Dec 04 22:38:05.292587 master-0 kubenswrapper[33572]: I1204 22:38:05.292564 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:38:05.292935 master-0 kubenswrapper[33572]: I1204 22:38:05.292549 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc905e0f-4124-457f-b7e8-9f22528f6b2e","Type":"ContainerDied","Data":"d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b"} Dec 04 22:38:05.292935 master-0 kubenswrapper[33572]: I1204 22:38:05.292679 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bc905e0f-4124-457f-b7e8-9f22528f6b2e","Type":"ContainerDied","Data":"ae95a556cc81f8433c7a56393fb94bbb30d5db0333192506d8253fffdd187901"} Dec 04 22:38:05.292935 master-0 kubenswrapper[33572]: I1204 22:38:05.292706 33572 scope.go:117] "RemoveContainer" containerID="d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b" Dec 04 22:38:05.293674 master-0 kubenswrapper[33572]: I1204 22:38:05.293614 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc905e0f-4124-457f-b7e8-9f22528f6b2e" (UID: "bc905e0f-4124-457f-b7e8-9f22528f6b2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:05.301810 master-0 kubenswrapper[33572]: I1204 22:38:05.295595 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de28bdd4-fe9d-42ab-9255-b178f2c8abd9","Type":"ContainerStarted","Data":"f11a744caedf11044e2815dfb028ec36ec99fbd7c553ae880a34fcab840e0e4c"} Dec 04 22:38:05.301810 master-0 kubenswrapper[33572]: I1204 22:38:05.295639 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de28bdd4-fe9d-42ab-9255-b178f2c8abd9","Type":"ContainerStarted","Data":"0aabae359f32f3721ea4f06440447d5ffd0e681a782bc91cdaa001c83ee33160"} Dec 04 22:38:05.301810 master-0 kubenswrapper[33572]: I1204 22:38:05.295653 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"de28bdd4-fe9d-42ab-9255-b178f2c8abd9","Type":"ContainerStarted","Data":"c1c05d32ee69fa47ab2525f01b7014c50eccbd5d192487cce07f29d54e0f81d3"} Dec 04 22:38:05.324325 master-0 kubenswrapper[33572]: I1204 22:38:05.324257 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:05.324325 master-0 kubenswrapper[33572]: I1204 22:38:05.324303 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc905e0f-4124-457f-b7e8-9f22528f6b2e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:05.324325 master-0 kubenswrapper[33572]: I1204 22:38:05.324316 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfpxz\" (UniqueName: \"kubernetes.io/projected/bc905e0f-4124-457f-b7e8-9f22528f6b2e-kube-api-access-pfpxz\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:05.328593 master-0 kubenswrapper[33572]: I1204 22:38:05.328550 33572 scope.go:117] "RemoveContainer" containerID="d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b" Dec 04 22:38:05.330238 master-0 kubenswrapper[33572]: E1204 22:38:05.330187 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b\": container with ID starting with d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b not found: ID does not exist" containerID="d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b" Dec 04 22:38:05.330313 master-0 kubenswrapper[33572]: I1204 22:38:05.330234 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b"} err="failed to get container status \"d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b\": rpc error: code = NotFound desc = could not find container \"d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b\": container with ID starting with d1d656ce393ed53396d58147c35ac3d09d21fa862f77651c16e6e57c5bd8f46b not found: ID does not exist" Dec 04 22:38:05.368832 master-0 kubenswrapper[33572]: I1204 22:38:05.368716 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.368692909 podStartE2EDuration="2.368692909s" podCreationTimestamp="2025-12-04 22:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-04 22:38:05.362408277 +0000 UTC m=+1149.089933926" watchObservedRunningTime="2025-12-04 22:38:05.368692909 +0000 UTC m=+1149.096218558" Dec 04 22:38:05.631211 master-0 kubenswrapper[33572]: I1204 22:38:05.631125 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:38:05.642978 master-0 kubenswrapper[33572]: I1204 22:38:05.642913 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:38:05.678326 master-0 kubenswrapper[33572]: I1204 22:38:05.678254 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:38:05.678930 master-0 kubenswrapper[33572]: E1204 22:38:05.678881 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc905e0f-4124-457f-b7e8-9f22528f6b2e" containerName="nova-scheduler-scheduler" Dec 04 22:38:05.678930 master-0 kubenswrapper[33572]: I1204 22:38:05.678914 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc905e0f-4124-457f-b7e8-9f22528f6b2e" containerName="nova-scheduler-scheduler" Dec 04 22:38:05.679301 master-0 kubenswrapper[33572]: I1204 22:38:05.679242 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc905e0f-4124-457f-b7e8-9f22528f6b2e" containerName="nova-scheduler-scheduler" Dec 04 22:38:05.680235 master-0 kubenswrapper[33572]: I1204 22:38:05.680202 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:38:05.682583 master-0 kubenswrapper[33572]: I1204 22:38:05.682529 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 22:38:05.711901 master-0 kubenswrapper[33572]: I1204 22:38:05.694343 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:38:05.735543 master-0 kubenswrapper[33572]: I1204 22:38:05.735473 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qctv2\" (UniqueName: \"kubernetes.io/projected/f3786f50-f32e-430f-960b-3fabbd9c2597-kube-api-access-qctv2\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:05.735680 master-0 kubenswrapper[33572]: I1204 22:38:05.735634 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3786f50-f32e-430f-960b-3fabbd9c2597-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:05.735817 master-0 kubenswrapper[33572]: I1204 22:38:05.735782 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3786f50-f32e-430f-960b-3fabbd9c2597-config-data\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:05.837618 master-0 kubenswrapper[33572]: I1204 22:38:05.837480 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qctv2\" (UniqueName: \"kubernetes.io/projected/f3786f50-f32e-430f-960b-3fabbd9c2597-kube-api-access-qctv2\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:05.837888 master-0 kubenswrapper[33572]: I1204 22:38:05.837742 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3786f50-f32e-430f-960b-3fabbd9c2597-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:05.837969 master-0 kubenswrapper[33572]: I1204 22:38:05.837914 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3786f50-f32e-430f-960b-3fabbd9c2597-config-data\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:05.841975 master-0 kubenswrapper[33572]: I1204 22:38:05.841932 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3786f50-f32e-430f-960b-3fabbd9c2597-config-data\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:05.843224 master-0 kubenswrapper[33572]: I1204 22:38:05.843169 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3786f50-f32e-430f-960b-3fabbd9c2597-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:05.853582 master-0 kubenswrapper[33572]: I1204 22:38:05.853451 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qctv2\" (UniqueName: \"kubernetes.io/projected/f3786f50-f32e-430f-960b-3fabbd9c2597-kube-api-access-qctv2\") pod \"nova-scheduler-0\" (UID: \"f3786f50-f32e-430f-960b-3fabbd9c2597\") " pod="openstack/nova-scheduler-0" Dec 04 22:38:06.051096 master-0 kubenswrapper[33572]: I1204 22:38:06.051004 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 22:38:06.558092 master-0 kubenswrapper[33572]: I1204 22:38:06.557998 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc905e0f-4124-457f-b7e8-9f22528f6b2e" path="/var/lib/kubelet/pods/bc905e0f-4124-457f-b7e8-9f22528f6b2e/volumes" Dec 04 22:38:06.611446 master-0 kubenswrapper[33572]: I1204 22:38:06.611341 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 22:38:06.611831 master-0 kubenswrapper[33572]: W1204 22:38:06.611743 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3786f50_f32e_430f_960b_3fabbd9c2597.slice/crio-d287be950ca3a86cdfa794d22bdb1a91e8f110cb9472b7a62b6bca11ad33bcaa WatchSource:0}: Error finding container d287be950ca3a86cdfa794d22bdb1a91e8f110cb9472b7a62b6bca11ad33bcaa: Status 404 returned error can't find the container with id d287be950ca3a86cdfa794d22bdb1a91e8f110cb9472b7a62b6bca11ad33bcaa Dec 04 22:38:07.325576 master-0 kubenswrapper[33572]: I1204 22:38:07.325460 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3786f50-f32e-430f-960b-3fabbd9c2597","Type":"ContainerStarted","Data":"e40b14a4a09ac50d8fec57d37c53d71c598ec6ecccda6ecd2b2ab20de167fb05"} Dec 04 22:38:07.326313 master-0 kubenswrapper[33572]: I1204 22:38:07.325583 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f3786f50-f32e-430f-960b-3fabbd9c2597","Type":"ContainerStarted","Data":"d287be950ca3a86cdfa794d22bdb1a91e8f110cb9472b7a62b6bca11ad33bcaa"} Dec 04 22:38:07.376433 master-0 kubenswrapper[33572]: I1204 22:38:07.376240 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.37619965 podStartE2EDuration="2.37619965s" podCreationTimestamp="2025-12-04 22:38:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:38:07.354136588 +0000 UTC m=+1151.081662267" watchObservedRunningTime="2025-12-04 22:38:07.37619965 +0000 UTC m=+1151.103725329" Dec 04 22:38:07.905092 master-0 kubenswrapper[33572]: I1204 22:38:07.904964 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.9:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:38:07.905390 master-0 kubenswrapper[33572]: I1204 22:38:07.905272 33572 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="71c4600b-ad67-4111-bfae-0bc1eaad685c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.9:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 22:38:09.061378 master-0 kubenswrapper[33572]: I1204 22:38:09.061226 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 22:38:09.061378 master-0 kubenswrapper[33572]: I1204 22:38:09.061354 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 22:38:10.959920 master-0 kubenswrapper[33572]: I1204 22:38:10.959879 33572 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 22:38:10.960619 master-0 kubenswrapper[33572]: I1204 22:38:10.960594 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 22:38:11.051991 master-0 kubenswrapper[33572]: I1204 22:38:11.051909 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 22:38:11.978782 master-0 kubenswrapper[33572]: I1204 22:38:11.978642 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="382c705e-e524-4ac9-bdc0-860d7b670518" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.17:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:38:11.978782 master-0 kubenswrapper[33572]: I1204 22:38:11.978642 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="382c705e-e524-4ac9-bdc0-860d7b670518" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.17:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:38:14.061761 master-0 kubenswrapper[33572]: I1204 22:38:14.061656 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 22:38:14.061761 master-0 kubenswrapper[33572]: I1204 22:38:14.061760 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 22:38:15.079764 master-0 kubenswrapper[33572]: I1204 22:38:15.079669 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="de28bdd4-fe9d-42ab-9255-b178f2c8abd9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.18:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:38:15.079764 master-0 kubenswrapper[33572]: I1204 22:38:15.079702 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="de28bdd4-fe9d-42ab-9255-b178f2c8abd9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.18:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 22:38:16.052384 master-0 kubenswrapper[33572]: I1204 22:38:16.052283 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 22:38:16.096666 master-0 kubenswrapper[33572]: I1204 22:38:16.096585 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 22:38:16.563386 master-0 kubenswrapper[33572]: I1204 22:38:16.563312 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 22:38:20.984459 master-0 kubenswrapper[33572]: I1204 22:38:20.984392 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 22:38:20.985040 master-0 kubenswrapper[33572]: I1204 22:38:20.984835 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 22:38:20.985983 master-0 kubenswrapper[33572]: I1204 22:38:20.985928 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 22:38:20.996279 master-0 kubenswrapper[33572]: I1204 22:38:20.996236 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Dec 04 22:38:21.597670 master-0 kubenswrapper[33572]: I1204 22:38:21.597432 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 22:38:21.606154 master-0 kubenswrapper[33572]: I1204 22:38:21.605485 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 22:38:24.067832 master-0 kubenswrapper[33572]: I1204 22:38:24.067753 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 22:38:24.072383 master-0 kubenswrapper[33572]: I1204 22:38:24.072316 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 22:38:24.080798 master-0 kubenswrapper[33572]: I1204 22:38:24.080723 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 22:38:24.663590 master-0 kubenswrapper[33572]: I1204 22:38:24.663279 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 22:38:52.139004 master-0 kubenswrapper[33572]: I1204 22:38:52.127626 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-mwqvk"] Dec 04 22:38:52.139004 master-0 kubenswrapper[33572]: I1204 22:38:52.127880 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" podUID="41a4a33e-5739-4a56-8a8a-98bfd642d564" containerName="sushy-emulator" containerID="cri-o://3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40" gracePeriod=30 Dec 04 22:38:52.854859 master-0 kubenswrapper[33572]: I1204 22:38:52.854775 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:38:52.986061 master-0 kubenswrapper[33572]: I1204 22:38:52.986003 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-64488c485f-tjtp2"] Dec 04 22:38:52.986740 master-0 kubenswrapper[33572]: E1204 22:38:52.986700 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a4a33e-5739-4a56-8a8a-98bfd642d564" containerName="sushy-emulator" Dec 04 22:38:52.986740 master-0 kubenswrapper[33572]: I1204 22:38:52.986726 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a4a33e-5739-4a56-8a8a-98bfd642d564" containerName="sushy-emulator" Dec 04 22:38:52.987191 master-0 kubenswrapper[33572]: I1204 22:38:52.987167 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a4a33e-5739-4a56-8a8a-98bfd642d564" containerName="sushy-emulator" Dec 04 22:38:52.988169 master-0 kubenswrapper[33572]: I1204 22:38:52.988141 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.008419 master-0 kubenswrapper[33572]: I1204 22:38:53.008029 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/41a4a33e-5739-4a56-8a8a-98bfd642d564-os-client-config\") pod \"41a4a33e-5739-4a56-8a8a-98bfd642d564\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " Dec 04 22:38:53.008419 master-0 kubenswrapper[33572]: I1204 22:38:53.008166 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxzdk\" (UniqueName: \"kubernetes.io/projected/41a4a33e-5739-4a56-8a8a-98bfd642d564-kube-api-access-bxzdk\") pod \"41a4a33e-5739-4a56-8a8a-98bfd642d564\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " Dec 04 22:38:53.008419 master-0 kubenswrapper[33572]: I1204 22:38:53.008316 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/41a4a33e-5739-4a56-8a8a-98bfd642d564-sushy-emulator-config\") pod \"41a4a33e-5739-4a56-8a8a-98bfd642d564\" (UID: \"41a4a33e-5739-4a56-8a8a-98bfd642d564\") " Dec 04 22:38:53.010711 master-0 kubenswrapper[33572]: I1204 22:38:53.009225 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41a4a33e-5739-4a56-8a8a-98bfd642d564-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "41a4a33e-5739-4a56-8a8a-98bfd642d564" (UID: "41a4a33e-5739-4a56-8a8a-98bfd642d564"). InnerVolumeSpecName "sushy-emulator-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:38:53.014615 master-0 kubenswrapper[33572]: I1204 22:38:53.014530 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41a4a33e-5739-4a56-8a8a-98bfd642d564-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "41a4a33e-5739-4a56-8a8a-98bfd642d564" (UID: "41a4a33e-5739-4a56-8a8a-98bfd642d564"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:38:53.014955 master-0 kubenswrapper[33572]: I1204 22:38:53.014913 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a4a33e-5739-4a56-8a8a-98bfd642d564-kube-api-access-bxzdk" (OuterVolumeSpecName: "kube-api-access-bxzdk") pod "41a4a33e-5739-4a56-8a8a-98bfd642d564" (UID: "41a4a33e-5739-4a56-8a8a-98bfd642d564"). InnerVolumeSpecName "kube-api-access-bxzdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:38:53.020877 master-0 kubenswrapper[33572]: I1204 22:38:53.020830 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-64488c485f-tjtp2"] Dec 04 22:38:53.112248 master-0 kubenswrapper[33572]: I1204 22:38:53.112135 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/e8117530-0fdd-479a-a481-79bc514bec3d-sushy-emulator-config\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.114272 master-0 kubenswrapper[33572]: I1204 22:38:53.114225 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnrqp\" (UniqueName: \"kubernetes.io/projected/e8117530-0fdd-479a-a481-79bc514bec3d-kube-api-access-gnrqp\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.114612 master-0 kubenswrapper[33572]: I1204 22:38:53.114567 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/e8117530-0fdd-479a-a481-79bc514bec3d-os-client-config\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.114811 master-0 kubenswrapper[33572]: I1204 22:38:53.114782 33572 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/41a4a33e-5739-4a56-8a8a-98bfd642d564-sushy-emulator-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:53.114811 master-0 kubenswrapper[33572]: I1204 22:38:53.114806 33572 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/41a4a33e-5739-4a56-8a8a-98bfd642d564-os-client-config\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:53.114918 master-0 kubenswrapper[33572]: I1204 22:38:53.114821 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxzdk\" (UniqueName: \"kubernetes.io/projected/41a4a33e-5739-4a56-8a8a-98bfd642d564-kube-api-access-bxzdk\") on node \"master-0\" DevicePath \"\"" Dec 04 22:38:53.140411 master-0 kubenswrapper[33572]: I1204 22:38:53.140344 33572 generic.go:334] "Generic (PLEG): container finished" podID="41a4a33e-5739-4a56-8a8a-98bfd642d564" containerID="3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40" exitCode=0 Dec 04 22:38:53.140411 master-0 kubenswrapper[33572]: I1204 22:38:53.140406 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" event={"ID":"41a4a33e-5739-4a56-8a8a-98bfd642d564","Type":"ContainerDied","Data":"3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40"} Dec 04 22:38:53.141006 master-0 kubenswrapper[33572]: I1204 22:38:53.140442 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" event={"ID":"41a4a33e-5739-4a56-8a8a-98bfd642d564","Type":"ContainerDied","Data":"1cd6c3429ca666ee1557a97d5b1f23bbfb9fdbc0b1596b23d564faba211c3ffa"} Dec 04 22:38:53.141006 master-0 kubenswrapper[33572]: I1204 22:38:53.140466 33572 scope.go:117] "RemoveContainer" 
containerID="3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40" Dec 04 22:38:53.141006 master-0 kubenswrapper[33572]: I1204 22:38:53.140525 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-mwqvk" Dec 04 22:38:53.177467 master-0 kubenswrapper[33572]: I1204 22:38:53.177416 33572 scope.go:117] "RemoveContainer" containerID="3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40" Dec 04 22:38:53.178136 master-0 kubenswrapper[33572]: E1204 22:38:53.178088 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40\": container with ID starting with 3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40 not found: ID does not exist" containerID="3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40" Dec 04 22:38:53.178212 master-0 kubenswrapper[33572]: I1204 22:38:53.178139 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40"} err="failed to get container status \"3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40\": rpc error: code = NotFound desc = could not find container \"3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40\": container with ID starting with 3c12ad91d6b691ccd834cc54121c9375b1a18bfae28d69bf8f88b5dc8c8b3e40 not found: ID does not exist" Dec 04 22:38:53.205700 master-0 kubenswrapper[33572]: I1204 22:38:53.205647 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-mwqvk"] Dec 04 22:38:53.218911 master-0 kubenswrapper[33572]: I1204 22:38:53.217804 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/e8117530-0fdd-479a-a481-79bc514bec3d-os-client-config\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.219350 master-0 kubenswrapper[33572]: I1204 22:38:53.218976 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/e8117530-0fdd-479a-a481-79bc514bec3d-sushy-emulator-config\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.219350 master-0 kubenswrapper[33572]: I1204 22:38:53.219222 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnrqp\" (UniqueName: \"kubernetes.io/projected/e8117530-0fdd-479a-a481-79bc514bec3d-kube-api-access-gnrqp\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.220818 master-0 kubenswrapper[33572]: I1204 22:38:53.220766 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/e8117530-0fdd-479a-a481-79bc514bec3d-sushy-emulator-config\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.226363 master-0 kubenswrapper[33572]: I1204 22:38:53.226293 33572 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-mwqvk"] Dec 04 22:38:53.227643 master-0 kubenswrapper[33572]: I1204 22:38:53.227604 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/e8117530-0fdd-479a-a481-79bc514bec3d-os-client-config\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.235393 master-0 kubenswrapper[33572]: I1204 22:38:53.235355 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnrqp\" (UniqueName: \"kubernetes.io/projected/e8117530-0fdd-479a-a481-79bc514bec3d-kube-api-access-gnrqp\") pod \"sushy-emulator-64488c485f-tjtp2\" (UID: \"e8117530-0fdd-479a-a481-79bc514bec3d\") " pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.308418 master-0 kubenswrapper[33572]: I1204 22:38:53.308363 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:38:53.977907 master-0 kubenswrapper[33572]: I1204 22:38:53.972689 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-64488c485f-tjtp2"] Dec 04 22:38:54.156622 master-0 kubenswrapper[33572]: I1204 22:38:54.156284 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" event={"ID":"e8117530-0fdd-479a-a481-79bc514bec3d","Type":"ContainerStarted","Data":"c7b6c7b41f8ed3134324cfca61406091aa8b1e29eae778c7eb477a47ac22000c"} Dec 04 22:38:54.548540 master-0 kubenswrapper[33572]: I1204 22:38:54.548451 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a4a33e-5739-4a56-8a8a-98bfd642d564" path="/var/lib/kubelet/pods/41a4a33e-5739-4a56-8a8a-98bfd642d564/volumes" Dec 04 22:38:55.171802 master-0 kubenswrapper[33572]: I1204 22:38:55.171733 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" event={"ID":"e8117530-0fdd-479a-a481-79bc514bec3d","Type":"ContainerStarted","Data":"92ea4370b4a91f352b71aeb243ca8f1bfc28163f69fb4edd70ef4dbe666735f5"} Dec 04 22:38:55.207338 master-0 kubenswrapper[33572]: I1204 22:38:55.207195 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" podStartSLOduration=3.206871258 podStartE2EDuration="3.206871258s" podCreationTimestamp="2025-12-04 22:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 22:38:55.204294968 +0000 UTC m=+1198.931820617" watchObservedRunningTime="2025-12-04 22:38:55.206871258 +0000 UTC m=+1198.934396907" Dec 04 22:39:03.310111 master-0 kubenswrapper[33572]: I1204 22:39:03.309945 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:39:03.310111 master-0 kubenswrapper[33572]: I1204 22:39:03.310083 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:39:03.325521 master-0 kubenswrapper[33572]: I1204 22:39:03.325427 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:39:04.347664 master-0 kubenswrapper[33572]: I1204 22:39:04.347599 33572 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-64488c485f-tjtp2" Dec 04 22:39:37.191419 master-0 kubenswrapper[33572]: E1204 22:39:37.191330 33572 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:34484->192.168.32.10:37473: write tcp 192.168.32.10:34484->192.168.32.10:37473: write: broken pipe Dec 04 22:40:05.834575 master-0 kubenswrapper[33572]: I1204 22:40:05.826580 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x6hct"] Dec 04 22:40:05.883869 master-0 kubenswrapper[33572]: I1204 22:40:05.883758 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:05.891305 master-0 kubenswrapper[33572]: I1204 22:40:05.891080 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6hct"] Dec 04 22:40:06.072039 master-0 kubenswrapper[33572]: I1204 22:40:06.071944 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kffqt\" (UniqueName: \"kubernetes.io/projected/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-kube-api-access-kffqt\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.072039 master-0 kubenswrapper[33572]: I1204 22:40:06.072040 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-catalog-content\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.072405 master-0 kubenswrapper[33572]: I1204 22:40:06.072070 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-utilities\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.174826 master-0 kubenswrapper[33572]: I1204 22:40:06.174688 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kffqt\" (UniqueName: \"kubernetes.io/projected/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-kube-api-access-kffqt\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.175139 master-0 kubenswrapper[33572]: I1204 22:40:06.175116 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-utilities\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.175264 master-0 kubenswrapper[33572]: I1204 22:40:06.175248 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-catalog-content\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.175724 master-0 kubenswrapper[33572]: I1204 22:40:06.175681 33572 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-utilities\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.175814 master-0 kubenswrapper[33572]: I1204 22:40:06.175773 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-catalog-content\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.206104 master-0 kubenswrapper[33572]: I1204 22:40:06.206046 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kffqt\" (UniqueName: \"kubernetes.io/projected/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-kube-api-access-kffqt\") pod \"redhat-operators-x6hct\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.214324 master-0 kubenswrapper[33572]: I1204 22:40:06.214255 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:06.766816 master-0 kubenswrapper[33572]: I1204 22:40:06.766763 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x6hct"] Dec 04 22:40:07.201195 master-0 kubenswrapper[33572]: I1204 22:40:07.200657 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-znqsr"] Dec 04 22:40:07.206440 master-0 kubenswrapper[33572]: I1204 22:40:07.206399 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.230591 master-0 kubenswrapper[33572]: I1204 22:40:07.225553 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-znqsr"] Dec 04 22:40:07.308058 master-0 kubenswrapper[33572]: I1204 22:40:07.307939 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074eabcb-1348-4a02-9380-69479850bec2-catalog-content\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.308238 master-0 kubenswrapper[33572]: I1204 22:40:07.308060 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074eabcb-1348-4a02-9380-69479850bec2-utilities\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.308238 master-0 kubenswrapper[33572]: I1204 22:40:07.308109 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfrn8\" (UniqueName: \"kubernetes.io/projected/074eabcb-1348-4a02-9380-69479850bec2-kube-api-access-xfrn8\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.356457 master-0 kubenswrapper[33572]: E1204 22:40:07.356396 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf76deb1d_8ac8_4b6b_b7ed_03c4b7638635.slice/crio-88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf76deb1d_8ac8_4b6b_b7ed_03c4b7638635.slice/crio-conmon-88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9.scope\": RecentStats: unable to find data in memory cache]" Dec 04 22:40:07.410667 master-0 kubenswrapper[33572]: I1204 22:40:07.410603 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074eabcb-1348-4a02-9380-69479850bec2-catalog-content\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.410921 master-0 kubenswrapper[33572]: I1204 22:40:07.410732 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074eabcb-1348-4a02-9380-69479850bec2-utilities\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.410921 master-0 kubenswrapper[33572]: I1204 22:40:07.410801 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfrn8\" (UniqueName: \"kubernetes.io/projected/074eabcb-1348-4a02-9380-69479850bec2-kube-api-access-xfrn8\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.411233 master-0 kubenswrapper[33572]: I1204 22:40:07.411193 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074eabcb-1348-4a02-9380-69479850bec2-catalog-content\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.411586 master-0 kubenswrapper[33572]: I1204 22:40:07.411546 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074eabcb-1348-4a02-9380-69479850bec2-utilities\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.429908 master-0 kubenswrapper[33572]: I1204 22:40:07.429868 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfrn8\" (UniqueName: \"kubernetes.io/projected/074eabcb-1348-4a02-9380-69479850bec2-kube-api-access-xfrn8\") pod \"certified-operators-znqsr\" (UID: \"074eabcb-1348-4a02-9380-69479850bec2\") " pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:07.487106 master-0 kubenswrapper[33572]: I1204 22:40:07.487057 33572 generic.go:334] "Generic (PLEG): container finished" podID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerID="88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9" exitCode=0 Dec 04 22:40:07.487106 master-0 kubenswrapper[33572]: I1204 22:40:07.487107 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6hct" event={"ID":"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635","Type":"ContainerDied","Data":"88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9"} Dec 04 22:40:07.487340 
master-0 kubenswrapper[33572]: I1204 22:40:07.487132 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6hct" event={"ID":"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635","Type":"ContainerStarted","Data":"aff92277e18b18f9c705455782d2ae5a41e7697dcbf84506c0a4dc8710b55465"} Dec 04 22:40:07.556621 master-0 kubenswrapper[33572]: I1204 22:40:07.556544 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:08.062238 master-0 kubenswrapper[33572]: I1204 22:40:08.062182 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-znqsr"] Dec 04 22:40:08.086922 master-0 kubenswrapper[33572]: W1204 22:40:08.086844 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod074eabcb_1348_4a02_9380_69479850bec2.slice/crio-fd4c5f15995fffa6980ac3f2fcf5df96e008c5ce024d5021f86c6019b47f6c50 WatchSource:0}: Error finding container fd4c5f15995fffa6980ac3f2fcf5df96e008c5ce024d5021f86c6019b47f6c50: Status 404 returned error can't find the container with id fd4c5f15995fffa6980ac3f2fcf5df96e008c5ce024d5021f86c6019b47f6c50 Dec 04 22:40:08.505132 master-0 kubenswrapper[33572]: I1204 22:40:08.505036 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6hct" event={"ID":"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635","Type":"ContainerStarted","Data":"742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7"} Dec 04 22:40:08.506829 master-0 kubenswrapper[33572]: I1204 22:40:08.506793 33572 generic.go:334] "Generic (PLEG): container finished" podID="074eabcb-1348-4a02-9380-69479850bec2" containerID="b48530121b9b78a0c1b69fb28d454ecfd6ce6069c30d6b635e7766c6e1e27cc9" exitCode=0 Dec 04 22:40:08.506901 master-0 kubenswrapper[33572]: I1204 22:40:08.506827 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znqsr" event={"ID":"074eabcb-1348-4a02-9380-69479850bec2","Type":"ContainerDied","Data":"b48530121b9b78a0c1b69fb28d454ecfd6ce6069c30d6b635e7766c6e1e27cc9"} Dec 04 22:40:08.506901 master-0 kubenswrapper[33572]: I1204 22:40:08.506850 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znqsr" event={"ID":"074eabcb-1348-4a02-9380-69479850bec2","Type":"ContainerStarted","Data":"fd4c5f15995fffa6980ac3f2fcf5df96e008c5ce024d5021f86c6019b47f6c50"} Dec 04 22:40:09.010037 master-0 kubenswrapper[33572]: I1204 22:40:09.009961 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xprnb"] Dec 04 22:40:09.014200 master-0 kubenswrapper[33572]: I1204 22:40:09.014141 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.059540 master-0 kubenswrapper[33572]: I1204 22:40:09.044863 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xprnb"] Dec 04 22:40:09.158518 master-0 kubenswrapper[33572]: I1204 22:40:09.158351 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-catalog-content\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.158722 master-0 kubenswrapper[33572]: I1204 22:40:09.158680 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jh6c\" (UniqueName: \"kubernetes.io/projected/ed45cd7a-60da-483e-a419-ccadecd30061-kube-api-access-9jh6c\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.158880 master-0 kubenswrapper[33572]: I1204 22:40:09.158835 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-utilities\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.262139 master-0 kubenswrapper[33572]: I1204 22:40:09.261413 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-utilities\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.262323 master-0 kubenswrapper[33572]: I1204 22:40:09.262070 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-utilities\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.262520 master-0 kubenswrapper[33572]: I1204 22:40:09.262480 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-catalog-content\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.262752 master-0 kubenswrapper[33572]: I1204 22:40:09.262724 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jh6c\" (UniqueName: \"kubernetes.io/projected/ed45cd7a-60da-483e-a419-ccadecd30061-kube-api-access-9jh6c\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.263028 master-0 kubenswrapper[33572]: I1204 22:40:09.263000 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-catalog-content\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " 
pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.283892 master-0 kubenswrapper[33572]: I1204 22:40:09.283825 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jh6c\" (UniqueName: \"kubernetes.io/projected/ed45cd7a-60da-483e-a419-ccadecd30061-kube-api-access-9jh6c\") pod \"community-operators-xprnb\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.374032 master-0 kubenswrapper[33572]: I1204 22:40:09.373920 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:09.546787 master-0 kubenswrapper[33572]: I1204 22:40:09.541062 33572 generic.go:334] "Generic (PLEG): container finished" podID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerID="742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7" exitCode=0 Dec 04 22:40:09.546787 master-0 kubenswrapper[33572]: I1204 22:40:09.541117 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6hct" event={"ID":"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635","Type":"ContainerDied","Data":"742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7"} Dec 04 22:40:10.603329 master-0 kubenswrapper[33572]: W1204 22:40:10.601254 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded45cd7a_60da_483e_a419_ccadecd30061.slice/crio-0d69d983925716e4b5598b3e998e84220898dd80d4c2f8cdd0b38074c46378f5 WatchSource:0}: Error finding container 0d69d983925716e4b5598b3e998e84220898dd80d4c2f8cdd0b38074c46378f5: Status 404 returned error can't find the container with id 0d69d983925716e4b5598b3e998e84220898dd80d4c2f8cdd0b38074c46378f5 Dec 04 22:40:10.653305 master-0 kubenswrapper[33572]: I1204 22:40:10.653193 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tdhk6"] Dec 04 22:40:10.658276 master-0 kubenswrapper[33572]: I1204 22:40:10.658221 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.665975 master-0 kubenswrapper[33572]: I1204 22:40:10.665925 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xprnb"] Dec 04 22:40:10.686002 master-0 kubenswrapper[33572]: I1204 22:40:10.685937 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdhk6"] Dec 04 22:40:10.802982 master-0 kubenswrapper[33572]: I1204 22:40:10.802907 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-utilities\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.803287 master-0 kubenswrapper[33572]: I1204 22:40:10.803052 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-catalog-content\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.803287 master-0 kubenswrapper[33572]: I1204 22:40:10.803235 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjs7f\" (UniqueName: \"kubernetes.io/projected/87297d8b-d0fb-4571-b239-648415089d10-kube-api-access-bjs7f\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.905341 master-0 kubenswrapper[33572]: I1204 22:40:10.905275 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjs7f\" (UniqueName: \"kubernetes.io/projected/87297d8b-d0fb-4571-b239-648415089d10-kube-api-access-bjs7f\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.905607 master-0 kubenswrapper[33572]: I1204 22:40:10.905571 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-utilities\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.905766 master-0 kubenswrapper[33572]: I1204 22:40:10.905627 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-catalog-content\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.906304 master-0 kubenswrapper[33572]: I1204 22:40:10.906273 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-catalog-content\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.907013 master-0 kubenswrapper[33572]: I1204 22:40:10.906957 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-utilities\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:10.924670 master-0 kubenswrapper[33572]: I1204 22:40:10.924633 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjs7f\" (UniqueName: \"kubernetes.io/projected/87297d8b-d0fb-4571-b239-648415089d10-kube-api-access-bjs7f\") pod \"redhat-marketplace-tdhk6\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:11.001527 master-0 kubenswrapper[33572]: I1204 22:40:10.999018 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:11.501163 master-0 kubenswrapper[33572]: I1204 22:40:11.501069 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdhk6"] Dec 04 22:40:11.635998 master-0 kubenswrapper[33572]: I1204 22:40:11.635941 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6hct" event={"ID":"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635","Type":"ContainerStarted","Data":"f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c"} Dec 04 22:40:11.638977 master-0 kubenswrapper[33572]: I1204 22:40:11.638934 33572 generic.go:334] "Generic (PLEG): container finished" podID="ed45cd7a-60da-483e-a419-ccadecd30061" containerID="50d3d989194e85f89f6e7d2eea64a200bd8fa80509ad10ea7d66156481ddcb07" exitCode=0 Dec 04 22:40:11.639039 master-0 kubenswrapper[33572]: I1204 22:40:11.639019 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xprnb" event={"ID":"ed45cd7a-60da-483e-a419-ccadecd30061","Type":"ContainerDied","Data":"50d3d989194e85f89f6e7d2eea64a200bd8fa80509ad10ea7d66156481ddcb07"} Dec 04 22:40:11.639086 master-0 kubenswrapper[33572]: I1204 22:40:11.639050 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xprnb" event={"ID":"ed45cd7a-60da-483e-a419-ccadecd30061","Type":"ContainerStarted","Data":"0d69d983925716e4b5598b3e998e84220898dd80d4c2f8cdd0b38074c46378f5"} Dec 04 22:40:11.643198 master-0 kubenswrapper[33572]: I1204 22:40:11.643169 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdhk6" event={"ID":"87297d8b-d0fb-4571-b239-648415089d10","Type":"ContainerStarted","Data":"9d4c2775a22ec3eda4435620c20858ca15390ada67d2159df92b47c457fefeb2"} Dec 04 22:40:11.669784 master-0 kubenswrapper[33572]: I1204 22:40:11.666610 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x6hct" podStartSLOduration=3.5492267220000002 podStartE2EDuration="6.666589114s" podCreationTimestamp="2025-12-04 22:40:05 +0000 UTC" firstStartedPulling="2025-12-04 22:40:07.48951539 +0000 UTC m=+1271.217041039" lastFinishedPulling="2025-12-04 22:40:10.606877772 +0000 UTC m=+1274.334403431" observedRunningTime="2025-12-04 22:40:11.657619979 +0000 UTC m=+1275.385145628" watchObservedRunningTime="2025-12-04 22:40:11.666589114 +0000 UTC m=+1275.394114763" Dec 04 22:40:12.657672 master-0 kubenswrapper[33572]: I1204 22:40:12.657606 33572 generic.go:334] "Generic (PLEG): container finished" podID="87297d8b-d0fb-4571-b239-648415089d10" containerID="1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb" exitCode=0 
Dec 04 22:40:12.659344 master-0 kubenswrapper[33572]: I1204 22:40:12.659254 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdhk6" event={"ID":"87297d8b-d0fb-4571-b239-648415089d10","Type":"ContainerDied","Data":"1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb"} Dec 04 22:40:15.704211 master-0 kubenswrapper[33572]: I1204 22:40:15.704045 33572 generic.go:334] "Generic (PLEG): container finished" podID="ed45cd7a-60da-483e-a419-ccadecd30061" containerID="3b203b1cbc20c3a53d1d99c7fb55945d994697aabd124ed051c943f3fe1e374a" exitCode=0 Dec 04 22:40:15.704211 master-0 kubenswrapper[33572]: I1204 22:40:15.704156 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xprnb" event={"ID":"ed45cd7a-60da-483e-a419-ccadecd30061","Type":"ContainerDied","Data":"3b203b1cbc20c3a53d1d99c7fb55945d994697aabd124ed051c943f3fe1e374a"} Dec 04 22:40:15.708011 master-0 kubenswrapper[33572]: I1204 22:40:15.707894 33572 generic.go:334] "Generic (PLEG): container finished" podID="074eabcb-1348-4a02-9380-69479850bec2" containerID="6588b789721a40e6f088ea2c1087a915ed3aa925d54a36ab635e624516a7e848" exitCode=0 Dec 04 22:40:15.708011 master-0 kubenswrapper[33572]: I1204 22:40:15.707950 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znqsr" event={"ID":"074eabcb-1348-4a02-9380-69479850bec2","Type":"ContainerDied","Data":"6588b789721a40e6f088ea2c1087a915ed3aa925d54a36ab635e624516a7e848"} Dec 04 22:40:16.214954 master-0 kubenswrapper[33572]: I1204 22:40:16.214826 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:16.215202 master-0 kubenswrapper[33572]: I1204 22:40:16.215009 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:16.732380 master-0 kubenswrapper[33572]: I1204 22:40:16.732217 33572 generic.go:334] "Generic (PLEG): container finished" podID="87297d8b-d0fb-4571-b239-648415089d10" containerID="0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303" exitCode=0 Dec 04 22:40:16.733259 master-0 kubenswrapper[33572]: I1204 22:40:16.732351 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdhk6" event={"ID":"87297d8b-d0fb-4571-b239-648415089d10","Type":"ContainerDied","Data":"0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303"} Dec 04 22:40:17.302783 master-0 kubenswrapper[33572]: I1204 22:40:17.302698 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x6hct" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="registry-server" probeResult="failure" output=< Dec 04 22:40:17.302783 master-0 kubenswrapper[33572]: timeout: failed to connect service ":50051" within 1s Dec 04 22:40:17.302783 master-0 kubenswrapper[33572]: > Dec 04 22:40:19.827125 master-0 kubenswrapper[33572]: I1204 22:40:19.827001 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-znqsr" event={"ID":"074eabcb-1348-4a02-9380-69479850bec2","Type":"ContainerStarted","Data":"ded1000d0de9f67a7d8db7015f0bf3386f3b1a39ccbbd5210921aa1994e5ddfc"} Dec 04 22:40:19.836996 master-0 kubenswrapper[33572]: I1204 22:40:19.836907 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xprnb" 
event={"ID":"ed45cd7a-60da-483e-a419-ccadecd30061","Type":"ContainerStarted","Data":"c5b18b6e9b674202563f92963e9e193376be276678a2b15090e8aec1f0329d4f"} Dec 04 22:40:19.845375 master-0 kubenswrapper[33572]: I1204 22:40:19.844940 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdhk6" event={"ID":"87297d8b-d0fb-4571-b239-648415089d10","Type":"ContainerStarted","Data":"1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b"} Dec 04 22:40:20.055150 master-0 kubenswrapper[33572]: I1204 22:40:20.055034 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-znqsr" podStartSLOduration=2.552924606 podStartE2EDuration="13.055011556s" podCreationTimestamp="2025-12-04 22:40:07 +0000 UTC" firstStartedPulling="2025-12-04 22:40:08.508938203 +0000 UTC m=+1272.236463852" lastFinishedPulling="2025-12-04 22:40:19.011024903 +0000 UTC m=+1282.738550802" observedRunningTime="2025-12-04 22:40:20.050162864 +0000 UTC m=+1283.777688513" watchObservedRunningTime="2025-12-04 22:40:20.055011556 +0000 UTC m=+1283.782537215" Dec 04 22:40:20.189423 master-0 kubenswrapper[33572]: I1204 22:40:20.189313 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xprnb" podStartSLOduration=4.911212801 podStartE2EDuration="12.18928527s" podCreationTimestamp="2025-12-04 22:40:08 +0000 UTC" firstStartedPulling="2025-12-04 22:40:11.641948932 +0000 UTC m=+1275.369474581" lastFinishedPulling="2025-12-04 22:40:18.920021401 +0000 UTC m=+1282.647547050" observedRunningTime="2025-12-04 22:40:20.173640573 +0000 UTC m=+1283.901166242" watchObservedRunningTime="2025-12-04 22:40:20.18928527 +0000 UTC m=+1283.916810959" Dec 04 22:40:20.205295 master-0 kubenswrapper[33572]: I1204 22:40:20.205183 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tdhk6" podStartSLOduration=5.387257165 podStartE2EDuration="10.205159282s" podCreationTimestamp="2025-12-04 22:40:10 +0000 UTC" firstStartedPulling="2025-12-04 22:40:14.371855052 +0000 UTC m=+1278.099380701" lastFinishedPulling="2025-12-04 22:40:19.189757169 +0000 UTC m=+1282.917282818" observedRunningTime="2025-12-04 22:40:20.199070026 +0000 UTC m=+1283.926595705" watchObservedRunningTime="2025-12-04 22:40:20.205159282 +0000 UTC m=+1283.932684931" Dec 04 22:40:20.999849 master-0 kubenswrapper[33572]: I1204 22:40:20.999784 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:21.000424 master-0 kubenswrapper[33572]: I1204 22:40:21.000045 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:21.056859 master-0 kubenswrapper[33572]: I1204 22:40:21.056806 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:26.313038 master-0 kubenswrapper[33572]: I1204 22:40:26.312964 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:26.412713 master-0 kubenswrapper[33572]: I1204 22:40:26.412656 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:27.556743 master-0 kubenswrapper[33572]: I1204 22:40:27.556696 33572 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:27.558014 master-0 kubenswrapper[33572]: I1204 22:40:27.557996 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:27.637451 master-0 kubenswrapper[33572]: I1204 22:40:27.637372 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:28.055940 master-0 kubenswrapper[33572]: I1204 22:40:28.055857 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-znqsr" Dec 04 22:40:29.375231 master-0 kubenswrapper[33572]: I1204 22:40:29.375117 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:29.375231 master-0 kubenswrapper[33572]: I1204 22:40:29.375215 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:29.463326 master-0 kubenswrapper[33572]: I1204 22:40:29.463234 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:30.106132 master-0 kubenswrapper[33572]: I1204 22:40:30.106074 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:30.287959 master-0 kubenswrapper[33572]: I1204 22:40:30.287873 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6hct"] Dec 04 22:40:30.288432 master-0 kubenswrapper[33572]: I1204 22:40:30.288220 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x6hct" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="registry-server" containerID="cri-o://f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c" gracePeriod=2 Dec 04 22:40:30.874447 master-0 kubenswrapper[33572]: I1204 22:40:30.874394 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:31.011378 master-0 kubenswrapper[33572]: I1204 22:40:31.011310 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-utilities\") pod \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " Dec 04 22:40:31.011732 master-0 kubenswrapper[33572]: I1204 22:40:31.011490 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-catalog-content\") pod \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " Dec 04 22:40:31.011844 master-0 kubenswrapper[33572]: I1204 22:40:31.011777 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kffqt\" (UniqueName: \"kubernetes.io/projected/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-kube-api-access-kffqt\") pod \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\" (UID: \"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635\") " Dec 04 22:40:31.012475 master-0 kubenswrapper[33572]: I1204 22:40:31.012391 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-utilities" (OuterVolumeSpecName: "utilities") pod "f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" (UID: "f76deb1d-8ac8-4b6b-b7ed-03c4b7638635"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:40:31.012614 master-0 kubenswrapper[33572]: I1204 22:40:31.012567 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:31.019942 master-0 kubenswrapper[33572]: I1204 22:40:31.019815 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-kube-api-access-kffqt" (OuterVolumeSpecName: "kube-api-access-kffqt") pod "f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" (UID: "f76deb1d-8ac8-4b6b-b7ed-03c4b7638635"). InnerVolumeSpecName "kube-api-access-kffqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:40:31.044163 master-0 kubenswrapper[33572]: I1204 22:40:31.044075 33572 generic.go:334] "Generic (PLEG): container finished" podID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerID="f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c" exitCode=0 Dec 04 22:40:31.044163 master-0 kubenswrapper[33572]: I1204 22:40:31.044132 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x6hct" Dec 04 22:40:31.044559 master-0 kubenswrapper[33572]: I1204 22:40:31.044202 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6hct" event={"ID":"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635","Type":"ContainerDied","Data":"f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c"} Dec 04 22:40:31.044559 master-0 kubenswrapper[33572]: I1204 22:40:31.044280 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x6hct" event={"ID":"f76deb1d-8ac8-4b6b-b7ed-03c4b7638635","Type":"ContainerDied","Data":"aff92277e18b18f9c705455782d2ae5a41e7697dcbf84506c0a4dc8710b55465"} Dec 04 22:40:31.044559 master-0 kubenswrapper[33572]: I1204 22:40:31.044317 33572 scope.go:117] "RemoveContainer" containerID="f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c" Dec 04 22:40:31.077119 master-0 kubenswrapper[33572]: I1204 22:40:31.077053 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:31.106098 master-0 kubenswrapper[33572]: I1204 22:40:31.104451 33572 scope.go:117] "RemoveContainer" containerID="742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7" Dec 04 22:40:31.116614 master-0 kubenswrapper[33572]: I1204 22:40:31.116546 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kffqt\" (UniqueName: \"kubernetes.io/projected/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-kube-api-access-kffqt\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:31.154548 master-0 kubenswrapper[33572]: I1204 22:40:31.154487 33572 scope.go:117] "RemoveContainer" containerID="88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9" Dec 04 22:40:31.188959 master-0 kubenswrapper[33572]: I1204 22:40:31.188862 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" (UID: "f76deb1d-8ac8-4b6b-b7ed-03c4b7638635"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:40:31.206357 master-0 kubenswrapper[33572]: I1204 22:40:31.206305 33572 scope.go:117] "RemoveContainer" containerID="f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c" Dec 04 22:40:31.208018 master-0 kubenswrapper[33572]: E1204 22:40:31.207955 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c\": container with ID starting with f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c not found: ID does not exist" containerID="f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c" Dec 04 22:40:31.208137 master-0 kubenswrapper[33572]: I1204 22:40:31.208020 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c"} err="failed to get container status \"f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c\": rpc error: code = NotFound desc = could not find container \"f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c\": container with ID starting with f1898050e7b57d455d71d4c5a5513eee19029d6d6607451cd8178fa4b7bd545c not found: ID does not exist" Dec 04 22:40:31.208137 master-0 kubenswrapper[33572]: I1204 22:40:31.208060 33572 scope.go:117] "RemoveContainer" containerID="742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7" Dec 04 22:40:31.209915 master-0 kubenswrapper[33572]: E1204 22:40:31.209852 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7\": container with ID starting with 742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7 not found: ID does not exist" containerID="742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7" Dec 04 22:40:31.210094 master-0 kubenswrapper[33572]: I1204 22:40:31.209914 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7"} err="failed to get container status \"742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7\": rpc error: code = NotFound desc = could not find container \"742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7\": container with ID starting with 742dbe3d0688cbbf24e8b1d2d3ebf7a2469b4243c2c9f7df0dc7b0585718e9e7 not found: ID does not exist" Dec 04 22:40:31.210094 master-0 kubenswrapper[33572]: I1204 22:40:31.209988 33572 scope.go:117] "RemoveContainer" containerID="88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9" Dec 04 22:40:31.210812 master-0 kubenswrapper[33572]: E1204 22:40:31.210756 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9\": container with ID starting with 88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9 not found: ID does not exist" containerID="88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9" Dec 04 22:40:31.210812 master-0 kubenswrapper[33572]: I1204 22:40:31.210800 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9"} err="failed to get 
container status \"88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9\": rpc error: code = NotFound desc = could not find container \"88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9\": container with ID starting with 88dcd63f643b5d7cd3eddee9489a85b8a830bdca3f5aa0eddb5df5f2321709a9 not found: ID does not exist" Dec 04 22:40:31.219543 master-0 kubenswrapper[33572]: I1204 22:40:31.219484 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:31.326804 master-0 kubenswrapper[33572]: I1204 22:40:31.326603 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-znqsr"] Dec 04 22:40:31.467384 master-0 kubenswrapper[33572]: I1204 22:40:31.467291 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x6hct"] Dec 04 22:40:31.492272 master-0 kubenswrapper[33572]: I1204 22:40:31.492175 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x6hct"] Dec 04 22:40:31.699675 master-0 kubenswrapper[33572]: I1204 22:40:31.699542 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sw6sx"] Dec 04 22:40:31.699886 master-0 kubenswrapper[33572]: I1204 22:40:31.699849 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sw6sx" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="registry-server" containerID="cri-o://6ac117d3f888173d5f0f8aae01fddab59b22a163a9082dce4aa284a60ea267f0" gracePeriod=2 Dec 04 22:40:32.115487 master-0 kubenswrapper[33572]: I1204 22:40:32.115420 33572 generic.go:334] "Generic (PLEG): container finished" podID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerID="6ac117d3f888173d5f0f8aae01fddab59b22a163a9082dce4aa284a60ea267f0" exitCode=0 Dec 04 22:40:32.115487 master-0 kubenswrapper[33572]: I1204 22:40:32.115488 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerDied","Data":"6ac117d3f888173d5f0f8aae01fddab59b22a163a9082dce4aa284a60ea267f0"} Dec 04 22:40:32.311652 master-0 kubenswrapper[33572]: I1204 22:40:32.311535 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:40:32.466157 master-0 kubenswrapper[33572]: I1204 22:40:32.466032 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9rxt\" (UniqueName: \"kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt\") pod \"29828f55-427b-4fe3-8713-03bcd6ac9dec\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " Dec 04 22:40:32.466366 master-0 kubenswrapper[33572]: I1204 22:40:32.466252 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities\") pod \"29828f55-427b-4fe3-8713-03bcd6ac9dec\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " Dec 04 22:40:32.466366 master-0 kubenswrapper[33572]: I1204 22:40:32.466329 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content\") pod \"29828f55-427b-4fe3-8713-03bcd6ac9dec\" (UID: \"29828f55-427b-4fe3-8713-03bcd6ac9dec\") " Dec 04 22:40:32.467815 master-0 kubenswrapper[33572]: I1204 22:40:32.467771 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities" (OuterVolumeSpecName: "utilities") pod "29828f55-427b-4fe3-8713-03bcd6ac9dec" (UID: "29828f55-427b-4fe3-8713-03bcd6ac9dec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:40:32.469784 master-0 kubenswrapper[33572]: I1204 22:40:32.469750 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt" (OuterVolumeSpecName: "kube-api-access-t9rxt") pod "29828f55-427b-4fe3-8713-03bcd6ac9dec" (UID: "29828f55-427b-4fe3-8713-03bcd6ac9dec"). InnerVolumeSpecName "kube-api-access-t9rxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:40:32.507647 master-0 kubenswrapper[33572]: I1204 22:40:32.507575 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29828f55-427b-4fe3-8713-03bcd6ac9dec" (UID: "29828f55-427b-4fe3-8713-03bcd6ac9dec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:40:32.539355 master-0 kubenswrapper[33572]: I1204 22:40:32.539302 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" path="/var/lib/kubelet/pods/f76deb1d-8ac8-4b6b-b7ed-03c4b7638635/volumes" Dec 04 22:40:32.569620 master-0 kubenswrapper[33572]: I1204 22:40:32.569524 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:32.569620 master-0 kubenswrapper[33572]: I1204 22:40:32.569568 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29828f55-427b-4fe3-8713-03bcd6ac9dec-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:32.569620 master-0 kubenswrapper[33572]: I1204 22:40:32.569581 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9rxt\" (UniqueName: \"kubernetes.io/projected/29828f55-427b-4fe3-8713-03bcd6ac9dec-kube-api-access-t9rxt\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:32.708856 master-0 kubenswrapper[33572]: I1204 22:40:32.708736 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xprnb"] Dec 04 22:40:32.709353 master-0 kubenswrapper[33572]: I1204 22:40:32.709277 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xprnb" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" containerName="registry-server" containerID="cri-o://c5b18b6e9b674202563f92963e9e193376be276678a2b15090e8aec1f0329d4f" gracePeriod=2 Dec 04 22:40:33.141267 master-0 kubenswrapper[33572]: I1204 22:40:33.140912 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sw6sx" event={"ID":"29828f55-427b-4fe3-8713-03bcd6ac9dec","Type":"ContainerDied","Data":"ee18ee29a901424dbc24deaf0f12aa2ff23c84383c4ff6df05066daea9fbaebe"} Dec 04 22:40:33.141267 master-0 kubenswrapper[33572]: I1204 22:40:33.140978 33572 scope.go:117] "RemoveContainer" containerID="6ac117d3f888173d5f0f8aae01fddab59b22a163a9082dce4aa284a60ea267f0" Dec 04 22:40:33.141267 master-0 kubenswrapper[33572]: I1204 22:40:33.141097 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sw6sx" Dec 04 22:40:33.178262 master-0 kubenswrapper[33572]: I1204 22:40:33.178164 33572 generic.go:334] "Generic (PLEG): container finished" podID="ed45cd7a-60da-483e-a419-ccadecd30061" containerID="c5b18b6e9b674202563f92963e9e193376be276678a2b15090e8aec1f0329d4f" exitCode=0 Dec 04 22:40:33.178262 master-0 kubenswrapper[33572]: I1204 22:40:33.178212 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xprnb" event={"ID":"ed45cd7a-60da-483e-a419-ccadecd30061","Type":"ContainerDied","Data":"c5b18b6e9b674202563f92963e9e193376be276678a2b15090e8aec1f0329d4f"} Dec 04 22:40:33.198049 master-0 kubenswrapper[33572]: I1204 22:40:33.196642 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sw6sx"] Dec 04 22:40:33.215421 master-0 kubenswrapper[33572]: I1204 22:40:33.211846 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sw6sx"] Dec 04 22:40:33.220815 master-0 kubenswrapper[33572]: I1204 22:40:33.220662 33572 scope.go:117] "RemoveContainer" containerID="5f1c65cf31bac9c169d2527d0952f6b2fb651f148801aa43c79ceb4a8adb4da6" Dec 04 22:40:33.246112 master-0 kubenswrapper[33572]: I1204 22:40:33.242959 33572 scope.go:117] "RemoveContainer" containerID="5b29db78fe5a1942ea20ecc7d711d841b8eb39751995722550ca54e6750f1a0c" Dec 04 22:40:33.347676 master-0 kubenswrapper[33572]: I1204 22:40:33.347638 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:33.512256 master-0 kubenswrapper[33572]: I1204 22:40:33.512135 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-utilities\") pod \"ed45cd7a-60da-483e-a419-ccadecd30061\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " Dec 04 22:40:33.512449 master-0 kubenswrapper[33572]: I1204 22:40:33.512354 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jh6c\" (UniqueName: \"kubernetes.io/projected/ed45cd7a-60da-483e-a419-ccadecd30061-kube-api-access-9jh6c\") pod \"ed45cd7a-60da-483e-a419-ccadecd30061\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " Dec 04 22:40:33.512449 master-0 kubenswrapper[33572]: I1204 22:40:33.512385 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-catalog-content\") pod \"ed45cd7a-60da-483e-a419-ccadecd30061\" (UID: \"ed45cd7a-60da-483e-a419-ccadecd30061\") " Dec 04 22:40:33.514033 master-0 kubenswrapper[33572]: I1204 22:40:33.513748 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-utilities" (OuterVolumeSpecName: "utilities") pod "ed45cd7a-60da-483e-a419-ccadecd30061" (UID: "ed45cd7a-60da-483e-a419-ccadecd30061"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:40:33.519556 master-0 kubenswrapper[33572]: I1204 22:40:33.519483 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed45cd7a-60da-483e-a419-ccadecd30061-kube-api-access-9jh6c" (OuterVolumeSpecName: "kube-api-access-9jh6c") pod "ed45cd7a-60da-483e-a419-ccadecd30061" (UID: "ed45cd7a-60da-483e-a419-ccadecd30061"). InnerVolumeSpecName "kube-api-access-9jh6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:40:33.592655 master-0 kubenswrapper[33572]: I1204 22:40:33.592569 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed45cd7a-60da-483e-a419-ccadecd30061" (UID: "ed45cd7a-60da-483e-a419-ccadecd30061"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:40:33.615244 master-0 kubenswrapper[33572]: I1204 22:40:33.615203 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:33.615244 master-0 kubenswrapper[33572]: I1204 22:40:33.615231 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jh6c\" (UniqueName: \"kubernetes.io/projected/ed45cd7a-60da-483e-a419-ccadecd30061-kube-api-access-9jh6c\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:33.615244 master-0 kubenswrapper[33572]: I1204 22:40:33.615241 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed45cd7a-60da-483e-a419-ccadecd30061-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:34.196883 master-0 kubenswrapper[33572]: I1204 22:40:34.196804 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xprnb" event={"ID":"ed45cd7a-60da-483e-a419-ccadecd30061","Type":"ContainerDied","Data":"0d69d983925716e4b5598b3e998e84220898dd80d4c2f8cdd0b38074c46378f5"} Dec 04 22:40:34.197612 master-0 kubenswrapper[33572]: I1204 22:40:34.196908 33572 scope.go:117] "RemoveContainer" containerID="c5b18b6e9b674202563f92963e9e193376be276678a2b15090e8aec1f0329d4f" Dec 04 22:40:34.197612 master-0 kubenswrapper[33572]: I1204 22:40:34.197060 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xprnb" Dec 04 22:40:34.253756 master-0 kubenswrapper[33572]: I1204 22:40:34.253698 33572 scope.go:117] "RemoveContainer" containerID="3b203b1cbc20c3a53d1d99c7fb55945d994697aabd124ed051c943f3fe1e374a" Dec 04 22:40:34.263245 master-0 kubenswrapper[33572]: I1204 22:40:34.262169 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xprnb"] Dec 04 22:40:34.272956 master-0 kubenswrapper[33572]: I1204 22:40:34.272118 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xprnb"] Dec 04 22:40:34.295632 master-0 kubenswrapper[33572]: I1204 22:40:34.295571 33572 scope.go:117] "RemoveContainer" containerID="50d3d989194e85f89f6e7d2eea64a200bd8fa80509ad10ea7d66156481ddcb07" Dec 04 22:40:34.541308 master-0 kubenswrapper[33572]: I1204 22:40:34.541216 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" path="/var/lib/kubelet/pods/29828f55-427b-4fe3-8713-03bcd6ac9dec/volumes" Dec 04 22:40:34.542095 master-0 kubenswrapper[33572]: I1204 22:40:34.542058 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" path="/var/lib/kubelet/pods/ed45cd7a-60da-483e-a419-ccadecd30061/volumes" Dec 04 22:40:35.094597 master-0 kubenswrapper[33572]: I1204 22:40:35.091016 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdhk6"] Dec 04 22:40:35.094597 master-0 kubenswrapper[33572]: I1204 22:40:35.091257 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tdhk6" podUID="87297d8b-d0fb-4571-b239-648415089d10" containerName="registry-server" containerID="cri-o://1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b" gracePeriod=2 Dec 04 22:40:35.752836 master-0 kubenswrapper[33572]: I1204 22:40:35.752679 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:35.793596 master-0 kubenswrapper[33572]: I1204 22:40:35.793495 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjs7f\" (UniqueName: \"kubernetes.io/projected/87297d8b-d0fb-4571-b239-648415089d10-kube-api-access-bjs7f\") pod \"87297d8b-d0fb-4571-b239-648415089d10\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " Dec 04 22:40:35.793817 master-0 kubenswrapper[33572]: I1204 22:40:35.793710 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-utilities\") pod \"87297d8b-d0fb-4571-b239-648415089d10\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " Dec 04 22:40:35.793862 master-0 kubenswrapper[33572]: I1204 22:40:35.793817 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-catalog-content\") pod \"87297d8b-d0fb-4571-b239-648415089d10\" (UID: \"87297d8b-d0fb-4571-b239-648415089d10\") " Dec 04 22:40:35.795963 master-0 kubenswrapper[33572]: I1204 22:40:35.795411 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-utilities" (OuterVolumeSpecName: "utilities") pod "87297d8b-d0fb-4571-b239-648415089d10" (UID: "87297d8b-d0fb-4571-b239-648415089d10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:40:35.800305 master-0 kubenswrapper[33572]: I1204 22:40:35.798713 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87297d8b-d0fb-4571-b239-648415089d10-kube-api-access-bjs7f" (OuterVolumeSpecName: "kube-api-access-bjs7f") pod "87297d8b-d0fb-4571-b239-648415089d10" (UID: "87297d8b-d0fb-4571-b239-648415089d10"). InnerVolumeSpecName "kube-api-access-bjs7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:40:35.832100 master-0 kubenswrapper[33572]: I1204 22:40:35.832026 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87297d8b-d0fb-4571-b239-648415089d10" (UID: "87297d8b-d0fb-4571-b239-648415089d10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:40:35.896414 master-0 kubenswrapper[33572]: I1204 22:40:35.896377 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:35.896609 master-0 kubenswrapper[33572]: I1204 22:40:35.896591 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjs7f\" (UniqueName: \"kubernetes.io/projected/87297d8b-d0fb-4571-b239-648415089d10-kube-api-access-bjs7f\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:35.896685 master-0 kubenswrapper[33572]: I1204 22:40:35.896674 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87297d8b-d0fb-4571-b239-648415089d10-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:40:36.233893 master-0 kubenswrapper[33572]: I1204 22:40:36.233755 33572 generic.go:334] "Generic (PLEG): container finished" podID="87297d8b-d0fb-4571-b239-648415089d10" containerID="1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b" exitCode=0 Dec 04 22:40:36.233893 master-0 kubenswrapper[33572]: I1204 22:40:36.233803 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdhk6" event={"ID":"87297d8b-d0fb-4571-b239-648415089d10","Type":"ContainerDied","Data":"1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b"} Dec 04 22:40:36.233893 master-0 kubenswrapper[33572]: I1204 22:40:36.233820 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdhk6" Dec 04 22:40:36.233893 master-0 kubenswrapper[33572]: I1204 22:40:36.233838 33572 scope.go:117] "RemoveContainer" containerID="1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b" Dec 04 22:40:36.234196 master-0 kubenswrapper[33572]: I1204 22:40:36.233829 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdhk6" event={"ID":"87297d8b-d0fb-4571-b239-648415089d10","Type":"ContainerDied","Data":"9d4c2775a22ec3eda4435620c20858ca15390ada67d2159df92b47c457fefeb2"} Dec 04 22:40:36.258207 master-0 kubenswrapper[33572]: I1204 22:40:36.258165 33572 scope.go:117] "RemoveContainer" containerID="0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303" Dec 04 22:40:36.283168 master-0 kubenswrapper[33572]: I1204 22:40:36.283088 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdhk6"] Dec 04 22:40:36.297518 master-0 kubenswrapper[33572]: I1204 22:40:36.297196 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdhk6"] Dec 04 22:40:36.312463 master-0 kubenswrapper[33572]: I1204 22:40:36.312411 33572 scope.go:117] "RemoveContainer" containerID="1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb" Dec 04 22:40:36.350466 master-0 kubenswrapper[33572]: I1204 22:40:36.350421 33572 scope.go:117] "RemoveContainer" containerID="1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b" Dec 04 22:40:36.350924 master-0 kubenswrapper[33572]: E1204 22:40:36.350888 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b\": container with ID starting with 
1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b not found: ID does not exist" containerID="1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b" Dec 04 22:40:36.350992 master-0 kubenswrapper[33572]: I1204 22:40:36.350922 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b"} err="failed to get container status \"1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b\": rpc error: code = NotFound desc = could not find container \"1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b\": container with ID starting with 1da89648a7b1c686da1cbe087aa722a2168c9afa5947c2529186ecff7763391b not found: ID does not exist" Dec 04 22:40:36.350992 master-0 kubenswrapper[33572]: I1204 22:40:36.350943 33572 scope.go:117] "RemoveContainer" containerID="0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303" Dec 04 22:40:36.351290 master-0 kubenswrapper[33572]: E1204 22:40:36.351260 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303\": container with ID starting with 0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303 not found: ID does not exist" containerID="0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303" Dec 04 22:40:36.351364 master-0 kubenswrapper[33572]: I1204 22:40:36.351285 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303"} err="failed to get container status \"0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303\": rpc error: code = NotFound desc = could not find container \"0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303\": container with ID starting with 0c6ab658d977b9b57b6feb52ac70563152f2c968fe723babef5a3bbb46bff303 not found: ID does not exist" Dec 04 22:40:36.351364 master-0 kubenswrapper[33572]: I1204 22:40:36.351302 33572 scope.go:117] "RemoveContainer" containerID="1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb" Dec 04 22:40:36.351839 master-0 kubenswrapper[33572]: E1204 22:40:36.351753 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb\": container with ID starting with 1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb not found: ID does not exist" containerID="1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb" Dec 04 22:40:36.351839 master-0 kubenswrapper[33572]: I1204 22:40:36.351810 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb"} err="failed to get container status \"1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb\": rpc error: code = NotFound desc = could not find container \"1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb\": container with ID starting with 1e760bd40622d5d14de432e3b36867d2e6aaee433b54a60807a3659afe9e33bb not found: ID does not exist" Dec 04 22:40:36.544128 master-0 kubenswrapper[33572]: I1204 22:40:36.544016 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87297d8b-d0fb-4571-b239-648415089d10" 
path="/var/lib/kubelet/pods/87297d8b-d0fb-4571-b239-648415089d10/volumes" Dec 04 22:40:58.354293 master-0 kubenswrapper[33572]: E1204 22:40:58.354215 33572 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:58306->192.168.32.10:37473: write tcp 192.168.32.10:58306->192.168.32.10:37473: write: broken pipe Dec 04 22:41:19.596205 master-0 kubenswrapper[33572]: I1204 22:41:19.596091 33572 scope.go:117] "RemoveContainer" containerID="83ac0904d547478fbfb25940f7fe81c541748b3e02f60b2cc361e5ac58be87c9" Dec 04 22:41:19.643875 master-0 kubenswrapper[33572]: I1204 22:41:19.643816 33572 scope.go:117] "RemoveContainer" containerID="1866e45bcac4dc82b281fc6aca968c7db2fc493bc176cdc7e9b222b0dc038ad3" Dec 04 22:41:19.703071 master-0 kubenswrapper[33572]: I1204 22:41:19.702979 33572 scope.go:117] "RemoveContainer" containerID="5adde3abc3e3f18bf0d31605d95049d1715c87900158c6a85b651c0730015dca" Dec 04 22:41:19.750421 master-0 kubenswrapper[33572]: I1204 22:41:19.750301 33572 scope.go:117] "RemoveContainer" containerID="ea94b2e79bc02874fbf7839b0b47359fe9347e3d1012c0f8016aaeadd109d828" Dec 04 22:41:19.826529 master-0 kubenswrapper[33572]: I1204 22:41:19.826468 33572 scope.go:117] "RemoveContainer" containerID="4114a176b9e0073b647cd0952384b11b473ec0f42e83a423a441a422d5f0900b" Dec 04 22:42:20.149440 master-0 kubenswrapper[33572]: I1204 22:42:20.149325 33572 scope.go:117] "RemoveContainer" containerID="26486bfde804d1b1b956d33edd58535742b07ece5cbbbc5d538e1b304ab6e0e7" Dec 04 22:42:20.183823 master-0 kubenswrapper[33572]: I1204 22:42:20.183759 33572 scope.go:117] "RemoveContainer" containerID="bbc91fb05397d144a428c1a56753d26e44ebf29c7aa72fe800ec67af9832175a" Dec 04 22:44:16.074737 master-0 kubenswrapper[33572]: I1204 22:44:16.074667 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f80f-account-create-update-mtpsx"] Dec 04 22:44:16.098104 master-0 kubenswrapper[33572]: I1204 22:44:16.098032 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nn74s"] Dec 04 22:44:16.118076 master-0 kubenswrapper[33572]: I1204 22:44:16.118035 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f80f-account-create-update-mtpsx"] Dec 04 22:44:16.132389 master-0 kubenswrapper[33572]: I1204 22:44:16.132342 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nn74s"] Dec 04 22:44:16.542437 master-0 kubenswrapper[33572]: I1204 22:44:16.542374 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3418162d-92a3-429f-b73f-13e85aab7f44" path="/var/lib/kubelet/pods/3418162d-92a3-429f-b73f-13e85aab7f44/volumes" Dec 04 22:44:16.543222 master-0 kubenswrapper[33572]: I1204 22:44:16.543180 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9f3fad-681e-4cf1-8224-99ae62df7ad9" path="/var/lib/kubelet/pods/7d9f3fad-681e-4cf1-8224-99ae62df7ad9/volumes" Dec 04 22:44:20.289299 master-0 kubenswrapper[33572]: I1204 22:44:20.289197 33572 scope.go:117] "RemoveContainer" containerID="823485e52d9ee5d9d0786946d18b7c233caf1cf514e6f0afb6f9b0adfb248c1b" Dec 04 22:44:20.351080 master-0 kubenswrapper[33572]: I1204 22:44:20.351010 33572 scope.go:117] "RemoveContainer" containerID="c7d104dae7313232538706504d7586995262ea9fea00a7509ae95447fd126569" Dec 04 22:44:23.069705 master-0 kubenswrapper[33572]: I1204 22:44:23.069641 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4jgvt"] Dec 04 22:44:23.084262 master-0 
kubenswrapper[33572]: I1204 22:44:23.084202 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a06d-account-create-update-pk8fp"] Dec 04 22:44:23.098709 master-0 kubenswrapper[33572]: I1204 22:44:23.098643 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4n4f2"] Dec 04 22:44:23.113415 master-0 kubenswrapper[33572]: I1204 22:44:23.113361 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4jgvt"] Dec 04 22:44:23.127227 master-0 kubenswrapper[33572]: I1204 22:44:23.127153 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a06d-account-create-update-pk8fp"] Dec 04 22:44:23.136967 master-0 kubenswrapper[33572]: I1204 22:44:23.136899 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4n4f2"] Dec 04 22:44:23.149345 master-0 kubenswrapper[33572]: I1204 22:44:23.149289 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c742-account-create-update-22tqm"] Dec 04 22:44:23.167243 master-0 kubenswrapper[33572]: I1204 22:44:23.167183 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c742-account-create-update-22tqm"] Dec 04 22:44:24.553538 master-0 kubenswrapper[33572]: I1204 22:44:24.553398 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="036de423-2892-492c-a9d4-d38ea619f55c" path="/var/lib/kubelet/pods/036de423-2892-492c-a9d4-d38ea619f55c/volumes" Dec 04 22:44:24.554952 master-0 kubenswrapper[33572]: I1204 22:44:24.554884 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="793ae801-c2fd-4d81-82b1-b72e577668a4" path="/var/lib/kubelet/pods/793ae801-c2fd-4d81-82b1-b72e577668a4/volumes" Dec 04 22:44:24.556480 master-0 kubenswrapper[33572]: I1204 22:44:24.556414 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a95a2cfd-787d-4d6a-94b6-38938be00535" path="/var/lib/kubelet/pods/a95a2cfd-787d-4d6a-94b6-38938be00535/volumes" Dec 04 22:44:24.557865 master-0 kubenswrapper[33572]: I1204 22:44:24.557802 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b52dac30-a647-4928-a2c3-849bab534073" path="/var/lib/kubelet/pods/b52dac30-a647-4928-a2c3-849bab534073/volumes" Dec 04 22:44:42.070874 master-0 kubenswrapper[33572]: I1204 22:44:42.070738 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qbxvp"] Dec 04 22:44:42.089838 master-0 kubenswrapper[33572]: I1204 22:44:42.089741 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qbxvp"] Dec 04 22:44:42.541840 master-0 kubenswrapper[33572]: I1204 22:44:42.541785 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c75f01-7399-4114-96b7-9435a6ba089b" path="/var/lib/kubelet/pods/13c75f01-7399-4114-96b7-9435a6ba089b/volumes" Dec 04 22:44:49.068718 master-0 kubenswrapper[33572]: I1204 22:44:49.068603 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2ef1-account-create-update-nzfgp"] Dec 04 22:44:49.107858 master-0 kubenswrapper[33572]: I1204 22:44:49.107785 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kjf8b"] Dec 04 22:44:49.124184 master-0 kubenswrapper[33572]: I1204 22:44:49.124087 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-d6jvr"] Dec 04 22:44:49.147751 master-0 kubenswrapper[33572]: I1204 22:44:49.147625 33572 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/neutron-2ef1-account-create-update-nzfgp"] Dec 04 22:44:49.163211 master-0 kubenswrapper[33572]: I1204 22:44:49.163140 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-88ec-account-create-update-g22k5"] Dec 04 22:44:49.179577 master-0 kubenswrapper[33572]: I1204 22:44:49.179492 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kjf8b"] Dec 04 22:44:49.194986 master-0 kubenswrapper[33572]: I1204 22:44:49.194913 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-d6jvr"] Dec 04 22:44:49.207518 master-0 kubenswrapper[33572]: I1204 22:44:49.207412 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-88ec-account-create-update-g22k5"] Dec 04 22:44:50.545619 master-0 kubenswrapper[33572]: I1204 22:44:50.545551 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49" path="/var/lib/kubelet/pods/29e3bb4b-ddd7-4df1-8ad6-cf8fbded2f49/volumes" Dec 04 22:44:50.546386 master-0 kubenswrapper[33572]: I1204 22:44:50.546349 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c0fabb4-e989-4e6b-b5f1-27f23997389b" path="/var/lib/kubelet/pods/9c0fabb4-e989-4e6b-b5f1-27f23997389b/volumes" Dec 04 22:44:50.547163 master-0 kubenswrapper[33572]: I1204 22:44:50.547129 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f6a92e-1188-4a7d-94d8-0989924196a7" path="/var/lib/kubelet/pods/c4f6a92e-1188-4a7d-94d8-0989924196a7/volumes" Dec 04 22:44:50.547935 master-0 kubenswrapper[33572]: I1204 22:44:50.547872 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a41736-e271-4aef-a67f-77937bc3e446" path="/var/lib/kubelet/pods/d2a41736-e271-4aef-a67f-77937bc3e446/volumes" Dec 04 22:44:54.042150 master-0 kubenswrapper[33572]: I1204 22:44:54.042087 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-rln28"] Dec 04 22:44:54.052963 master-0 kubenswrapper[33572]: I1204 22:44:54.052898 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-rln28"] Dec 04 22:44:54.553880 master-0 kubenswrapper[33572]: I1204 22:44:54.553796 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff98c13-5ff9-4ffd-a779-e3121642011e" path="/var/lib/kubelet/pods/2ff98c13-5ff9-4ffd-a779-e3121642011e/volumes" Dec 04 22:45:00.070774 master-0 kubenswrapper[33572]: I1204 22:45:00.069878 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-ab6c-account-create-update-vs69m"] Dec 04 22:45:00.084664 master-0 kubenswrapper[33572]: I1204 22:45:00.084609 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-xd84n"] Dec 04 22:45:00.105289 master-0 kubenswrapper[33572]: I1204 22:45:00.105231 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-ab6c-account-create-update-vs69m"] Dec 04 22:45:00.126769 master-0 kubenswrapper[33572]: I1204 22:45:00.126647 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-xd84n"] Dec 04 22:45:00.157940 master-0 kubenswrapper[33572]: I1204 22:45:00.157812 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95"] Dec 04 22:45:00.160226 master-0 kubenswrapper[33572]: E1204 22:45:00.160200 33572 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="87297d8b-d0fb-4571-b239-648415089d10" containerName="extract-content" Dec 04 22:45:00.160226 master-0 kubenswrapper[33572]: I1204 22:45:00.160222 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="87297d8b-d0fb-4571-b239-648415089d10" containerName="extract-content" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: E1204 22:45:00.160245 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" containerName="registry-server" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: I1204 22:45:00.160251 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" containerName="registry-server" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: E1204 22:45:00.160258 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="extract-content" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: I1204 22:45:00.160264 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="extract-content" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: E1204 22:45:00.160286 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="registry-server" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: I1204 22:45:00.160294 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="registry-server" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: E1204 22:45:00.160309 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="extract-utilities" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: I1204 22:45:00.160314 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="extract-utilities" Dec 04 22:45:00.160330 master-0 kubenswrapper[33572]: E1204 22:45:00.160333 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="registry-server" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: I1204 22:45:00.160338 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="registry-server" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: E1204 22:45:00.160347 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" containerName="extract-utilities" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: I1204 22:45:00.160353 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" containerName="extract-utilities" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: E1204 22:45:00.160378 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87297d8b-d0fb-4571-b239-648415089d10" containerName="extract-utilities" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: I1204 22:45:00.160386 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="87297d8b-d0fb-4571-b239-648415089d10" containerName="extract-utilities" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: E1204 22:45:00.160400 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="extract-utilities" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: I1204 
22:45:00.160408 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="extract-utilities" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: E1204 22:45:00.160427 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87297d8b-d0fb-4571-b239-648415089d10" containerName="registry-server" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: I1204 22:45:00.160433 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="87297d8b-d0fb-4571-b239-648415089d10" containerName="registry-server" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: E1204 22:45:00.160464 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" containerName="extract-content" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: I1204 22:45:00.160470 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" containerName="extract-content" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: E1204 22:45:00.160480 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="extract-content" Dec 04 22:45:00.160606 master-0 kubenswrapper[33572]: I1204 22:45:00.160486 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="extract-content" Dec 04 22:45:00.163280 master-0 kubenswrapper[33572]: I1204 22:45:00.163245 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="f76deb1d-8ac8-4b6b-b7ed-03c4b7638635" containerName="registry-server" Dec 04 22:45:00.163280 master-0 kubenswrapper[33572]: I1204 22:45:00.163279 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="87297d8b-d0fb-4571-b239-648415089d10" containerName="registry-server" Dec 04 22:45:00.163394 master-0 kubenswrapper[33572]: I1204 22:45:00.163294 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed45cd7a-60da-483e-a419-ccadecd30061" containerName="registry-server" Dec 04 22:45:00.163394 master-0 kubenswrapper[33572]: I1204 22:45:00.163309 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="extract-content" Dec 04 22:45:00.163394 master-0 kubenswrapper[33572]: I1204 22:45:00.163341 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="extract-utilities" Dec 04 22:45:00.163394 master-0 kubenswrapper[33572]: I1204 22:45:00.163350 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="29828f55-427b-4fe3-8713-03bcd6ac9dec" containerName="registry-server" Dec 04 22:45:00.164347 master-0 kubenswrapper[33572]: I1204 22:45:00.164318 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.167022 master-0 kubenswrapper[33572]: I1204 22:45:00.166979 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 22:45:00.167256 master-0 kubenswrapper[33572]: I1204 22:45:00.167227 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-njflt" Dec 04 22:45:00.176571 master-0 kubenswrapper[33572]: I1204 22:45:00.174173 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95"] Dec 04 22:45:00.299616 master-0 kubenswrapper[33572]: I1204 22:45:00.299540 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7krqk\" (UniqueName: \"kubernetes.io/projected/3f4da918-3ef6-4aff-b2be-843b32b21453-kube-api-access-7krqk\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.299616 master-0 kubenswrapper[33572]: I1204 22:45:00.299608 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4da918-3ef6-4aff-b2be-843b32b21453-config-volume\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.300159 master-0 kubenswrapper[33572]: I1204 22:45:00.300115 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4da918-3ef6-4aff-b2be-843b32b21453-secret-volume\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.403469 master-0 kubenswrapper[33572]: I1204 22:45:00.402852 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7krqk\" (UniqueName: \"kubernetes.io/projected/3f4da918-3ef6-4aff-b2be-843b32b21453-kube-api-access-7krqk\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.403469 master-0 kubenswrapper[33572]: I1204 22:45:00.402985 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4da918-3ef6-4aff-b2be-843b32b21453-config-volume\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.403469 master-0 kubenswrapper[33572]: I1204 22:45:00.403408 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4da918-3ef6-4aff-b2be-843b32b21453-secret-volume\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.405005 master-0 kubenswrapper[33572]: I1204 22:45:00.404939 33572 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4da918-3ef6-4aff-b2be-843b32b21453-config-volume\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.407642 master-0 kubenswrapper[33572]: I1204 22:45:00.407591 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4da918-3ef6-4aff-b2be-843b32b21453-secret-volume\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.421662 master-0 kubenswrapper[33572]: I1204 22:45:00.421608 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7krqk\" (UniqueName: \"kubernetes.io/projected/3f4da918-3ef6-4aff-b2be-843b32b21453-kube-api-access-7krqk\") pod \"collect-profiles-29414805-jsb95\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.514492 master-0 kubenswrapper[33572]: I1204 22:45:00.514416 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:00.542783 master-0 kubenswrapper[33572]: I1204 22:45:00.542715 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a267444-303d-4eaf-98f5-01b11c7efe8f" path="/var/lib/kubelet/pods/2a267444-303d-4eaf-98f5-01b11c7efe8f/volumes" Dec 04 22:45:00.543809 master-0 kubenswrapper[33572]: I1204 22:45:00.543766 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8733e1a3-7b35-40cb-b129-536fe10961a8" path="/var/lib/kubelet/pods/8733e1a3-7b35-40cb-b129-536fe10961a8/volumes" Dec 04 22:45:01.063183 master-0 kubenswrapper[33572]: I1204 22:45:01.063127 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95"] Dec 04 22:45:01.788769 master-0 kubenswrapper[33572]: I1204 22:45:01.788658 33572 generic.go:334] "Generic (PLEG): container finished" podID="3f4da918-3ef6-4aff-b2be-843b32b21453" containerID="b4159c1d8950f153a4f36269b281bd8368f6c78338942be190a7c758a1c64136" exitCode=0 Dec 04 22:45:01.788769 master-0 kubenswrapper[33572]: I1204 22:45:01.788727 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" event={"ID":"3f4da918-3ef6-4aff-b2be-843b32b21453","Type":"ContainerDied","Data":"b4159c1d8950f153a4f36269b281bd8368f6c78338942be190a7c758a1c64136"} Dec 04 22:45:01.788769 master-0 kubenswrapper[33572]: I1204 22:45:01.788774 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" event={"ID":"3f4da918-3ef6-4aff-b2be-843b32b21453","Type":"ContainerStarted","Data":"f6f721cb02bb462e381429935a5099ca16022f48476ae6765a8f1917c2133419"} Dec 04 22:45:03.339307 master-0 kubenswrapper[33572]: I1204 22:45:03.339193 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:03.504093 master-0 kubenswrapper[33572]: I1204 22:45:03.503927 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4da918-3ef6-4aff-b2be-843b32b21453-config-volume\") pod \"3f4da918-3ef6-4aff-b2be-843b32b21453\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " Dec 04 22:45:03.505084 master-0 kubenswrapper[33572]: I1204 22:45:03.504319 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7krqk\" (UniqueName: \"kubernetes.io/projected/3f4da918-3ef6-4aff-b2be-843b32b21453-kube-api-access-7krqk\") pod \"3f4da918-3ef6-4aff-b2be-843b32b21453\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " Dec 04 22:45:03.505084 master-0 kubenswrapper[33572]: I1204 22:45:03.504380 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4da918-3ef6-4aff-b2be-843b32b21453-secret-volume\") pod \"3f4da918-3ef6-4aff-b2be-843b32b21453\" (UID: \"3f4da918-3ef6-4aff-b2be-843b32b21453\") " Dec 04 22:45:03.505084 master-0 kubenswrapper[33572]: I1204 22:45:03.504477 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4da918-3ef6-4aff-b2be-843b32b21453-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f4da918-3ef6-4aff-b2be-843b32b21453" (UID: "3f4da918-3ef6-4aff-b2be-843b32b21453"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 22:45:03.506154 master-0 kubenswrapper[33572]: I1204 22:45:03.505541 33572 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f4da918-3ef6-4aff-b2be-843b32b21453-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 22:45:03.509118 master-0 kubenswrapper[33572]: I1204 22:45:03.509045 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4da918-3ef6-4aff-b2be-843b32b21453-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f4da918-3ef6-4aff-b2be-843b32b21453" (UID: "3f4da918-3ef6-4aff-b2be-843b32b21453"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 22:45:03.522543 master-0 kubenswrapper[33572]: I1204 22:45:03.522344 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4da918-3ef6-4aff-b2be-843b32b21453-kube-api-access-7krqk" (OuterVolumeSpecName: "kube-api-access-7krqk") pod "3f4da918-3ef6-4aff-b2be-843b32b21453" (UID: "3f4da918-3ef6-4aff-b2be-843b32b21453"). InnerVolumeSpecName "kube-api-access-7krqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:45:03.607130 master-0 kubenswrapper[33572]: I1204 22:45:03.607076 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7krqk\" (UniqueName: \"kubernetes.io/projected/3f4da918-3ef6-4aff-b2be-843b32b21453-kube-api-access-7krqk\") on node \"master-0\" DevicePath \"\"" Dec 04 22:45:03.607130 master-0 kubenswrapper[33572]: I1204 22:45:03.607110 33572 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f4da918-3ef6-4aff-b2be-843b32b21453-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 22:45:03.822536 master-0 kubenswrapper[33572]: I1204 22:45:03.822449 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" event={"ID":"3f4da918-3ef6-4aff-b2be-843b32b21453","Type":"ContainerDied","Data":"f6f721cb02bb462e381429935a5099ca16022f48476ae6765a8f1917c2133419"} Dec 04 22:45:03.822536 master-0 kubenswrapper[33572]: I1204 22:45:03.822518 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6f721cb02bb462e381429935a5099ca16022f48476ae6765a8f1917c2133419" Dec 04 22:45:03.822842 master-0 kubenswrapper[33572]: I1204 22:45:03.822578 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414805-jsb95" Dec 04 22:45:03.931943 master-0 kubenswrapper[33572]: E1204 22:45:03.931875 33572 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f4da918_3ef6_4aff_b2be_843b32b21453.slice/crio-f6f721cb02bb462e381429935a5099ca16022f48476ae6765a8f1917c2133419\": RecentStats: unable to find data in memory cache]" Dec 04 22:45:04.472312 master-0 kubenswrapper[33572]: I1204 22:45:04.472226 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x"] Dec 04 22:45:04.492684 master-0 kubenswrapper[33572]: I1204 22:45:04.492599 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414760-r947x"] Dec 04 22:45:04.547305 master-0 kubenswrapper[33572]: I1204 22:45:04.547219 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da6da420-9631-4bce-b238-96ab361e23e9" path="/var/lib/kubelet/pods/da6da420-9631-4bce-b238-96ab361e23e9/volumes" Dec 04 22:45:18.050562 master-0 kubenswrapper[33572]: I1204 22:45:18.050493 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ms5vx"] Dec 04 22:45:18.069634 master-0 kubenswrapper[33572]: I1204 22:45:18.069556 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ms5vx"] Dec 04 22:45:18.544459 master-0 kubenswrapper[33572]: I1204 22:45:18.544359 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cdb5e15-32e4-4257-be61-ec0ba1a6884e" path="/var/lib/kubelet/pods/8cdb5e15-32e4-4257-be61-ec0ba1a6884e/volumes" Dec 04 22:45:20.487985 master-0 kubenswrapper[33572]: I1204 22:45:20.487743 33572 scope.go:117] "RemoveContainer" containerID="4431ddd9fe3466ed91a5948ba60b94aaa2fa445da95a9fcfbb56879b9423eb60" Dec 04 22:45:20.534564 master-0 kubenswrapper[33572]: I1204 22:45:20.534239 33572 scope.go:117] "RemoveContainer" 
containerID="930a9c223992f2d6a73d76abfb2e99eb730dd3718df76150ecd3d064b4b416c7" Dec 04 22:45:20.623448 master-0 kubenswrapper[33572]: I1204 22:45:20.623392 33572 scope.go:117] "RemoveContainer" containerID="abdd217b047e5958a981618bd0d5cac76014fb5b033a88a73333a5f1d1e58c17" Dec 04 22:45:20.666569 master-0 kubenswrapper[33572]: I1204 22:45:20.666427 33572 scope.go:117] "RemoveContainer" containerID="f6abd516c5fea9b5bd341b0a7d1e2139fc949d8d6089d7ff3cdb7a518d0c2899" Dec 04 22:45:20.729549 master-0 kubenswrapper[33572]: I1204 22:45:20.729478 33572 scope.go:117] "RemoveContainer" containerID="03618bfdca10cd63f0ba0d81558c82a816166ed5c737eaa60242022f8268fd49" Dec 04 22:45:20.755396 master-0 kubenswrapper[33572]: I1204 22:45:20.755335 33572 scope.go:117] "RemoveContainer" containerID="ade358974030732d6f6c32378d336f49dc4370ee70236beddd36048f301cbb18" Dec 04 22:45:20.814896 master-0 kubenswrapper[33572]: I1204 22:45:20.814841 33572 scope.go:117] "RemoveContainer" containerID="6e66a7b6018782d15f4a77164d01c739390756258f36da8cfc42f73c273c24bf" Dec 04 22:45:20.843904 master-0 kubenswrapper[33572]: I1204 22:45:20.843867 33572 scope.go:117] "RemoveContainer" containerID="d76d7f47dc8ce011a7120baf31993be5c69c3cdf49281070022020d1a3fd0584" Dec 04 22:45:21.101211 master-0 kubenswrapper[33572]: I1204 22:45:21.100941 33572 scope.go:117] "RemoveContainer" containerID="262ca22dfdb092ef67da696554009a9d233510bdc1ab8f72d9b6f18e697b954e" Dec 04 22:45:21.136646 master-0 kubenswrapper[33572]: I1204 22:45:21.136581 33572 scope.go:117] "RemoveContainer" containerID="d689486f9be58a6935f936459fe33002b1b01666be0772cbeb95fabb72de0703" Dec 04 22:45:21.170090 master-0 kubenswrapper[33572]: I1204 22:45:21.170033 33572 scope.go:117] "RemoveContainer" containerID="d281c21cd4e6a5bdaa904d1cec96c25970563004aa9943074804eafd85bd5f1e" Dec 04 22:45:21.201579 master-0 kubenswrapper[33572]: I1204 22:45:21.201523 33572 scope.go:117] "RemoveContainer" containerID="42139ff7bbf274e84e342d108accf059c395a55f884268c0d278fd9ac303b7e9" Dec 04 22:45:21.238639 master-0 kubenswrapper[33572]: I1204 22:45:21.238538 33572 scope.go:117] "RemoveContainer" containerID="0d636103e4ae963e2e654a6eae6dddd7b9985787031941a9ef697cd6da609156" Dec 04 22:45:21.308224 master-0 kubenswrapper[33572]: I1204 22:45:21.308087 33572 scope.go:117] "RemoveContainer" containerID="b3650a3e3b1126254c5989162d4092a91d06a2dd1e654579dae08edb3440de9c" Dec 04 22:45:24.099356 master-0 kubenswrapper[33572]: I1204 22:45:24.099276 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-l59gq"] Dec 04 22:45:24.115409 master-0 kubenswrapper[33572]: I1204 22:45:24.115322 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-l59gq"] Dec 04 22:45:24.548920 master-0 kubenswrapper[33572]: I1204 22:45:24.548827 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d5cab4-61b9-4503-8e64-09f107844457" path="/var/lib/kubelet/pods/58d5cab4-61b9-4503-8e64-09f107844457/volumes" Dec 04 22:45:27.030110 master-0 kubenswrapper[33572]: I1204 22:45:27.030052 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-lxrlk"] Dec 04 22:45:27.043281 master-0 kubenswrapper[33572]: I1204 22:45:27.043188 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-lxrlk"] Dec 04 22:45:28.556647 master-0 kubenswrapper[33572]: I1204 22:45:28.556484 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2609a439-7ac0-4253-a74e-e4c90a023832" 
path="/var/lib/kubelet/pods/2609a439-7ac0-4253-a74e-e4c90a023832/volumes" Dec 04 22:45:30.070351 master-0 kubenswrapper[33572]: I1204 22:45:30.070275 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7675d-db-sync-d9l4w"] Dec 04 22:45:30.086534 master-0 kubenswrapper[33572]: I1204 22:45:30.086458 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7675d-db-sync-d9l4w"] Dec 04 22:45:30.551881 master-0 kubenswrapper[33572]: I1204 22:45:30.551820 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c952fb4f-1a86-4eab-9c4e-4b046b8d81b2" path="/var/lib/kubelet/pods/c952fb4f-1a86-4eab-9c4e-4b046b8d81b2/volumes" Dec 04 22:45:36.050776 master-0 kubenswrapper[33572]: I1204 22:45:36.050683 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-j89sq"] Dec 04 22:45:36.069653 master-0 kubenswrapper[33572]: I1204 22:45:36.069574 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-j89sq"] Dec 04 22:45:36.547847 master-0 kubenswrapper[33572]: I1204 22:45:36.547669 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320dd132-b10a-4d56-88a4-307ecb61196f" path="/var/lib/kubelet/pods/320dd132-b10a-4d56-88a4-307ecb61196f/volumes" Dec 04 22:45:42.048997 master-0 kubenswrapper[33572]: I1204 22:45:42.048891 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-vp6mx"] Dec 04 22:45:42.058073 master-0 kubenswrapper[33572]: I1204 22:45:42.058013 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-vp6mx"] Dec 04 22:45:42.547049 master-0 kubenswrapper[33572]: I1204 22:45:42.546988 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbdff964-9c49-48a4-a0f8-7044131a5627" path="/var/lib/kubelet/pods/fbdff964-9c49-48a4-a0f8-7044131a5627/volumes" Dec 04 22:45:43.064664 master-0 kubenswrapper[33572]: I1204 22:45:43.064604 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-ab22-account-create-update-pqwpd"] Dec 04 22:45:43.085485 master-0 kubenswrapper[33572]: I1204 22:45:43.085421 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-ab22-account-create-update-pqwpd"] Dec 04 22:45:44.543770 master-0 kubenswrapper[33572]: I1204 22:45:44.543686 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee3b31f-b225-485b-9675-70162abf39f2" path="/var/lib/kubelet/pods/cee3b31f-b225-485b-9675-70162abf39f2/volumes" Dec 04 22:46:02.255931 master-0 kubenswrapper[33572]: I1204 22:46:02.255772 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-sync-nc4qg"] Dec 04 22:46:02.271013 master-0 kubenswrapper[33572]: I1204 22:46:02.270939 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-sync-nc4qg"] Dec 04 22:46:02.550001 master-0 kubenswrapper[33572]: I1204 22:46:02.549905 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415ef8f6-9405-4807-8661-ca51383aa454" path="/var/lib/kubelet/pods/415ef8f6-9405-4807-8661-ca51383aa454/volumes" Dec 04 22:46:21.468533 master-0 kubenswrapper[33572]: I1204 22:46:21.467680 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-23b3-account-create-update-kzbkj"] Dec 04 22:46:21.490853 master-0 kubenswrapper[33572]: I1204 22:46:21.490780 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-23b3-account-create-update-kzbkj"] Dec 04 22:46:21.682168 master-0 kubenswrapper[33572]: I1204 22:46:21.682113 33572 scope.go:117] "RemoveContainer" containerID="e0f150fa83bea57a5a2417465b010b1a258fc7779fd43e1dec7600654531dc50" Dec 04 22:46:21.708434 master-0 kubenswrapper[33572]: I1204 22:46:21.708378 33572 scope.go:117] "RemoveContainer" containerID="84422d6e37213b635208a37b386a19d499b708db49e9be10ed48145dcfa136bd" Dec 04 22:46:21.801311 master-0 kubenswrapper[33572]: I1204 22:46:21.801232 33572 scope.go:117] "RemoveContainer" containerID="5ee780f89386259b20927daa58537ef5615e819e02fafc58790a9eea8e5f1738" Dec 04 22:46:21.861933 master-0 kubenswrapper[33572]: I1204 22:46:21.861855 33572 scope.go:117] "RemoveContainer" containerID="f348291203168a853e748fd118c52a1a8854a4b78ba57f8f0ea43f0ebaae2bbc" Dec 04 22:46:21.911390 master-0 kubenswrapper[33572]: I1204 22:46:21.911298 33572 scope.go:117] "RemoveContainer" containerID="4027c31e26a3a9c5fe6bcba8d9a071357e11963e4c240501d1c05241a7ed24bd" Dec 04 22:46:21.993266 master-0 kubenswrapper[33572]: I1204 22:46:21.993197 33572 scope.go:117] "RemoveContainer" containerID="f5642154d0543171958cf65f3dc2a6a8cd75c02eb1c3577bddc47c91c3995270" Dec 04 22:46:22.043864 master-0 kubenswrapper[33572]: I1204 22:46:22.043810 33572 scope.go:117] "RemoveContainer" containerID="849da94721ea8e63b0b4991267e31331a5a838d1865e205d04bfa2710bd2b223" Dec 04 22:46:22.095381 master-0 kubenswrapper[33572]: I1204 22:46:22.089619 33572 scope.go:117] "RemoveContainer" containerID="5e21fdbe67e53c4bf0d5a117687c3388a3fb0445e6b388899942d8529fd65c82" Dec 04 22:46:22.541549 master-0 kubenswrapper[33572]: I1204 22:46:22.541445 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d81f44b5-895d-417b-aff2-1d54081061f9" path="/var/lib/kubelet/pods/d81f44b5-895d-417b-aff2-1d54081061f9/volumes" Dec 04 22:46:23.047729 master-0 kubenswrapper[33572]: I1204 22:46:23.047647 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-j7k4l"] Dec 04 22:46:23.070710 master-0 kubenswrapper[33572]: I1204 22:46:23.070636 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-trw6l"] Dec 04 22:46:23.084460 master-0 kubenswrapper[33572]: I1204 22:46:23.084379 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-80fd-account-create-update-4cjck"] Dec 04 22:46:23.097670 master-0 kubenswrapper[33572]: I1204 22:46:23.097618 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2433-account-create-update-cdgnq"] Dec 04 22:46:23.109918 master-0 kubenswrapper[33572]: I1204 22:46:23.109845 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-2q2sn"] Dec 04 22:46:23.119516 master-0 kubenswrapper[33572]: I1204 22:46:23.119444 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-80fd-account-create-update-4cjck"] Dec 04 22:46:23.133099 master-0 kubenswrapper[33572]: I1204 22:46:23.132997 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-j7k4l"] Dec 04 22:46:23.143269 master-0 kubenswrapper[33572]: I1204 22:46:23.143221 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2433-account-create-update-cdgnq"] Dec 04 22:46:23.155681 master-0 kubenswrapper[33572]: I1204 22:46:23.155632 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-trw6l"] Dec 04 22:46:23.165286 master-0 
kubenswrapper[33572]: I1204 22:46:23.165221 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-2q2sn"] Dec 04 22:46:24.546656 master-0 kubenswrapper[33572]: I1204 22:46:24.546587 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2306e236-149a-4214-8600-218585ace100" path="/var/lib/kubelet/pods/2306e236-149a-4214-8600-218585ace100/volumes" Dec 04 22:46:24.547480 master-0 kubenswrapper[33572]: I1204 22:46:24.547372 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b" path="/var/lib/kubelet/pods/2886ee55-a1dc-40b7-a0f6-fdfa59da4d7b/volumes" Dec 04 22:46:24.548225 master-0 kubenswrapper[33572]: I1204 22:46:24.548177 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67167b90-f405-4945-a970-7c5d1a5dcef7" path="/var/lib/kubelet/pods/67167b90-f405-4945-a970-7c5d1a5dcef7/volumes" Dec 04 22:46:24.549071 master-0 kubenswrapper[33572]: I1204 22:46:24.549034 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9072a8-1d23-4e01-b480-6aa621ae89a3" path="/var/lib/kubelet/pods/cf9072a8-1d23-4e01-b480-6aa621ae89a3/volumes" Dec 04 22:46:24.550738 master-0 kubenswrapper[33572]: I1204 22:46:24.550689 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3" path="/var/lib/kubelet/pods/d41d0d7d-d412-404c-9bb0-4f34e1ab5ab3/volumes" Dec 04 22:46:56.111912 master-0 kubenswrapper[33572]: I1204 22:46:56.111774 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k62b9"] Dec 04 22:46:56.131800 master-0 kubenswrapper[33572]: I1204 22:46:56.131716 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k62b9"] Dec 04 22:46:56.542441 master-0 kubenswrapper[33572]: I1204 22:46:56.542388 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26cb867-ec4f-4c99-8185-d397e453bb90" path="/var/lib/kubelet/pods/f26cb867-ec4f-4c99-8185-d397e453bb90/volumes" Dec 04 22:47:19.114844 master-0 kubenswrapper[33572]: I1204 22:47:19.114721 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-96wfh"] Dec 04 22:47:19.150101 master-0 kubenswrapper[33572]: I1204 22:47:19.147404 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8rh5h"] Dec 04 22:47:19.172746 master-0 kubenswrapper[33572]: I1204 22:47:19.172667 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-96wfh"] Dec 04 22:47:19.188297 master-0 kubenswrapper[33572]: I1204 22:47:19.188232 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8rh5h"] Dec 04 22:47:20.547112 master-0 kubenswrapper[33572]: I1204 22:47:20.546991 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e68458-b43a-471d-8d30-bfe010b365f3" path="/var/lib/kubelet/pods/26e68458-b43a-471d-8d30-bfe010b365f3/volumes" Dec 04 22:47:20.549931 master-0 kubenswrapper[33572]: I1204 22:47:20.549860 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384308f7-6942-4013-9bcd-e56fde1ab09f" path="/var/lib/kubelet/pods/384308f7-6942-4013-9bcd-e56fde1ab09f/volumes" Dec 04 22:47:22.321569 master-0 kubenswrapper[33572]: I1204 22:47:22.321432 33572 scope.go:117] "RemoveContainer" containerID="8c5dbec18cb7e4add3367a1efac5162eeeaed9f7beb683bd63cda3b8fa6aecf5" Dec 04 
22:47:22.379800 master-0 kubenswrapper[33572]: I1204 22:47:22.379642 33572 scope.go:117] "RemoveContainer" containerID="082c871709438f1274362858f45d52463bb8b43b9ce1cf0e394b3b9521eecd14" Dec 04 22:47:22.403011 master-0 kubenswrapper[33572]: I1204 22:47:22.402957 33572 scope.go:117] "RemoveContainer" containerID="242c47f45e03b07771899700798be9deba66cf0c07123c21cad3994123bc0422" Dec 04 22:47:22.481814 master-0 kubenswrapper[33572]: I1204 22:47:22.481733 33572 scope.go:117] "RemoveContainer" containerID="a5e1d5e6943df0bb64fdeb000117bedd6ffce96fec86ae5ecd8e66985ddd7d16" Dec 04 22:47:22.559756 master-0 kubenswrapper[33572]: I1204 22:47:22.559709 33572 scope.go:117] "RemoveContainer" containerID="5e0a765d17be796298bfbad89487ca763f9600f55ffb5dc9d673752542258cf2" Dec 04 22:47:22.635287 master-0 kubenswrapper[33572]: I1204 22:47:22.635242 33572 scope.go:117] "RemoveContainer" containerID="513d79bb156bbe5c2d9e61efcc644f4744c5bad082e30b7c689c2a3acfd9c94c" Dec 04 22:47:22.663722 master-0 kubenswrapper[33572]: I1204 22:47:22.663663 33572 scope.go:117] "RemoveContainer" containerID="4c5601a6dd584d847db5e6296f0eaa5934906f136443aa63338b0d8679ac80c3" Dec 04 22:47:22.697534 master-0 kubenswrapper[33572]: I1204 22:47:22.697455 33572 scope.go:117] "RemoveContainer" containerID="342ab6428cfe3389305d6d9a5b6e25caf9b329efe3f47477ce696e2be0645bd3" Dec 04 22:47:22.744964 master-0 kubenswrapper[33572]: I1204 22:47:22.744914 33572 scope.go:117] "RemoveContainer" containerID="5ba2aa41593fdbd3e0b2385aa3a69c480fdbfe1b4a47b7d8332aabb99b886942" Dec 04 22:47:56.076697 master-0 kubenswrapper[33572]: I1204 22:47:56.076571 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-j8m8x"] Dec 04 22:47:56.097156 master-0 kubenswrapper[33572]: I1204 22:47:56.096945 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-j8m8x"] Dec 04 22:47:56.552760 master-0 kubenswrapper[33572]: I1204 22:47:56.552694 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e36abc43-b5d9-41be-b4f3-c90c72b43065" path="/var/lib/kubelet/pods/e36abc43-b5d9-41be-b4f3-c90c72b43065/volumes" Dec 04 22:47:59.054971 master-0 kubenswrapper[33572]: I1204 22:47:59.054883 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ks44t"] Dec 04 22:47:59.074480 master-0 kubenswrapper[33572]: I1204 22:47:59.074379 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ks44t"] Dec 04 22:48:00.543397 master-0 kubenswrapper[33572]: I1204 22:48:00.543319 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe4a826-dcad-4726-9545-a2f208036313" path="/var/lib/kubelet/pods/6fe4a826-dcad-4726-9545-a2f208036313/volumes" Dec 04 22:48:22.995194 master-0 kubenswrapper[33572]: I1204 22:48:22.994967 33572 scope.go:117] "RemoveContainer" containerID="e1e47bf8f98f63f3437bfb5a44c506f051d1792be8fc6c07e683ee32ef14498f" Dec 04 22:48:23.047881 master-0 kubenswrapper[33572]: I1204 22:48:23.047797 33572 scope.go:117] "RemoveContainer" containerID="edc92f41f7b8ffc4163eafadb32c5be365030d626372bdf7dd6922f06d933a18" Dec 04 22:51:11.893889 master-0 kubenswrapper[33572]: I1204 22:51:11.893801 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fgqms"] Dec 04 22:51:11.895219 master-0 kubenswrapper[33572]: E1204 22:51:11.894374 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4da918-3ef6-4aff-b2be-843b32b21453" 
containerName="collect-profiles" Dec 04 22:51:11.895219 master-0 kubenswrapper[33572]: I1204 22:51:11.894388 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4da918-3ef6-4aff-b2be-843b32b21453" containerName="collect-profiles" Dec 04 22:51:11.895219 master-0 kubenswrapper[33572]: I1204 22:51:11.894649 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4da918-3ef6-4aff-b2be-843b32b21453" containerName="collect-profiles" Dec 04 22:51:11.896298 master-0 kubenswrapper[33572]: I1204 22:51:11.896254 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:11.904534 master-0 kubenswrapper[33572]: I1204 22:51:11.904465 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgqms"] Dec 04 22:51:12.004854 master-0 kubenswrapper[33572]: I1204 22:51:12.004777 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-859d7\" (UniqueName: \"kubernetes.io/projected/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-kube-api-access-859d7\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.005090 master-0 kubenswrapper[33572]: I1204 22:51:12.004903 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-utilities\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.005090 master-0 kubenswrapper[33572]: I1204 22:51:12.005057 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-catalog-content\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.106299 master-0 kubenswrapper[33572]: I1204 22:51:12.106233 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-859d7\" (UniqueName: \"kubernetes.io/projected/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-kube-api-access-859d7\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.106299 master-0 kubenswrapper[33572]: I1204 22:51:12.106311 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-utilities\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.106674 master-0 kubenswrapper[33572]: I1204 22:51:12.106415 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-catalog-content\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.107095 master-0 kubenswrapper[33572]: I1204 22:51:12.107053 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-catalog-content\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.107095 master-0 kubenswrapper[33572]: I1204 22:51:12.107081 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-utilities\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.124179 master-0 kubenswrapper[33572]: I1204 22:51:12.124114 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-859d7\" (UniqueName: \"kubernetes.io/projected/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-kube-api-access-859d7\") pod \"community-operators-fgqms\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.264122 master-0 kubenswrapper[33572]: I1204 22:51:12.263932 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:12.919814 master-0 kubenswrapper[33572]: W1204 22:51:12.919748 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fc8c55b_e5a3_4df7_8fb9_6abd8d8eb132.slice/crio-c01c9eb3df4943da26a28c7850299361a296ae5f55274c575d92faa4c316258c WatchSource:0}: Error finding container c01c9eb3df4943da26a28c7850299361a296ae5f55274c575d92faa4c316258c: Status 404 returned error can't find the container with id c01c9eb3df4943da26a28c7850299361a296ae5f55274c575d92faa4c316258c Dec 04 22:51:12.925376 master-0 kubenswrapper[33572]: I1204 22:51:12.925154 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fgqms"] Dec 04 22:51:13.662410 master-0 kubenswrapper[33572]: I1204 22:51:13.662267 33572 generic.go:334] "Generic (PLEG): container finished" podID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerID="49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317" exitCode=0 Dec 04 22:51:13.662795 master-0 kubenswrapper[33572]: I1204 22:51:13.662450 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgqms" event={"ID":"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132","Type":"ContainerDied","Data":"49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317"} Dec 04 22:51:13.662795 master-0 kubenswrapper[33572]: I1204 22:51:13.662555 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgqms" event={"ID":"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132","Type":"ContainerStarted","Data":"c01c9eb3df4943da26a28c7850299361a296ae5f55274c575d92faa4c316258c"} Dec 04 22:51:13.665651 master-0 kubenswrapper[33572]: I1204 22:51:13.665622 33572 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 22:51:14.690369 master-0 kubenswrapper[33572]: I1204 22:51:14.690298 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgqms" event={"ID":"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132","Type":"ContainerStarted","Data":"4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3"} Dec 04 22:51:15.708382 master-0 kubenswrapper[33572]: I1204 22:51:15.708321 33572 generic.go:334] "Generic 
(PLEG): container finished" podID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerID="4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3" exitCode=0 Dec 04 22:51:15.708382 master-0 kubenswrapper[33572]: I1204 22:51:15.708374 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgqms" event={"ID":"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132","Type":"ContainerDied","Data":"4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3"} Dec 04 22:51:16.728498 master-0 kubenswrapper[33572]: I1204 22:51:16.728411 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgqms" event={"ID":"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132","Type":"ContainerStarted","Data":"a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd"} Dec 04 22:51:16.764004 master-0 kubenswrapper[33572]: I1204 22:51:16.763924 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fgqms" podStartSLOduration=3.164046528 podStartE2EDuration="5.763902442s" podCreationTimestamp="2025-12-04 22:51:11 +0000 UTC" firstStartedPulling="2025-12-04 22:51:13.665402954 +0000 UTC m=+1937.392928643" lastFinishedPulling="2025-12-04 22:51:16.265258898 +0000 UTC m=+1939.992784557" observedRunningTime="2025-12-04 22:51:16.758086143 +0000 UTC m=+1940.485611862" watchObservedRunningTime="2025-12-04 22:51:16.763902442 +0000 UTC m=+1940.491428081" Dec 04 22:51:22.264823 master-0 kubenswrapper[33572]: I1204 22:51:22.264587 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:22.264823 master-0 kubenswrapper[33572]: I1204 22:51:22.264696 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:22.330210 master-0 kubenswrapper[33572]: I1204 22:51:22.330135 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:23.036325 master-0 kubenswrapper[33572]: I1204 22:51:23.036231 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:23.561021 master-0 kubenswrapper[33572]: I1204 22:51:23.560945 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fgqms"] Dec 04 22:51:24.983781 master-0 kubenswrapper[33572]: I1204 22:51:24.983705 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fgqms" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerName="registry-server" containerID="cri-o://a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd" gracePeriod=2 Dec 04 22:51:25.800967 master-0 kubenswrapper[33572]: I1204 22:51:25.800857 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:25.930647 master-0 kubenswrapper[33572]: I1204 22:51:25.930543 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-catalog-content\") pod \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " Dec 04 22:51:25.930901 master-0 kubenswrapper[33572]: I1204 22:51:25.930798 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-859d7\" (UniqueName: \"kubernetes.io/projected/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-kube-api-access-859d7\") pod \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " Dec 04 22:51:25.930998 master-0 kubenswrapper[33572]: I1204 22:51:25.930944 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-utilities\") pod \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\" (UID: \"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132\") " Dec 04 22:51:25.933951 master-0 kubenswrapper[33572]: I1204 22:51:25.933839 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-utilities" (OuterVolumeSpecName: "utilities") pod "3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" (UID: "3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:51:25.936409 master-0 kubenswrapper[33572]: I1204 22:51:25.936340 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-kube-api-access-859d7" (OuterVolumeSpecName: "kube-api-access-859d7") pod "3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" (UID: "3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132"). InnerVolumeSpecName "kube-api-access-859d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:51:25.995815 master-0 kubenswrapper[33572]: I1204 22:51:25.995752 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" (UID: "3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:51:25.998641 master-0 kubenswrapper[33572]: I1204 22:51:25.998616 33572 generic.go:334] "Generic (PLEG): container finished" podID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerID="a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd" exitCode=0 Dec 04 22:51:25.998769 master-0 kubenswrapper[33572]: I1204 22:51:25.998669 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgqms" event={"ID":"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132","Type":"ContainerDied","Data":"a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd"} Dec 04 22:51:25.998903 master-0 kubenswrapper[33572]: I1204 22:51:25.998883 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fgqms" event={"ID":"3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132","Type":"ContainerDied","Data":"c01c9eb3df4943da26a28c7850299361a296ae5f55274c575d92faa4c316258c"} Dec 04 22:51:25.998991 master-0 kubenswrapper[33572]: I1204 22:51:25.998711 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fgqms" Dec 04 22:51:25.999165 master-0 kubenswrapper[33572]: I1204 22:51:25.998915 33572 scope.go:117] "RemoveContainer" containerID="a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd" Dec 04 22:51:26.034486 master-0 kubenswrapper[33572]: I1204 22:51:26.034418 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-859d7\" (UniqueName: \"kubernetes.io/projected/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-kube-api-access-859d7\") on node \"master-0\" DevicePath \"\"" Dec 04 22:51:26.034486 master-0 kubenswrapper[33572]: I1204 22:51:26.034472 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:51:26.034486 master-0 kubenswrapper[33572]: I1204 22:51:26.034492 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:51:26.051899 master-0 kubenswrapper[33572]: I1204 22:51:26.051851 33572 scope.go:117] "RemoveContainer" containerID="4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3" Dec 04 22:51:26.063341 master-0 kubenswrapper[33572]: I1204 22:51:26.063286 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fgqms"] Dec 04 22:51:26.077919 master-0 kubenswrapper[33572]: I1204 22:51:26.077847 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fgqms"] Dec 04 22:51:26.081043 master-0 kubenswrapper[33572]: I1204 22:51:26.081011 33572 scope.go:117] "RemoveContainer" containerID="49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317" Dec 04 22:51:26.124057 master-0 kubenswrapper[33572]: I1204 22:51:26.123998 33572 scope.go:117] "RemoveContainer" containerID="a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd" Dec 04 22:51:26.128413 master-0 kubenswrapper[33572]: E1204 22:51:26.128324 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd\": container with ID starting with 
a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd not found: ID does not exist" containerID="a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd" Dec 04 22:51:26.128566 master-0 kubenswrapper[33572]: I1204 22:51:26.128429 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd"} err="failed to get container status \"a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd\": rpc error: code = NotFound desc = could not find container \"a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd\": container with ID starting with a63b5db97c43e70195a2b6f5f37937425a2c5b4215313cbd80ade18a8221d2bd not found: ID does not exist" Dec 04 22:51:26.128566 master-0 kubenswrapper[33572]: I1204 22:51:26.128488 33572 scope.go:117] "RemoveContainer" containerID="4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3" Dec 04 22:51:26.129284 master-0 kubenswrapper[33572]: E1204 22:51:26.129244 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3\": container with ID starting with 4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3 not found: ID does not exist" containerID="4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3" Dec 04 22:51:26.129357 master-0 kubenswrapper[33572]: I1204 22:51:26.129312 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3"} err="failed to get container status \"4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3\": rpc error: code = NotFound desc = could not find container \"4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3\": container with ID starting with 4d37ec7713ef61cc1debde31356f4d8ffe722989d2148a5de09c2d9dcd9269c3 not found: ID does not exist" Dec 04 22:51:26.129409 master-0 kubenswrapper[33572]: I1204 22:51:26.129349 33572 scope.go:117] "RemoveContainer" containerID="49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317" Dec 04 22:51:26.129864 master-0 kubenswrapper[33572]: E1204 22:51:26.129778 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317\": container with ID starting with 49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317 not found: ID does not exist" containerID="49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317" Dec 04 22:51:26.129864 master-0 kubenswrapper[33572]: I1204 22:51:26.129840 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317"} err="failed to get container status \"49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317\": rpc error: code = NotFound desc = could not find container \"49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317\": container with ID starting with 49e80c9593984bbea6442488a348e5af28013669246d476bfa00f976c3518317 not found: ID does not exist" Dec 04 22:51:26.551799 master-0 kubenswrapper[33572]: I1204 22:51:26.551696 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" 
path="/var/lib/kubelet/pods/3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132/volumes" Dec 04 22:51:33.208725 master-0 kubenswrapper[33572]: I1204 22:51:33.205911 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qvhw4"] Dec 04 22:51:33.208725 master-0 kubenswrapper[33572]: E1204 22:51:33.206558 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerName="extract-utilities" Dec 04 22:51:33.208725 master-0 kubenswrapper[33572]: I1204 22:51:33.206576 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerName="extract-utilities" Dec 04 22:51:33.208725 master-0 kubenswrapper[33572]: E1204 22:51:33.206645 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerName="registry-server" Dec 04 22:51:33.208725 master-0 kubenswrapper[33572]: I1204 22:51:33.206654 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerName="registry-server" Dec 04 22:51:33.208725 master-0 kubenswrapper[33572]: E1204 22:51:33.206695 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerName="extract-content" Dec 04 22:51:33.208725 master-0 kubenswrapper[33572]: I1204 22:51:33.206704 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerName="extract-content" Dec 04 22:51:33.208725 master-0 kubenswrapper[33572]: I1204 22:51:33.207031 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fc8c55b-e5a3-4df7-8fb9-6abd8d8eb132" containerName="registry-server" Dec 04 22:51:33.214535 master-0 kubenswrapper[33572]: I1204 22:51:33.211098 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.223545 master-0 kubenswrapper[33572]: I1204 22:51:33.223384 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvhw4"] Dec 04 22:51:33.341261 master-0 kubenswrapper[33572]: I1204 22:51:33.341151 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rp2l\" (UniqueName: \"kubernetes.io/projected/88842dc2-39b2-4c06-bf88-2120de259ac7-kube-api-access-2rp2l\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.341261 master-0 kubenswrapper[33572]: I1204 22:51:33.341259 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-catalog-content\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.341593 master-0 kubenswrapper[33572]: I1204 22:51:33.341444 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-utilities\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.443606 master-0 kubenswrapper[33572]: I1204 22:51:33.443357 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rp2l\" (UniqueName: \"kubernetes.io/projected/88842dc2-39b2-4c06-bf88-2120de259ac7-kube-api-access-2rp2l\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.443606 master-0 kubenswrapper[33572]: I1204 22:51:33.443425 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-catalog-content\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.443606 master-0 kubenswrapper[33572]: I1204 22:51:33.443535 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-utilities\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.444662 master-0 kubenswrapper[33572]: I1204 22:51:33.444282 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-utilities\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.448613 master-0 kubenswrapper[33572]: I1204 22:51:33.444830 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-catalog-content\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " 
pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.472647 master-0 kubenswrapper[33572]: I1204 22:51:33.469057 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rp2l\" (UniqueName: \"kubernetes.io/projected/88842dc2-39b2-4c06-bf88-2120de259ac7-kube-api-access-2rp2l\") pod \"redhat-marketplace-qvhw4\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:33.558897 master-0 kubenswrapper[33572]: I1204 22:51:33.558833 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:34.114140 master-0 kubenswrapper[33572]: I1204 22:51:34.114040 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvhw4"] Dec 04 22:51:35.137543 master-0 kubenswrapper[33572]: I1204 22:51:35.137468 33572 generic.go:334] "Generic (PLEG): container finished" podID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerID="2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc" exitCode=0 Dec 04 22:51:35.138261 master-0 kubenswrapper[33572]: I1204 22:51:35.137546 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvhw4" event={"ID":"88842dc2-39b2-4c06-bf88-2120de259ac7","Type":"ContainerDied","Data":"2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc"} Dec 04 22:51:35.138261 master-0 kubenswrapper[33572]: I1204 22:51:35.137611 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvhw4" event={"ID":"88842dc2-39b2-4c06-bf88-2120de259ac7","Type":"ContainerStarted","Data":"d8d11d68a766d9a5ae1823275e7fa968b81b5b2b586f909f47e87b56384826fb"} Dec 04 22:51:36.152318 master-0 kubenswrapper[33572]: I1204 22:51:36.152225 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvhw4" event={"ID":"88842dc2-39b2-4c06-bf88-2120de259ac7","Type":"ContainerStarted","Data":"5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453"} Dec 04 22:51:37.175639 master-0 kubenswrapper[33572]: I1204 22:51:37.175422 33572 generic.go:334] "Generic (PLEG): container finished" podID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerID="5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453" exitCode=0 Dec 04 22:51:37.175639 master-0 kubenswrapper[33572]: I1204 22:51:37.175569 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvhw4" event={"ID":"88842dc2-39b2-4c06-bf88-2120de259ac7","Type":"ContainerDied","Data":"5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453"} Dec 04 22:51:38.198103 master-0 kubenswrapper[33572]: I1204 22:51:38.196135 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvhw4" event={"ID":"88842dc2-39b2-4c06-bf88-2120de259ac7","Type":"ContainerStarted","Data":"481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906"} Dec 04 22:51:38.232393 master-0 kubenswrapper[33572]: I1204 22:51:38.232148 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qvhw4" podStartSLOduration=2.6848149770000003 podStartE2EDuration="5.23212665s" podCreationTimestamp="2025-12-04 22:51:33 +0000 UTC" firstStartedPulling="2025-12-04 22:51:35.139269945 +0000 UTC m=+1958.866795594" lastFinishedPulling="2025-12-04 22:51:37.686581618 +0000 UTC 
m=+1961.414107267" observedRunningTime="2025-12-04 22:51:38.226407083 +0000 UTC m=+1961.953932732" watchObservedRunningTime="2025-12-04 22:51:38.23212665 +0000 UTC m=+1961.959652309" Dec 04 22:51:43.559164 master-0 kubenswrapper[33572]: I1204 22:51:43.559095 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:43.560597 master-0 kubenswrapper[33572]: I1204 22:51:43.560052 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:43.659077 master-0 kubenswrapper[33572]: I1204 22:51:43.658789 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:44.346738 master-0 kubenswrapper[33572]: I1204 22:51:44.346676 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:44.456588 master-0 kubenswrapper[33572]: I1204 22:51:44.456068 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvhw4"] Dec 04 22:51:46.309099 master-0 kubenswrapper[33572]: I1204 22:51:46.308956 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qvhw4" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerName="registry-server" containerID="cri-o://481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906" gracePeriod=2 Dec 04 22:51:46.992896 master-0 kubenswrapper[33572]: I1204 22:51:46.992845 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:47.134094 master-0 kubenswrapper[33572]: I1204 22:51:47.134015 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-catalog-content\") pod \"88842dc2-39b2-4c06-bf88-2120de259ac7\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " Dec 04 22:51:47.134220 master-0 kubenswrapper[33572]: I1204 22:51:47.134098 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rp2l\" (UniqueName: \"kubernetes.io/projected/88842dc2-39b2-4c06-bf88-2120de259ac7-kube-api-access-2rp2l\") pod \"88842dc2-39b2-4c06-bf88-2120de259ac7\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " Dec 04 22:51:47.134220 master-0 kubenswrapper[33572]: I1204 22:51:47.134135 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-utilities\") pod \"88842dc2-39b2-4c06-bf88-2120de259ac7\" (UID: \"88842dc2-39b2-4c06-bf88-2120de259ac7\") " Dec 04 22:51:47.135872 master-0 kubenswrapper[33572]: I1204 22:51:47.135824 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-utilities" (OuterVolumeSpecName: "utilities") pod "88842dc2-39b2-4c06-bf88-2120de259ac7" (UID: "88842dc2-39b2-4c06-bf88-2120de259ac7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:51:47.137291 master-0 kubenswrapper[33572]: I1204 22:51:47.137264 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88842dc2-39b2-4c06-bf88-2120de259ac7-kube-api-access-2rp2l" (OuterVolumeSpecName: "kube-api-access-2rp2l") pod "88842dc2-39b2-4c06-bf88-2120de259ac7" (UID: "88842dc2-39b2-4c06-bf88-2120de259ac7"). InnerVolumeSpecName "kube-api-access-2rp2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:51:47.152429 master-0 kubenswrapper[33572]: I1204 22:51:47.152287 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88842dc2-39b2-4c06-bf88-2120de259ac7" (UID: "88842dc2-39b2-4c06-bf88-2120de259ac7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:51:47.236993 master-0 kubenswrapper[33572]: I1204 22:51:47.236944 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rp2l\" (UniqueName: \"kubernetes.io/projected/88842dc2-39b2-4c06-bf88-2120de259ac7-kube-api-access-2rp2l\") on node \"master-0\" DevicePath \"\"" Dec 04 22:51:47.236993 master-0 kubenswrapper[33572]: I1204 22:51:47.236983 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:51:47.236993 master-0 kubenswrapper[33572]: I1204 22:51:47.236993 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88842dc2-39b2-4c06-bf88-2120de259ac7-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:51:47.326643 master-0 kubenswrapper[33572]: I1204 22:51:47.326569 33572 generic.go:334] "Generic (PLEG): container finished" podID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerID="481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906" exitCode=0 Dec 04 22:51:47.326643 master-0 kubenswrapper[33572]: I1204 22:51:47.326631 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvhw4" event={"ID":"88842dc2-39b2-4c06-bf88-2120de259ac7","Type":"ContainerDied","Data":"481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906"} Dec 04 22:51:47.327236 master-0 kubenswrapper[33572]: I1204 22:51:47.326667 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qvhw4" event={"ID":"88842dc2-39b2-4c06-bf88-2120de259ac7","Type":"ContainerDied","Data":"d8d11d68a766d9a5ae1823275e7fa968b81b5b2b586f909f47e87b56384826fb"} Dec 04 22:51:47.327236 master-0 kubenswrapper[33572]: I1204 22:51:47.326689 33572 scope.go:117] "RemoveContainer" containerID="481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906" Dec 04 22:51:47.327236 master-0 kubenswrapper[33572]: I1204 22:51:47.326635 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qvhw4" Dec 04 22:51:47.368667 master-0 kubenswrapper[33572]: I1204 22:51:47.368406 33572 scope.go:117] "RemoveContainer" containerID="5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453" Dec 04 22:51:47.375901 master-0 kubenswrapper[33572]: I1204 22:51:47.375847 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvhw4"] Dec 04 22:51:47.387336 master-0 kubenswrapper[33572]: I1204 22:51:47.387254 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qvhw4"] Dec 04 22:51:47.409155 master-0 kubenswrapper[33572]: I1204 22:51:47.409003 33572 scope.go:117] "RemoveContainer" containerID="2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc" Dec 04 22:51:47.439802 master-0 kubenswrapper[33572]: I1204 22:51:47.439680 33572 scope.go:117] "RemoveContainer" containerID="481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906" Dec 04 22:51:47.440138 master-0 kubenswrapper[33572]: E1204 22:51:47.440080 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906\": container with ID starting with 481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906 not found: ID does not exist" containerID="481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906" Dec 04 22:51:47.440313 master-0 kubenswrapper[33572]: I1204 22:51:47.440146 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906"} err="failed to get container status \"481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906\": rpc error: code = NotFound desc = could not find container \"481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906\": container with ID starting with 481dec51f43ac7d1fd044501830e2d017caf3f3174e9f6a3b0517d700f5de906 not found: ID does not exist" Dec 04 22:51:47.440313 master-0 kubenswrapper[33572]: I1204 22:51:47.440170 33572 scope.go:117] "RemoveContainer" containerID="5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453" Dec 04 22:51:47.440802 master-0 kubenswrapper[33572]: E1204 22:51:47.440405 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453\": container with ID starting with 5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453 not found: ID does not exist" containerID="5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453" Dec 04 22:51:47.440802 master-0 kubenswrapper[33572]: I1204 22:51:47.440433 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453"} err="failed to get container status \"5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453\": rpc error: code = NotFound desc = could not find container \"5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453\": container with ID starting with 5247fead58fb1516a326d463a4ee1ff6dc3e6786fc345890cdb530f963411453 not found: ID does not exist" Dec 04 22:51:47.440802 master-0 kubenswrapper[33572]: I1204 22:51:47.440479 33572 scope.go:117] "RemoveContainer" 
containerID="2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc" Dec 04 22:51:47.440802 master-0 kubenswrapper[33572]: E1204 22:51:47.440717 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc\": container with ID starting with 2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc not found: ID does not exist" containerID="2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc" Dec 04 22:51:47.440802 master-0 kubenswrapper[33572]: I1204 22:51:47.440743 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc"} err="failed to get container status \"2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc\": rpc error: code = NotFound desc = could not find container \"2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc\": container with ID starting with 2c8eb5713fcbc835a77efe968f35616e6a2b367b0af49a7813fa02aede84bbcc not found: ID does not exist" Dec 04 22:51:48.542324 master-0 kubenswrapper[33572]: I1204 22:51:48.542244 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" path="/var/lib/kubelet/pods/88842dc2-39b2-4c06-bf88-2120de259ac7/volumes" Dec 04 22:57:03.498485 master-0 kubenswrapper[33572]: I1204 22:57:03.496776 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jvmmw"] Dec 04 22:57:03.498485 master-0 kubenswrapper[33572]: E1204 22:57:03.497584 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerName="extract-content" Dec 04 22:57:03.498485 master-0 kubenswrapper[33572]: I1204 22:57:03.497609 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerName="extract-content" Dec 04 22:57:03.498485 master-0 kubenswrapper[33572]: E1204 22:57:03.497664 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerName="registry-server" Dec 04 22:57:03.498485 master-0 kubenswrapper[33572]: I1204 22:57:03.497677 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerName="registry-server" Dec 04 22:57:03.498485 master-0 kubenswrapper[33572]: E1204 22:57:03.497708 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerName="extract-utilities" Dec 04 22:57:03.498485 master-0 kubenswrapper[33572]: I1204 22:57:03.497723 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerName="extract-utilities" Dec 04 22:57:03.498485 master-0 kubenswrapper[33572]: I1204 22:57:03.498201 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="88842dc2-39b2-4c06-bf88-2120de259ac7" containerName="registry-server" Dec 04 22:57:03.501572 master-0 kubenswrapper[33572]: I1204 22:57:03.501467 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.525393 master-0 kubenswrapper[33572]: I1204 22:57:03.525303 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvmmw"] Dec 04 22:57:03.593292 master-0 kubenswrapper[33572]: I1204 22:57:03.593214 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgnxq\" (UniqueName: \"kubernetes.io/projected/4a82beef-13bb-4dc7-a953-75a9d411a207-kube-api-access-kgnxq\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.593539 master-0 kubenswrapper[33572]: I1204 22:57:03.593332 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-utilities\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.593539 master-0 kubenswrapper[33572]: I1204 22:57:03.593492 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-catalog-content\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.696383 master-0 kubenswrapper[33572]: I1204 22:57:03.696301 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgnxq\" (UniqueName: \"kubernetes.io/projected/4a82beef-13bb-4dc7-a953-75a9d411a207-kube-api-access-kgnxq\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.696674 master-0 kubenswrapper[33572]: I1204 22:57:03.696434 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-utilities\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.696674 master-0 kubenswrapper[33572]: I1204 22:57:03.696488 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-catalog-content\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.697233 master-0 kubenswrapper[33572]: I1204 22:57:03.697194 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-catalog-content\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.697879 master-0 kubenswrapper[33572]: I1204 22:57:03.697838 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-utilities\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.721104 
master-0 kubenswrapper[33572]: I1204 22:57:03.720988 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgnxq\" (UniqueName: \"kubernetes.io/projected/4a82beef-13bb-4dc7-a953-75a9d411a207-kube-api-access-kgnxq\") pod \"redhat-operators-jvmmw\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:03.863342 master-0 kubenswrapper[33572]: I1204 22:57:03.863253 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:04.000441 master-0 kubenswrapper[33572]: I1204 22:57:04.000389 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jsj7z"] Dec 04 22:57:04.003453 master-0 kubenswrapper[33572]: I1204 22:57:04.003409 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.021411 master-0 kubenswrapper[33572]: I1204 22:57:04.021328 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jsj7z"] Dec 04 22:57:04.106766 master-0 kubenswrapper[33572]: I1204 22:57:04.106700 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-catalog-content\") pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.106985 master-0 kubenswrapper[33572]: I1204 22:57:04.106799 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-utilities\") pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.106985 master-0 kubenswrapper[33572]: I1204 22:57:04.106826 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfldz\" (UniqueName: \"kubernetes.io/projected/eac29863-376a-48f4-a602-c2364fe3ae3f-kube-api-access-bfldz\") pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.209997 master-0 kubenswrapper[33572]: I1204 22:57:04.207700 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-utilities\") pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.209997 master-0 kubenswrapper[33572]: I1204 22:57:04.207744 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfldz\" (UniqueName: \"kubernetes.io/projected/eac29863-376a-48f4-a602-c2364fe3ae3f-kube-api-access-bfldz\") pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.209997 master-0 kubenswrapper[33572]: I1204 22:57:04.207900 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-catalog-content\") 
pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.209997 master-0 kubenswrapper[33572]: I1204 22:57:04.208431 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-catalog-content\") pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.209997 master-0 kubenswrapper[33572]: I1204 22:57:04.208673 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-utilities\") pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.230446 master-0 kubenswrapper[33572]: I1204 22:57:04.230369 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfldz\" (UniqueName: \"kubernetes.io/projected/eac29863-376a-48f4-a602-c2364fe3ae3f-kube-api-access-bfldz\") pod \"certified-operators-jsj7z\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.346529 master-0 kubenswrapper[33572]: I1204 22:57:04.346455 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:04.664044 master-0 kubenswrapper[33572]: I1204 22:57:04.663997 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvmmw"] Dec 04 22:57:04.901877 master-0 kubenswrapper[33572]: W1204 22:57:04.901805 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeac29863_376a_48f4_a602_c2364fe3ae3f.slice/crio-84e0e1557423d35c42cd0b248f4661e11396a3ccc5a95f01b8d25cf45511c288 WatchSource:0}: Error finding container 84e0e1557423d35c42cd0b248f4661e11396a3ccc5a95f01b8d25cf45511c288: Status 404 returned error can't find the container with id 84e0e1557423d35c42cd0b248f4661e11396a3ccc5a95f01b8d25cf45511c288 Dec 04 22:57:04.924277 master-0 kubenswrapper[33572]: I1204 22:57:04.924193 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jsj7z"] Dec 04 22:57:05.474222 master-0 kubenswrapper[33572]: I1204 22:57:05.474129 33572 generic.go:334] "Generic (PLEG): container finished" podID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerID="a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d" exitCode=0 Dec 04 22:57:05.474484 master-0 kubenswrapper[33572]: I1204 22:57:05.474241 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvmmw" event={"ID":"4a82beef-13bb-4dc7-a953-75a9d411a207","Type":"ContainerDied","Data":"a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d"} Dec 04 22:57:05.474484 master-0 kubenswrapper[33572]: I1204 22:57:05.474279 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvmmw" event={"ID":"4a82beef-13bb-4dc7-a953-75a9d411a207","Type":"ContainerStarted","Data":"a90b81d49e92526436d8962c3b62a7098f2a974964178a2aee9b304ff866e343"} Dec 04 22:57:05.476384 master-0 kubenswrapper[33572]: I1204 22:57:05.476350 33572 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 04 22:57:05.479856 master-0 kubenswrapper[33572]: I1204 22:57:05.479794 33572 generic.go:334] "Generic (PLEG): container finished" podID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerID="652fd0503ea087653937ad4badf82d2de8f5bf2077fb46ca5556d616f78011aa" exitCode=0 Dec 04 22:57:05.480607 master-0 kubenswrapper[33572]: I1204 22:57:05.479863 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jsj7z" event={"ID":"eac29863-376a-48f4-a602-c2364fe3ae3f","Type":"ContainerDied","Data":"652fd0503ea087653937ad4badf82d2de8f5bf2077fb46ca5556d616f78011aa"} Dec 04 22:57:05.480717 master-0 kubenswrapper[33572]: I1204 22:57:05.480677 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jsj7z" event={"ID":"eac29863-376a-48f4-a602-c2364fe3ae3f","Type":"ContainerStarted","Data":"84e0e1557423d35c42cd0b248f4661e11396a3ccc5a95f01b8d25cf45511c288"} Dec 04 22:57:06.494458 master-0 kubenswrapper[33572]: I1204 22:57:06.494381 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvmmw" event={"ID":"4a82beef-13bb-4dc7-a953-75a9d411a207","Type":"ContainerStarted","Data":"464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f"} Dec 04 22:57:07.510790 master-0 kubenswrapper[33572]: I1204 22:57:07.510658 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jsj7z" event={"ID":"eac29863-376a-48f4-a602-c2364fe3ae3f","Type":"ContainerDied","Data":"7bad7792bc8bdada70f5f135e9cca584e1dc169bd8c34ecc272f1566156eda21"} Dec 04 22:57:07.513483 master-0 kubenswrapper[33572]: I1204 22:57:07.510589 33572 generic.go:334] "Generic (PLEG): container finished" podID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerID="7bad7792bc8bdada70f5f135e9cca584e1dc169bd8c34ecc272f1566156eda21" exitCode=0 Dec 04 22:57:07.517255 master-0 kubenswrapper[33572]: I1204 22:57:07.517187 33572 generic.go:334] "Generic (PLEG): container finished" podID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerID="464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f" exitCode=0 Dec 04 22:57:07.517255 master-0 kubenswrapper[33572]: I1204 22:57:07.517235 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvmmw" event={"ID":"4a82beef-13bb-4dc7-a953-75a9d411a207","Type":"ContainerDied","Data":"464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f"} Dec 04 22:57:08.540362 master-0 kubenswrapper[33572]: I1204 22:57:08.540295 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jsj7z" event={"ID":"eac29863-376a-48f4-a602-c2364fe3ae3f","Type":"ContainerStarted","Data":"1e4d8cebb5a52885f60ab8e39d9a2539be00eda361dd27f89704aad2844dc641"} Dec 04 22:57:08.540362 master-0 kubenswrapper[33572]: I1204 22:57:08.540355 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvmmw" event={"ID":"4a82beef-13bb-4dc7-a953-75a9d411a207","Type":"ContainerStarted","Data":"3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb"} Dec 04 22:57:08.565770 master-0 kubenswrapper[33572]: I1204 22:57:08.565673 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jsj7z" podStartSLOduration=2.880841544 podStartE2EDuration="5.565654552s" podCreationTimestamp="2025-12-04 22:57:03 +0000 UTC" 
firstStartedPulling="2025-12-04 22:57:05.481910327 +0000 UTC m=+2289.209435976" lastFinishedPulling="2025-12-04 22:57:08.166723295 +0000 UTC m=+2291.894248984" observedRunningTime="2025-12-04 22:57:08.556078402 +0000 UTC m=+2292.283604051" watchObservedRunningTime="2025-12-04 22:57:08.565654552 +0000 UTC m=+2292.293180191" Dec 04 22:57:08.582060 master-0 kubenswrapper[33572]: I1204 22:57:08.581986 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jvmmw" podStartSLOduration=3.120883756 podStartE2EDuration="5.581967194s" podCreationTimestamp="2025-12-04 22:57:03 +0000 UTC" firstStartedPulling="2025-12-04 22:57:05.476274724 +0000 UTC m=+2289.203800373" lastFinishedPulling="2025-12-04 22:57:07.937358122 +0000 UTC m=+2291.664883811" observedRunningTime="2025-12-04 22:57:08.576024283 +0000 UTC m=+2292.303549922" watchObservedRunningTime="2025-12-04 22:57:08.581967194 +0000 UTC m=+2292.309492843" Dec 04 22:57:13.864014 master-0 kubenswrapper[33572]: I1204 22:57:13.863898 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:13.864014 master-0 kubenswrapper[33572]: I1204 22:57:13.863959 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:14.346846 master-0 kubenswrapper[33572]: I1204 22:57:14.346756 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:14.346846 master-0 kubenswrapper[33572]: I1204 22:57:14.346834 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:14.463278 master-0 kubenswrapper[33572]: I1204 22:57:14.463237 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:14.689924 master-0 kubenswrapper[33572]: I1204 22:57:14.689827 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:14.911448 master-0 kubenswrapper[33572]: I1204 22:57:14.911385 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jvmmw" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="registry-server" probeResult="failure" output=< Dec 04 22:57:14.911448 master-0 kubenswrapper[33572]: timeout: failed to connect service ":50051" within 1s Dec 04 22:57:14.911448 master-0 kubenswrapper[33572]: > Dec 04 22:57:18.258482 master-0 kubenswrapper[33572]: I1204 22:57:18.258387 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jsj7z"] Dec 04 22:57:18.259047 master-0 kubenswrapper[33572]: I1204 22:57:18.258699 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jsj7z" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerName="registry-server" containerID="cri-o://1e4d8cebb5a52885f60ab8e39d9a2539be00eda361dd27f89704aad2844dc641" gracePeriod=2 Dec 04 22:57:18.688419 master-0 kubenswrapper[33572]: I1204 22:57:18.688338 33572 generic.go:334] "Generic (PLEG): container finished" podID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerID="1e4d8cebb5a52885f60ab8e39d9a2539be00eda361dd27f89704aad2844dc641" exitCode=0 Dec 04 22:57:18.688879 master-0 kubenswrapper[33572]: I1204 
22:57:18.688422 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jsj7z" event={"ID":"eac29863-376a-48f4-a602-c2364fe3ae3f","Type":"ContainerDied","Data":"1e4d8cebb5a52885f60ab8e39d9a2539be00eda361dd27f89704aad2844dc641"} Dec 04 22:57:20.080361 master-0 kubenswrapper[33572]: I1204 22:57:20.080295 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:20.256342 master-0 kubenswrapper[33572]: I1204 22:57:20.256186 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfldz\" (UniqueName: \"kubernetes.io/projected/eac29863-376a-48f4-a602-c2364fe3ae3f-kube-api-access-bfldz\") pod \"eac29863-376a-48f4-a602-c2364fe3ae3f\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " Dec 04 22:57:20.256584 master-0 kubenswrapper[33572]: I1204 22:57:20.256474 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-utilities\") pod \"eac29863-376a-48f4-a602-c2364fe3ae3f\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " Dec 04 22:57:20.256953 master-0 kubenswrapper[33572]: I1204 22:57:20.256901 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-catalog-content\") pod \"eac29863-376a-48f4-a602-c2364fe3ae3f\" (UID: \"eac29863-376a-48f4-a602-c2364fe3ae3f\") " Dec 04 22:57:20.257401 master-0 kubenswrapper[33572]: I1204 22:57:20.257330 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-utilities" (OuterVolumeSpecName: "utilities") pod "eac29863-376a-48f4-a602-c2364fe3ae3f" (UID: "eac29863-376a-48f4-a602-c2364fe3ae3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:57:20.259242 master-0 kubenswrapper[33572]: I1204 22:57:20.259125 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:57:20.260348 master-0 kubenswrapper[33572]: I1204 22:57:20.260271 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac29863-376a-48f4-a602-c2364fe3ae3f-kube-api-access-bfldz" (OuterVolumeSpecName: "kube-api-access-bfldz") pod "eac29863-376a-48f4-a602-c2364fe3ae3f" (UID: "eac29863-376a-48f4-a602-c2364fe3ae3f"). InnerVolumeSpecName "kube-api-access-bfldz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:57:20.357485 master-0 kubenswrapper[33572]: I1204 22:57:20.357321 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eac29863-376a-48f4-a602-c2364fe3ae3f" (UID: "eac29863-376a-48f4-a602-c2364fe3ae3f"). InnerVolumeSpecName "catalog-content". 
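The two "Observed pod startup duration" entries above follow the usual kubelet accounting: podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling - firstStartedPulling), which is how the pod-startup SLO excludes pull time. A quick check in Go with the values logged for certified-operators-jsj7z (numbers copied from the log; the real bookkeeping lives in the kubelet's pod_startup_latency_tracker.go):

package main

import "fmt"

func main() {
	// Values taken from the "Observed pod startup duration" entry for certified-operators-jsj7z.
	e2e := 5.565654552                      // podStartE2EDuration: observedRunningTime - podCreationTimestamp
	pull := 2291.894248984 - 2289.209435976 // lastFinishedPulling - firstStartedPulling (monotonic m=+ offsets)
	fmt.Printf("podStartSLOduration ~= %.9f s\n", e2e-pull) // ~= 2.880841544 s, matching the logged value
}

The same arithmetic reproduces redhat-operators-jvmmw (5.581967194 - 2.461083438 = 3.120883756), and for keystone-cron-29414821-bj9tw later in the log both pull timestamps are the zero time, so podStartSLOduration equals podStartE2EDuration (2.232143288 s): the image was already present on the node.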
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:57:20.361486 master-0 kubenswrapper[33572]: I1204 22:57:20.361425 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfldz\" (UniqueName: \"kubernetes.io/projected/eac29863-376a-48f4-a602-c2364fe3ae3f-kube-api-access-bfldz\") on node \"master-0\" DevicePath \"\"" Dec 04 22:57:20.361486 master-0 kubenswrapper[33572]: I1204 22:57:20.361470 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eac29863-376a-48f4-a602-c2364fe3ae3f-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:57:20.714898 master-0 kubenswrapper[33572]: I1204 22:57:20.714801 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jsj7z" event={"ID":"eac29863-376a-48f4-a602-c2364fe3ae3f","Type":"ContainerDied","Data":"84e0e1557423d35c42cd0b248f4661e11396a3ccc5a95f01b8d25cf45511c288"} Dec 04 22:57:20.714898 master-0 kubenswrapper[33572]: I1204 22:57:20.714867 33572 scope.go:117] "RemoveContainer" containerID="1e4d8cebb5a52885f60ab8e39d9a2539be00eda361dd27f89704aad2844dc641" Dec 04 22:57:20.714898 master-0 kubenswrapper[33572]: I1204 22:57:20.714897 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jsj7z" Dec 04 22:57:20.747575 master-0 kubenswrapper[33572]: I1204 22:57:20.747491 33572 scope.go:117] "RemoveContainer" containerID="7bad7792bc8bdada70f5f135e9cca584e1dc169bd8c34ecc272f1566156eda21" Dec 04 22:57:20.752167 master-0 kubenswrapper[33572]: I1204 22:57:20.752104 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jsj7z"] Dec 04 22:57:20.761611 master-0 kubenswrapper[33572]: I1204 22:57:20.761536 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jsj7z"] Dec 04 22:57:20.784270 master-0 kubenswrapper[33572]: I1204 22:57:20.784217 33572 scope.go:117] "RemoveContainer" containerID="652fd0503ea087653937ad4badf82d2de8f5bf2077fb46ca5556d616f78011aa" Dec 04 22:57:22.550867 master-0 kubenswrapper[33572]: I1204 22:57:22.550751 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" path="/var/lib/kubelet/pods/eac29863-376a-48f4-a602-c2364fe3ae3f/volumes" Dec 04 22:57:23.946310 master-0 kubenswrapper[33572]: I1204 22:57:23.946123 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:24.025579 master-0 kubenswrapper[33572]: I1204 22:57:24.025441 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:24.649238 master-0 kubenswrapper[33572]: I1204 22:57:24.649144 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvmmw"] Dec 04 22:57:25.815711 master-0 kubenswrapper[33572]: I1204 22:57:25.815603 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jvmmw" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="registry-server" containerID="cri-o://3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb" gracePeriod=2 Dec 04 22:57:26.438712 master-0 kubenswrapper[33572]: I1204 22:57:26.438642 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:26.557946 master-0 kubenswrapper[33572]: I1204 22:57:26.557875 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-utilities\") pod \"4a82beef-13bb-4dc7-a953-75a9d411a207\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " Dec 04 22:57:26.558797 master-0 kubenswrapper[33572]: I1204 22:57:26.558756 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-utilities" (OuterVolumeSpecName: "utilities") pod "4a82beef-13bb-4dc7-a953-75a9d411a207" (UID: "4a82beef-13bb-4dc7-a953-75a9d411a207"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:57:26.558981 master-0 kubenswrapper[33572]: I1204 22:57:26.558926 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgnxq\" (UniqueName: \"kubernetes.io/projected/4a82beef-13bb-4dc7-a953-75a9d411a207-kube-api-access-kgnxq\") pod \"4a82beef-13bb-4dc7-a953-75a9d411a207\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " Dec 04 22:57:26.559267 master-0 kubenswrapper[33572]: I1204 22:57:26.559010 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-catalog-content\") pod \"4a82beef-13bb-4dc7-a953-75a9d411a207\" (UID: \"4a82beef-13bb-4dc7-a953-75a9d411a207\") " Dec 04 22:57:26.562734 master-0 kubenswrapper[33572]: I1204 22:57:26.562671 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 22:57:26.566766 master-0 kubenswrapper[33572]: I1204 22:57:26.566709 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a82beef-13bb-4dc7-a953-75a9d411a207-kube-api-access-kgnxq" (OuterVolumeSpecName: "kube-api-access-kgnxq") pod "4a82beef-13bb-4dc7-a953-75a9d411a207" (UID: "4a82beef-13bb-4dc7-a953-75a9d411a207"). InnerVolumeSpecName "kube-api-access-kgnxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 22:57:26.664829 master-0 kubenswrapper[33572]: I1204 22:57:26.664776 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgnxq\" (UniqueName: \"kubernetes.io/projected/4a82beef-13bb-4dc7-a953-75a9d411a207-kube-api-access-kgnxq\") on node \"master-0\" DevicePath \"\"" Dec 04 22:57:26.690326 master-0 kubenswrapper[33572]: I1204 22:57:26.690242 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a82beef-13bb-4dc7-a953-75a9d411a207" (UID: "4a82beef-13bb-4dc7-a953-75a9d411a207"). InnerVolumeSpecName "catalog-content". 
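The "Killing container with a grace period" entries above (gracePeriod=2 for the marketplace catalog pods) correspond to the kubelet asking the container runtime, over the CRI, to stop the container with that grace period as the timeout. A minimal standalone sketch of that call, assuming CRI-O's default socket path and using the bare container ID from the log; this is illustration only, not something you would normally run by hand against a live node:

package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Assumption: CRI-O listening on its default socket (requires root on the node).
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	// The CRI takes the bare ID (the log prefixes it with "cri-o://") and the grace
	// period as Timeout, mirroring gracePeriod=2 in the entries above.
	_, err = runtimeapi.NewRuntimeServiceClient(conn).StopContainer(ctx, &runtimeapi.StopContainerRequest{
		ContainerId: "3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb",
		Timeout:     2,
	})
	if err != nil {
		log.Fatal(err)
	}
}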
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 22:57:26.769993 master-0 kubenswrapper[33572]: I1204 22:57:26.769764 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a82beef-13bb-4dc7-a953-75a9d411a207-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 22:57:26.834859 master-0 kubenswrapper[33572]: I1204 22:57:26.834804 33572 generic.go:334] "Generic (PLEG): container finished" podID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerID="3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb" exitCode=0 Dec 04 22:57:26.835737 master-0 kubenswrapper[33572]: I1204 22:57:26.834856 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvmmw" event={"ID":"4a82beef-13bb-4dc7-a953-75a9d411a207","Type":"ContainerDied","Data":"3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb"} Dec 04 22:57:26.835737 master-0 kubenswrapper[33572]: I1204 22:57:26.835614 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvmmw" event={"ID":"4a82beef-13bb-4dc7-a953-75a9d411a207","Type":"ContainerDied","Data":"a90b81d49e92526436d8962c3b62a7098f2a974964178a2aee9b304ff866e343"} Dec 04 22:57:26.835737 master-0 kubenswrapper[33572]: I1204 22:57:26.835643 33572 scope.go:117] "RemoveContainer" containerID="3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb" Dec 04 22:57:26.835737 master-0 kubenswrapper[33572]: I1204 22:57:26.834911 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvmmw" Dec 04 22:57:26.869296 master-0 kubenswrapper[33572]: I1204 22:57:26.869217 33572 scope.go:117] "RemoveContainer" containerID="464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f" Dec 04 22:57:26.906727 master-0 kubenswrapper[33572]: I1204 22:57:26.906636 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvmmw"] Dec 04 22:57:26.915039 master-0 kubenswrapper[33572]: I1204 22:57:26.914977 33572 scope.go:117] "RemoveContainer" containerID="a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d" Dec 04 22:57:26.922249 master-0 kubenswrapper[33572]: I1204 22:57:26.922169 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jvmmw"] Dec 04 22:57:26.992771 master-0 kubenswrapper[33572]: I1204 22:57:26.992709 33572 scope.go:117] "RemoveContainer" containerID="3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb" Dec 04 22:57:26.993608 master-0 kubenswrapper[33572]: E1204 22:57:26.993557 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb\": container with ID starting with 3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb not found: ID does not exist" containerID="3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb" Dec 04 22:57:26.993684 master-0 kubenswrapper[33572]: I1204 22:57:26.993615 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb"} err="failed to get container status \"3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb\": rpc error: code = NotFound desc = could not find container 
\"3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb\": container with ID starting with 3edc5d224f8dc60371668662d9721aefa34d0ee63a755194a3fab5f3ad793cfb not found: ID does not exist" Dec 04 22:57:26.993684 master-0 kubenswrapper[33572]: I1204 22:57:26.993646 33572 scope.go:117] "RemoveContainer" containerID="464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f" Dec 04 22:57:26.994141 master-0 kubenswrapper[33572]: E1204 22:57:26.994096 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f\": container with ID starting with 464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f not found: ID does not exist" containerID="464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f" Dec 04 22:57:26.994205 master-0 kubenswrapper[33572]: I1204 22:57:26.994144 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f"} err="failed to get container status \"464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f\": rpc error: code = NotFound desc = could not find container \"464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f\": container with ID starting with 464181c926d43def9a2040c1d3883916bf99241379d6ed5760153d561ecc303f not found: ID does not exist" Dec 04 22:57:26.994205 master-0 kubenswrapper[33572]: I1204 22:57:26.994183 33572 scope.go:117] "RemoveContainer" containerID="a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d" Dec 04 22:57:26.994592 master-0 kubenswrapper[33572]: E1204 22:57:26.994552 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d\": container with ID starting with a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d not found: ID does not exist" containerID="a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d" Dec 04 22:57:26.994592 master-0 kubenswrapper[33572]: I1204 22:57:26.994580 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d"} err="failed to get container status \"a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d\": rpc error: code = NotFound desc = could not find container \"a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d\": container with ID starting with a83d3c7496aeec72a73c52a3e3e353baf9cf9dc959729f4cdcbc47d7164f003d not found: ID does not exist" Dec 04 22:57:28.550901 master-0 kubenswrapper[33572]: I1204 22:57:28.550819 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" path="/var/lib/kubelet/pods/4a82beef-13bb-4dc7-a953-75a9d411a207/volumes" Dec 04 22:58:44.004572 master-0 kubenswrapper[33572]: E1204 22:58:44.004476 33572 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:47916->192.168.32.10:37473: write tcp 192.168.32.10:47916->192.168.32.10:37473: write: broken pipe Dec 04 23:00:00.202430 master-0 kubenswrapper[33572]: I1204 23:00:00.202334 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl"] Dec 04 23:00:00.203085 master-0 kubenswrapper[33572]: 
E1204 23:00:00.202963 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="registry-server" Dec 04 23:00:00.203085 master-0 kubenswrapper[33572]: I1204 23:00:00.202984 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="registry-server" Dec 04 23:00:00.203085 master-0 kubenswrapper[33572]: E1204 23:00:00.203002 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerName="registry-server" Dec 04 23:00:00.203085 master-0 kubenswrapper[33572]: I1204 23:00:00.203011 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerName="registry-server" Dec 04 23:00:00.203085 master-0 kubenswrapper[33572]: E1204 23:00:00.203034 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerName="extract-content" Dec 04 23:00:00.203085 master-0 kubenswrapper[33572]: I1204 23:00:00.203043 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerName="extract-content" Dec 04 23:00:00.203085 master-0 kubenswrapper[33572]: E1204 23:00:00.203081 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerName="extract-utilities" Dec 04 23:00:00.203085 master-0 kubenswrapper[33572]: I1204 23:00:00.203091 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerName="extract-utilities" Dec 04 23:00:00.203469 master-0 kubenswrapper[33572]: E1204 23:00:00.203132 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="extract-content" Dec 04 23:00:00.203469 master-0 kubenswrapper[33572]: I1204 23:00:00.203141 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="extract-content" Dec 04 23:00:00.203469 master-0 kubenswrapper[33572]: E1204 23:00:00.203159 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="extract-utilities" Dec 04 23:00:00.203469 master-0 kubenswrapper[33572]: I1204 23:00:00.203168 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="extract-utilities" Dec 04 23:00:00.203681 master-0 kubenswrapper[33572]: I1204 23:00:00.203521 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac29863-376a-48f4-a602-c2364fe3ae3f" containerName="registry-server" Dec 04 23:00:00.203681 master-0 kubenswrapper[33572]: I1204 23:00:00.203546 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a82beef-13bb-4dc7-a953-75a9d411a207" containerName="registry-server" Dec 04 23:00:00.204580 master-0 kubenswrapper[33572]: I1204 23:00:00.204543 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.206888 master-0 kubenswrapper[33572]: I1204 23:00:00.206827 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 23:00:00.207093 master-0 kubenswrapper[33572]: I1204 23:00:00.207059 33572 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-njflt" Dec 04 23:00:00.220374 master-0 kubenswrapper[33572]: I1204 23:00:00.220314 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl"] Dec 04 23:00:00.283651 master-0 kubenswrapper[33572]: I1204 23:00:00.283565 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdgt6\" (UniqueName: \"kubernetes.io/projected/52c83349-0c00-485a-b45e-45dc4818a965-kube-api-access-bdgt6\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.283895 master-0 kubenswrapper[33572]: I1204 23:00:00.283664 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c83349-0c00-485a-b45e-45dc4818a965-config-volume\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.283895 master-0 kubenswrapper[33572]: I1204 23:00:00.283870 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c83349-0c00-485a-b45e-45dc4818a965-secret-volume\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.388100 master-0 kubenswrapper[33572]: I1204 23:00:00.386708 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdgt6\" (UniqueName: \"kubernetes.io/projected/52c83349-0c00-485a-b45e-45dc4818a965-kube-api-access-bdgt6\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.388100 master-0 kubenswrapper[33572]: I1204 23:00:00.387358 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c83349-0c00-485a-b45e-45dc4818a965-config-volume\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.388100 master-0 kubenswrapper[33572]: I1204 23:00:00.387613 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c83349-0c00-485a-b45e-45dc4818a965-secret-volume\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.391732 master-0 kubenswrapper[33572]: I1204 23:00:00.389337 33572 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c83349-0c00-485a-b45e-45dc4818a965-config-volume\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.395520 master-0 kubenswrapper[33572]: I1204 23:00:00.392593 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c83349-0c00-485a-b45e-45dc4818a965-secret-volume\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.416309 master-0 kubenswrapper[33572]: I1204 23:00:00.416252 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdgt6\" (UniqueName: \"kubernetes.io/projected/52c83349-0c00-485a-b45e-45dc4818a965-kube-api-access-bdgt6\") pod \"collect-profiles-29414820-ckxxl\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:00.572568 master-0 kubenswrapper[33572]: I1204 23:00:00.571832 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:01.086025 master-0 kubenswrapper[33572]: I1204 23:00:01.085956 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl"] Dec 04 23:00:01.090687 master-0 kubenswrapper[33572]: W1204 23:00:01.090619 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52c83349_0c00_485a_b45e_45dc4818a965.slice/crio-4e50e6071850331bae836866f12ef10162b78cc4df8c6dd22970a55347c96a76 WatchSource:0}: Error finding container 4e50e6071850331bae836866f12ef10162b78cc4df8c6dd22970a55347c96a76: Status 404 returned error can't find the container with id 4e50e6071850331bae836866f12ef10162b78cc4df8c6dd22970a55347c96a76 Dec 04 23:00:01.244139 master-0 kubenswrapper[33572]: I1204 23:00:01.244079 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" event={"ID":"52c83349-0c00-485a-b45e-45dc4818a965","Type":"ContainerStarted","Data":"4e50e6071850331bae836866f12ef10162b78cc4df8c6dd22970a55347c96a76"} Dec 04 23:00:02.264172 master-0 kubenswrapper[33572]: I1204 23:00:02.264010 33572 generic.go:334] "Generic (PLEG): container finished" podID="52c83349-0c00-485a-b45e-45dc4818a965" containerID="84a04ec911d73da85a5b86de1a6fac41e98cbb42c88ff2ce5bd62da4bf1dd0f9" exitCode=0 Dec 04 23:00:02.264172 master-0 kubenswrapper[33572]: I1204 23:00:02.264069 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" event={"ID":"52c83349-0c00-485a-b45e-45dc4818a965","Type":"ContainerDied","Data":"84a04ec911d73da85a5b86de1a6fac41e98cbb42c88ff2ce5bd62da4bf1dd0f9"} Dec 04 23:00:03.751348 master-0 kubenswrapper[33572]: I1204 23:00:03.751286 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:03.886280 master-0 kubenswrapper[33572]: I1204 23:00:03.886212 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c83349-0c00-485a-b45e-45dc4818a965-secret-volume\") pod \"52c83349-0c00-485a-b45e-45dc4818a965\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " Dec 04 23:00:03.886579 master-0 kubenswrapper[33572]: I1204 23:00:03.886380 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c83349-0c00-485a-b45e-45dc4818a965-config-volume\") pod \"52c83349-0c00-485a-b45e-45dc4818a965\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " Dec 04 23:00:03.886579 master-0 kubenswrapper[33572]: I1204 23:00:03.886409 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdgt6\" (UniqueName: \"kubernetes.io/projected/52c83349-0c00-485a-b45e-45dc4818a965-kube-api-access-bdgt6\") pod \"52c83349-0c00-485a-b45e-45dc4818a965\" (UID: \"52c83349-0c00-485a-b45e-45dc4818a965\") " Dec 04 23:00:03.887084 master-0 kubenswrapper[33572]: I1204 23:00:03.887026 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52c83349-0c00-485a-b45e-45dc4818a965-config-volume" (OuterVolumeSpecName: "config-volume") pod "52c83349-0c00-485a-b45e-45dc4818a965" (UID: "52c83349-0c00-485a-b45e-45dc4818a965"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 23:00:03.889236 master-0 kubenswrapper[33572]: I1204 23:00:03.889196 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c83349-0c00-485a-b45e-45dc4818a965-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "52c83349-0c00-485a-b45e-45dc4818a965" (UID: "52c83349-0c00-485a-b45e-45dc4818a965"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 23:00:03.890330 master-0 kubenswrapper[33572]: I1204 23:00:03.890256 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c83349-0c00-485a-b45e-45dc4818a965-kube-api-access-bdgt6" (OuterVolumeSpecName: "kube-api-access-bdgt6") pod "52c83349-0c00-485a-b45e-45dc4818a965" (UID: "52c83349-0c00-485a-b45e-45dc4818a965"). InnerVolumeSpecName "kube-api-access-bdgt6". 
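The kube-api-access-* volumes mounted and torn down throughout this log (kube-api-access-bfldz, -kgnxq, -bdgt6, -qwfx5, ...) are the projected ServiceAccount token volumes the kubelet injects into each pod. Inside a running container they surface at a well-known path; a small Go sketch of what a workload (or client-go's rest.InClusterConfig) reads from that mount, which only does anything useful when run inside a pod:

package main

import (
	"fmt"
	"os"
)

// Standard mount point of the projected ServiceAccount token volume inside a container.
const saDir = "/var/run/secrets/kubernetes.io/serviceaccount"

func main() {
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		data, err := os.ReadFile(saDir + "/" + name)
		if err != nil {
			fmt.Printf("%s: %v\n", name, err) // outside a pod this path does not exist
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(data))
	}
}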
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 23:00:03.990673 master-0 kubenswrapper[33572]: I1204 23:00:03.990590 33572 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c83349-0c00-485a-b45e-45dc4818a965-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 23:00:03.990673 master-0 kubenswrapper[33572]: I1204 23:00:03.990658 33572 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c83349-0c00-485a-b45e-45dc4818a965-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 04 23:00:03.990673 master-0 kubenswrapper[33572]: I1204 23:00:03.990676 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdgt6\" (UniqueName: \"kubernetes.io/projected/52c83349-0c00-485a-b45e-45dc4818a965-kube-api-access-bdgt6\") on node \"master-0\" DevicePath \"\"" Dec 04 23:00:04.292017 master-0 kubenswrapper[33572]: I1204 23:00:04.291865 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" event={"ID":"52c83349-0c00-485a-b45e-45dc4818a965","Type":"ContainerDied","Data":"4e50e6071850331bae836866f12ef10162b78cc4df8c6dd22970a55347c96a76"} Dec 04 23:00:04.292017 master-0 kubenswrapper[33572]: I1204 23:00:04.291944 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e50e6071850331bae836866f12ef10162b78cc4df8c6dd22970a55347c96a76" Dec 04 23:00:04.292017 master-0 kubenswrapper[33572]: I1204 23:00:04.291939 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414820-ckxxl" Dec 04 23:00:04.884364 master-0 kubenswrapper[33572]: I1204 23:00:04.884295 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"] Dec 04 23:00:04.896070 master-0 kubenswrapper[33572]: I1204 23:00:04.895992 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414775-47tzr"] Dec 04 23:00:06.546253 master-0 kubenswrapper[33572]: I1204 23:00:06.546178 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a7edee-7a4c-4f4f-b537-d1ce3a9f812f" path="/var/lib/kubelet/pods/85a7edee-7a4c-4f4f-b537-d1ce3a9f812f/volumes" Dec 04 23:00:23.739022 master-0 kubenswrapper[33572]: I1204 23:00:23.738939 33572 scope.go:117] "RemoveContainer" containerID="00713a1c06d69e4187d092bf84b0d17670a9eda7c3ce1307b7efa35d4e53871c" Dec 04 23:01:00.188145 master-0 kubenswrapper[33572]: I1204 23:01:00.188068 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414821-bj9tw"] Dec 04 23:01:00.188886 master-0 kubenswrapper[33572]: E1204 23:01:00.188848 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c83349-0c00-485a-b45e-45dc4818a965" containerName="collect-profiles" Dec 04 23:01:00.188886 master-0 kubenswrapper[33572]: I1204 23:01:00.188882 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c83349-0c00-485a-b45e-45dc4818a965" containerName="collect-profiles" Dec 04 23:01:00.189379 master-0 kubenswrapper[33572]: I1204 23:01:00.189345 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c83349-0c00-485a-b45e-45dc4818a965" containerName="collect-profiles" Dec 04 23:01:00.190596 master-0 kubenswrapper[33572]: I1204 23:01:00.190538 33572 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.207650 master-0 kubenswrapper[33572]: I1204 23:01:00.203636 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414821-bj9tw"] Dec 04 23:01:00.303125 master-0 kubenswrapper[33572]: I1204 23:01:00.303044 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwfx5\" (UniqueName: \"kubernetes.io/projected/a97eedad-6712-4411-9c71-2c5f1187a374-kube-api-access-qwfx5\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.303357 master-0 kubenswrapper[33572]: I1204 23:01:00.303319 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-fernet-keys\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.303445 master-0 kubenswrapper[33572]: I1204 23:01:00.303420 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-config-data\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.303773 master-0 kubenswrapper[33572]: I1204 23:01:00.303728 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-combined-ca-bundle\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.405626 master-0 kubenswrapper[33572]: I1204 23:01:00.405546 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwfx5\" (UniqueName: \"kubernetes.io/projected/a97eedad-6712-4411-9c71-2c5f1187a374-kube-api-access-qwfx5\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.405857 master-0 kubenswrapper[33572]: I1204 23:01:00.405695 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-fernet-keys\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.405857 master-0 kubenswrapper[33572]: I1204 23:01:00.405813 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-config-data\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.405975 master-0 kubenswrapper[33572]: I1204 23:01:00.405946 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-combined-ca-bundle\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " 
pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.410537 master-0 kubenswrapper[33572]: I1204 23:01:00.410345 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-config-data\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.410537 master-0 kubenswrapper[33572]: I1204 23:01:00.410460 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-fernet-keys\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.411156 master-0 kubenswrapper[33572]: I1204 23:01:00.411120 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-combined-ca-bundle\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.422731 master-0 kubenswrapper[33572]: I1204 23:01:00.422687 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwfx5\" (UniqueName: \"kubernetes.io/projected/a97eedad-6712-4411-9c71-2c5f1187a374-kube-api-access-qwfx5\") pod \"keystone-cron-29414821-bj9tw\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:00.512105 master-0 kubenswrapper[33572]: I1204 23:01:00.511969 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:01.103181 master-0 kubenswrapper[33572]: W1204 23:01:01.103091 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97eedad_6712_4411_9c71_2c5f1187a374.slice/crio-942c2ac49b627d3dbcfbc0ed6ccfca262fc8da4c1097f92bbc0bd4510ad5b010 WatchSource:0}: Error finding container 942c2ac49b627d3dbcfbc0ed6ccfca262fc8da4c1097f92bbc0bd4510ad5b010: Status 404 returned error can't find the container with id 942c2ac49b627d3dbcfbc0ed6ccfca262fc8da4c1097f92bbc0bd4510ad5b010 Dec 04 23:01:01.105738 master-0 kubenswrapper[33572]: I1204 23:01:01.105679 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414821-bj9tw"] Dec 04 23:01:01.190358 master-0 kubenswrapper[33572]: I1204 23:01:01.190274 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414821-bj9tw" event={"ID":"a97eedad-6712-4411-9c71-2c5f1187a374","Type":"ContainerStarted","Data":"942c2ac49b627d3dbcfbc0ed6ccfca262fc8da4c1097f92bbc0bd4510ad5b010"} Dec 04 23:01:02.205297 master-0 kubenswrapper[33572]: I1204 23:01:02.205230 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414821-bj9tw" event={"ID":"a97eedad-6712-4411-9c71-2c5f1187a374","Type":"ContainerStarted","Data":"0db82c6dea084bb67fa0e9362db40375d06c748a25f9713c272599f9e5292dac"} Dec 04 23:01:02.232240 master-0 kubenswrapper[33572]: I1204 23:01:02.232159 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414821-bj9tw" podStartSLOduration=2.232143288 podStartE2EDuration="2.232143288s" podCreationTimestamp="2025-12-04 23:01:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 23:01:02.222088459 +0000 UTC m=+2525.949614118" watchObservedRunningTime="2025-12-04 23:01:02.232143288 +0000 UTC m=+2525.959668937" Dec 04 23:01:04.247541 master-0 kubenswrapper[33572]: I1204 23:01:04.247430 33572 generic.go:334] "Generic (PLEG): container finished" podID="a97eedad-6712-4411-9c71-2c5f1187a374" containerID="0db82c6dea084bb67fa0e9362db40375d06c748a25f9713c272599f9e5292dac" exitCode=0 Dec 04 23:01:04.248391 master-0 kubenswrapper[33572]: I1204 23:01:04.247537 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414821-bj9tw" event={"ID":"a97eedad-6712-4411-9c71-2c5f1187a374","Type":"ContainerDied","Data":"0db82c6dea084bb67fa0e9362db40375d06c748a25f9713c272599f9e5292dac"} Dec 04 23:01:05.769728 master-0 kubenswrapper[33572]: I1204 23:01:05.769643 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:05.877608 master-0 kubenswrapper[33572]: I1204 23:01:05.877541 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-config-data\") pod \"a97eedad-6712-4411-9c71-2c5f1187a374\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " Dec 04 23:01:05.877832 master-0 kubenswrapper[33572]: I1204 23:01:05.877802 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-combined-ca-bundle\") pod \"a97eedad-6712-4411-9c71-2c5f1187a374\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " Dec 04 23:01:05.877888 master-0 kubenswrapper[33572]: I1204 23:01:05.877836 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-fernet-keys\") pod \"a97eedad-6712-4411-9c71-2c5f1187a374\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " Dec 04 23:01:05.877968 master-0 kubenswrapper[33572]: I1204 23:01:05.877940 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwfx5\" (UniqueName: \"kubernetes.io/projected/a97eedad-6712-4411-9c71-2c5f1187a374-kube-api-access-qwfx5\") pod \"a97eedad-6712-4411-9c71-2c5f1187a374\" (UID: \"a97eedad-6712-4411-9c71-2c5f1187a374\") " Dec 04 23:01:05.880739 master-0 kubenswrapper[33572]: I1204 23:01:05.880675 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97eedad-6712-4411-9c71-2c5f1187a374-kube-api-access-qwfx5" (OuterVolumeSpecName: "kube-api-access-qwfx5") pod "a97eedad-6712-4411-9c71-2c5f1187a374" (UID: "a97eedad-6712-4411-9c71-2c5f1187a374"). InnerVolumeSpecName "kube-api-access-qwfx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 23:01:05.880890 master-0 kubenswrapper[33572]: I1204 23:01:05.880816 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a97eedad-6712-4411-9c71-2c5f1187a374" (UID: "a97eedad-6712-4411-9c71-2c5f1187a374"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 23:01:05.907813 master-0 kubenswrapper[33572]: I1204 23:01:05.907750 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a97eedad-6712-4411-9c71-2c5f1187a374" (UID: "a97eedad-6712-4411-9c71-2c5f1187a374"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 23:01:05.931658 master-0 kubenswrapper[33572]: I1204 23:01:05.931467 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-config-data" (OuterVolumeSpecName: "config-data") pod "a97eedad-6712-4411-9c71-2c5f1187a374" (UID: "a97eedad-6712-4411-9c71-2c5f1187a374"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 23:01:05.980284 master-0 kubenswrapper[33572]: I1204 23:01:05.980195 33572 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 04 23:01:05.980284 master-0 kubenswrapper[33572]: I1204 23:01:05.980238 33572 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-fernet-keys\") on node \"master-0\" DevicePath \"\"" Dec 04 23:01:05.980284 master-0 kubenswrapper[33572]: I1204 23:01:05.980248 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwfx5\" (UniqueName: \"kubernetes.io/projected/a97eedad-6712-4411-9c71-2c5f1187a374-kube-api-access-qwfx5\") on node \"master-0\" DevicePath \"\"" Dec 04 23:01:05.980284 master-0 kubenswrapper[33572]: I1204 23:01:05.980258 33572 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a97eedad-6712-4411-9c71-2c5f1187a374-config-data\") on node \"master-0\" DevicePath \"\"" Dec 04 23:01:06.291973 master-0 kubenswrapper[33572]: I1204 23:01:06.291887 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414821-bj9tw" event={"ID":"a97eedad-6712-4411-9c71-2c5f1187a374","Type":"ContainerDied","Data":"942c2ac49b627d3dbcfbc0ed6ccfca262fc8da4c1097f92bbc0bd4510ad5b010"} Dec 04 23:01:06.291973 master-0 kubenswrapper[33572]: I1204 23:01:06.291939 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="942c2ac49b627d3dbcfbc0ed6ccfca262fc8da4c1097f92bbc0bd4510ad5b010" Dec 04 23:01:06.292717 master-0 kubenswrapper[33572]: I1204 23:01:06.291994 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414821-bj9tw" Dec 04 23:01:38.168586 master-0 kubenswrapper[33572]: I1204 23:01:38.168480 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4x8tr"] Dec 04 23:01:38.169695 master-0 kubenswrapper[33572]: E1204 23:01:38.169649 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97eedad-6712-4411-9c71-2c5f1187a374" containerName="keystone-cron" Dec 04 23:01:38.169763 master-0 kubenswrapper[33572]: I1204 23:01:38.169702 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97eedad-6712-4411-9c71-2c5f1187a374" containerName="keystone-cron" Dec 04 23:01:38.170414 master-0 kubenswrapper[33572]: I1204 23:01:38.170376 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97eedad-6712-4411-9c71-2c5f1187a374" containerName="keystone-cron" Dec 04 23:01:38.174786 master-0 kubenswrapper[33572]: I1204 23:01:38.174737 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.181717 master-0 kubenswrapper[33572]: I1204 23:01:38.181643 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4x8tr"] Dec 04 23:01:38.228755 master-0 kubenswrapper[33572]: I1204 23:01:38.228683 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-catalog-content\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.228996 master-0 kubenswrapper[33572]: I1204 23:01:38.228969 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjfk4\" (UniqueName: \"kubernetes.io/projected/9e996f94-c149-4043-b4c0-d317e0a94198-kube-api-access-qjfk4\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.229537 master-0 kubenswrapper[33572]: I1204 23:01:38.229479 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-utilities\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.332895 master-0 kubenswrapper[33572]: I1204 23:01:38.332830 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-utilities\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.333221 master-0 kubenswrapper[33572]: I1204 23:01:38.333188 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-catalog-content\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.333408 master-0 kubenswrapper[33572]: I1204 23:01:38.333391 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qjfk4\" (UniqueName: \"kubernetes.io/projected/9e996f94-c149-4043-b4c0-d317e0a94198-kube-api-access-qjfk4\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.334012 master-0 kubenswrapper[33572]: I1204 23:01:38.333556 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-utilities\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.334126 master-0 kubenswrapper[33572]: I1204 23:01:38.333781 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-catalog-content\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.358026 master-0 kubenswrapper[33572]: I1204 23:01:38.357966 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjfk4\" (UniqueName: \"kubernetes.io/projected/9e996f94-c149-4043-b4c0-d317e0a94198-kube-api-access-qjfk4\") pod \"community-operators-4x8tr\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:38.549040 master-0 kubenswrapper[33572]: I1204 23:01:38.548977 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:39.144242 master-0 kubenswrapper[33572]: I1204 23:01:39.144185 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4x8tr"] Dec 04 23:01:39.760434 master-0 kubenswrapper[33572]: I1204 23:01:39.760284 33572 generic.go:334] "Generic (PLEG): container finished" podID="9e996f94-c149-4043-b4c0-d317e0a94198" containerID="cf3893f39c63bbf87bae8f753fdb061a5fe72ae8d4da1e1f296579d383fe403f" exitCode=0 Dec 04 23:01:39.760434 master-0 kubenswrapper[33572]: I1204 23:01:39.760378 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x8tr" event={"ID":"9e996f94-c149-4043-b4c0-d317e0a94198","Type":"ContainerDied","Data":"cf3893f39c63bbf87bae8f753fdb061a5fe72ae8d4da1e1f296579d383fe403f"} Dec 04 23:01:39.761076 master-0 kubenswrapper[33572]: I1204 23:01:39.760434 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x8tr" event={"ID":"9e996f94-c149-4043-b4c0-d317e0a94198","Type":"ContainerStarted","Data":"2681e9ea2a4c952c1ac4c2e656795a72efd75761afa0308eac5b60164331f47b"} Dec 04 23:01:45.845549 master-0 kubenswrapper[33572]: I1204 23:01:45.845318 33572 generic.go:334] "Generic (PLEG): container finished" podID="9e996f94-c149-4043-b4c0-d317e0a94198" containerID="419b5d80fd2de7fe05f566e46d091262db87346b9f19ccd417a9b2570dd3661d" exitCode=0 Dec 04 23:01:45.845549 master-0 kubenswrapper[33572]: I1204 23:01:45.845392 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x8tr" event={"ID":"9e996f94-c149-4043-b4c0-d317e0a94198","Type":"ContainerDied","Data":"419b5d80fd2de7fe05f566e46d091262db87346b9f19ccd417a9b2570dd3661d"} Dec 04 23:01:46.864723 master-0 kubenswrapper[33572]: I1204 23:01:46.864652 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-4x8tr" event={"ID":"9e996f94-c149-4043-b4c0-d317e0a94198","Type":"ContainerStarted","Data":"9ac72c068d7dd35dfcb2215d8eafc74e46faa400fd2229de663dbf8c355697b1"} Dec 04 23:01:46.912769 master-0 kubenswrapper[33572]: I1204 23:01:46.912674 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4x8tr" podStartSLOduration=2.38341388 podStartE2EDuration="8.912647034s" podCreationTimestamp="2025-12-04 23:01:38 +0000 UTC" firstStartedPulling="2025-12-04 23:01:39.763132882 +0000 UTC m=+2563.490658571" lastFinishedPulling="2025-12-04 23:01:46.292366066 +0000 UTC m=+2570.019891725" observedRunningTime="2025-12-04 23:01:46.88953417 +0000 UTC m=+2570.617059849" watchObservedRunningTime="2025-12-04 23:01:46.912647034 +0000 UTC m=+2570.640172703" Dec 04 23:01:48.577100 master-0 kubenswrapper[33572]: I1204 23:01:48.549645 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:48.578718 master-0 kubenswrapper[33572]: I1204 23:01:48.578045 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:49.634405 master-0 kubenswrapper[33572]: I1204 23:01:49.634210 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4x8tr" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="registry-server" probeResult="failure" output=< Dec 04 23:01:49.634405 master-0 kubenswrapper[33572]: timeout: failed to connect service ":50051" within 1s Dec 04 23:01:49.634405 master-0 kubenswrapper[33572]: > Dec 04 23:01:56.650953 master-0 kubenswrapper[33572]: I1204 23:01:56.646347 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qzdc4"] Dec 04 23:01:56.650953 master-0 kubenswrapper[33572]: I1204 23:01:56.649810 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.690486 master-0 kubenswrapper[33572]: I1204 23:01:56.687046 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-catalog-content\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.690486 master-0 kubenswrapper[33572]: I1204 23:01:56.687137 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-utilities\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.690486 master-0 kubenswrapper[33572]: I1204 23:01:56.687477 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zh6g\" (UniqueName: \"kubernetes.io/projected/2e19aff9-6c1d-431f-8303-2bbcbc44d685-kube-api-access-5zh6g\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.690486 master-0 kubenswrapper[33572]: I1204 23:01:56.690265 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzdc4"] Dec 04 23:01:56.790086 master-0 kubenswrapper[33572]: I1204 23:01:56.790011 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zh6g\" (UniqueName: \"kubernetes.io/projected/2e19aff9-6c1d-431f-8303-2bbcbc44d685-kube-api-access-5zh6g\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.790325 master-0 kubenswrapper[33572]: I1204 23:01:56.790156 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-catalog-content\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.790325 master-0 kubenswrapper[33572]: I1204 23:01:56.790179 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-utilities\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.790829 master-0 kubenswrapper[33572]: I1204 23:01:56.790782 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-utilities\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.793397 master-0 kubenswrapper[33572]: I1204 23:01:56.792772 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-catalog-content\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " 
pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:56.814167 master-0 kubenswrapper[33572]: I1204 23:01:56.814112 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zh6g\" (UniqueName: \"kubernetes.io/projected/2e19aff9-6c1d-431f-8303-2bbcbc44d685-kube-api-access-5zh6g\") pod \"redhat-marketplace-qzdc4\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:57.007088 master-0 kubenswrapper[33572]: I1204 23:01:57.006990 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:01:57.496860 master-0 kubenswrapper[33572]: I1204 23:01:57.496807 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzdc4"] Dec 04 23:01:58.029310 master-0 kubenswrapper[33572]: I1204 23:01:58.029220 33572 generic.go:334] "Generic (PLEG): container finished" podID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerID="f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae" exitCode=0 Dec 04 23:01:58.030159 master-0 kubenswrapper[33572]: I1204 23:01:58.029316 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzdc4" event={"ID":"2e19aff9-6c1d-431f-8303-2bbcbc44d685","Type":"ContainerDied","Data":"f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae"} Dec 04 23:01:58.030159 master-0 kubenswrapper[33572]: I1204 23:01:58.029367 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzdc4" event={"ID":"2e19aff9-6c1d-431f-8303-2bbcbc44d685","Type":"ContainerStarted","Data":"6ef867bda6a8abc02d192a51ea1648b732cdd33f4073195bf46ef9aba8a29331"} Dec 04 23:01:58.620100 master-0 kubenswrapper[33572]: I1204 23:01:58.619954 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:58.701413 master-0 kubenswrapper[33572]: I1204 23:01:58.701337 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:01:59.042570 master-0 kubenswrapper[33572]: I1204 23:01:59.042497 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzdc4" event={"ID":"2e19aff9-6c1d-431f-8303-2bbcbc44d685","Type":"ContainerStarted","Data":"b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93"} Dec 04 23:02:00.091731 master-0 kubenswrapper[33572]: I1204 23:02:00.090658 33572 generic.go:334] "Generic (PLEG): container finished" podID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerID="b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93" exitCode=0 Dec 04 23:02:00.091731 master-0 kubenswrapper[33572]: I1204 23:02:00.090731 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzdc4" event={"ID":"2e19aff9-6c1d-431f-8303-2bbcbc44d685","Type":"ContainerDied","Data":"b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93"} Dec 04 23:02:00.893474 master-0 kubenswrapper[33572]: I1204 23:02:00.893370 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4x8tr"] Dec 04 23:02:00.893953 master-0 kubenswrapper[33572]: I1204 23:02:00.893891 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4x8tr" 
podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="registry-server" containerID="cri-o://9ac72c068d7dd35dfcb2215d8eafc74e46faa400fd2229de663dbf8c355697b1" gracePeriod=2 Dec 04 23:02:01.118035 master-0 kubenswrapper[33572]: I1204 23:02:01.117963 33572 generic.go:334] "Generic (PLEG): container finished" podID="9e996f94-c149-4043-b4c0-d317e0a94198" containerID="9ac72c068d7dd35dfcb2215d8eafc74e46faa400fd2229de663dbf8c355697b1" exitCode=0 Dec 04 23:02:01.118730 master-0 kubenswrapper[33572]: I1204 23:02:01.118054 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x8tr" event={"ID":"9e996f94-c149-4043-b4c0-d317e0a94198","Type":"ContainerDied","Data":"9ac72c068d7dd35dfcb2215d8eafc74e46faa400fd2229de663dbf8c355697b1"} Dec 04 23:02:01.122535 master-0 kubenswrapper[33572]: I1204 23:02:01.121917 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzdc4" event={"ID":"2e19aff9-6c1d-431f-8303-2bbcbc44d685","Type":"ContainerStarted","Data":"e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60"} Dec 04 23:02:01.170392 master-0 kubenswrapper[33572]: I1204 23:02:01.170231 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qzdc4" podStartSLOduration=2.7218027019999997 podStartE2EDuration="5.170191836s" podCreationTimestamp="2025-12-04 23:01:56 +0000 UTC" firstStartedPulling="2025-12-04 23:01:58.033400842 +0000 UTC m=+2581.760926501" lastFinishedPulling="2025-12-04 23:02:00.481789976 +0000 UTC m=+2584.209315635" observedRunningTime="2025-12-04 23:02:01.155998939 +0000 UTC m=+2584.883524608" watchObservedRunningTime="2025-12-04 23:02:01.170191836 +0000 UTC m=+2584.897717535" Dec 04 23:02:01.547758 master-0 kubenswrapper[33572]: I1204 23:02:01.547651 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:02:01.653437 master-0 kubenswrapper[33572]: I1204 23:02:01.653340 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjfk4\" (UniqueName: \"kubernetes.io/projected/9e996f94-c149-4043-b4c0-d317e0a94198-kube-api-access-qjfk4\") pod \"9e996f94-c149-4043-b4c0-d317e0a94198\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " Dec 04 23:02:01.653437 master-0 kubenswrapper[33572]: I1204 23:02:01.653442 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-utilities\") pod \"9e996f94-c149-4043-b4c0-d317e0a94198\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " Dec 04 23:02:01.653731 master-0 kubenswrapper[33572]: I1204 23:02:01.653674 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-catalog-content\") pod \"9e996f94-c149-4043-b4c0-d317e0a94198\" (UID: \"9e996f94-c149-4043-b4c0-d317e0a94198\") " Dec 04 23:02:01.666826 master-0 kubenswrapper[33572]: I1204 23:02:01.666778 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-utilities" (OuterVolumeSpecName: "utilities") pod "9e996f94-c149-4043-b4c0-d317e0a94198" (UID: "9e996f94-c149-4043-b4c0-d317e0a94198"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 23:02:01.705181 master-0 kubenswrapper[33572]: I1204 23:02:01.680908 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e996f94-c149-4043-b4c0-d317e0a94198-kube-api-access-qjfk4" (OuterVolumeSpecName: "kube-api-access-qjfk4") pod "9e996f94-c149-4043-b4c0-d317e0a94198" (UID: "9e996f94-c149-4043-b4c0-d317e0a94198"). InnerVolumeSpecName "kube-api-access-qjfk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 23:02:01.725555 master-0 kubenswrapper[33572]: I1204 23:02:01.722967 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e996f94-c149-4043-b4c0-d317e0a94198" (UID: "9e996f94-c149-4043-b4c0-d317e0a94198"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 23:02:01.765542 master-0 kubenswrapper[33572]: I1204 23:02:01.764862 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 23:02:01.765542 master-0 kubenswrapper[33572]: I1204 23:02:01.764913 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjfk4\" (UniqueName: \"kubernetes.io/projected/9e996f94-c149-4043-b4c0-d317e0a94198-kube-api-access-qjfk4\") on node \"master-0\" DevicePath \"\"" Dec 04 23:02:01.765542 master-0 kubenswrapper[33572]: I1204 23:02:01.764928 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e996f94-c149-4043-b4c0-d317e0a94198-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 23:02:02.140591 master-0 kubenswrapper[33572]: I1204 23:02:02.137569 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4x8tr" Dec 04 23:02:02.140591 master-0 kubenswrapper[33572]: I1204 23:02:02.137632 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4x8tr" event={"ID":"9e996f94-c149-4043-b4c0-d317e0a94198","Type":"ContainerDied","Data":"2681e9ea2a4c952c1ac4c2e656795a72efd75761afa0308eac5b60164331f47b"} Dec 04 23:02:02.140591 master-0 kubenswrapper[33572]: I1204 23:02:02.137678 33572 scope.go:117] "RemoveContainer" containerID="9ac72c068d7dd35dfcb2215d8eafc74e46faa400fd2229de663dbf8c355697b1" Dec 04 23:02:02.177900 master-0 kubenswrapper[33572]: I1204 23:02:02.177798 33572 scope.go:117] "RemoveContainer" containerID="419b5d80fd2de7fe05f566e46d091262db87346b9f19ccd417a9b2570dd3661d" Dec 04 23:02:02.202947 master-0 kubenswrapper[33572]: I1204 23:02:02.202895 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4x8tr"] Dec 04 23:02:02.224978 master-0 kubenswrapper[33572]: I1204 23:02:02.224904 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4x8tr"] Dec 04 23:02:02.227437 master-0 kubenswrapper[33572]: I1204 23:02:02.227402 33572 scope.go:117] "RemoveContainer" containerID="cf3893f39c63bbf87bae8f753fdb061a5fe72ae8d4da1e1f296579d383fe403f" Dec 04 23:02:02.548360 master-0 kubenswrapper[33572]: I1204 23:02:02.548246 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" path="/var/lib/kubelet/pods/9e996f94-c149-4043-b4c0-d317e0a94198/volumes" Dec 04 23:02:07.007755 master-0 kubenswrapper[33572]: I1204 23:02:07.007617 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:02:07.007755 master-0 kubenswrapper[33572]: I1204 23:02:07.007743 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:02:07.097733 master-0 kubenswrapper[33572]: I1204 23:02:07.097657 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:02:07.308480 master-0 kubenswrapper[33572]: I1204 23:02:07.308397 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:02:08.297257 master-0 kubenswrapper[33572]: I1204 23:02:08.297156 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzdc4"] Dec 04 23:02:09.265273 master-0 kubenswrapper[33572]: I1204 23:02:09.265125 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qzdc4" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerName="registry-server" containerID="cri-o://e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60" gracePeriod=2 Dec 04 23:02:09.932320 master-0 kubenswrapper[33572]: I1204 23:02:09.932101 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:02:09.971665 master-0 kubenswrapper[33572]: I1204 23:02:09.971574 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-catalog-content\") pod \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " Dec 04 23:02:09.971900 master-0 kubenswrapper[33572]: I1204 23:02:09.971718 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zh6g\" (UniqueName: \"kubernetes.io/projected/2e19aff9-6c1d-431f-8303-2bbcbc44d685-kube-api-access-5zh6g\") pod \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " Dec 04 23:02:09.971900 master-0 kubenswrapper[33572]: I1204 23:02:09.971753 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-utilities\") pod \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\" (UID: \"2e19aff9-6c1d-431f-8303-2bbcbc44d685\") " Dec 04 23:02:09.998253 master-0 kubenswrapper[33572]: I1204 23:02:09.997791 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e19aff9-6c1d-431f-8303-2bbcbc44d685-kube-api-access-5zh6g" (OuterVolumeSpecName: "kube-api-access-5zh6g") pod "2e19aff9-6c1d-431f-8303-2bbcbc44d685" (UID: "2e19aff9-6c1d-431f-8303-2bbcbc44d685"). InnerVolumeSpecName "kube-api-access-5zh6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 23:02:09.998849 master-0 kubenswrapper[33572]: I1204 23:02:09.998650 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-utilities" (OuterVolumeSpecName: "utilities") pod "2e19aff9-6c1d-431f-8303-2bbcbc44d685" (UID: "2e19aff9-6c1d-431f-8303-2bbcbc44d685"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 23:02:10.024514 master-0 kubenswrapper[33572]: I1204 23:02:10.024418 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e19aff9-6c1d-431f-8303-2bbcbc44d685" (UID: "2e19aff9-6c1d-431f-8303-2bbcbc44d685"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 23:02:10.080665 master-0 kubenswrapper[33572]: I1204 23:02:10.076283 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zh6g\" (UniqueName: \"kubernetes.io/projected/2e19aff9-6c1d-431f-8303-2bbcbc44d685-kube-api-access-5zh6g\") on node \"master-0\" DevicePath \"\"" Dec 04 23:02:10.080665 master-0 kubenswrapper[33572]: I1204 23:02:10.076361 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 23:02:10.080665 master-0 kubenswrapper[33572]: I1204 23:02:10.076377 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e19aff9-6c1d-431f-8303-2bbcbc44d685-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 23:02:10.288265 master-0 kubenswrapper[33572]: I1204 23:02:10.288135 33572 generic.go:334] "Generic (PLEG): container finished" podID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerID="e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60" exitCode=0 Dec 04 23:02:10.288265 master-0 kubenswrapper[33572]: I1204 23:02:10.288238 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzdc4" event={"ID":"2e19aff9-6c1d-431f-8303-2bbcbc44d685","Type":"ContainerDied","Data":"e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60"} Dec 04 23:02:10.288703 master-0 kubenswrapper[33572]: I1204 23:02:10.288291 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzdc4" event={"ID":"2e19aff9-6c1d-431f-8303-2bbcbc44d685","Type":"ContainerDied","Data":"6ef867bda6a8abc02d192a51ea1648b732cdd33f4073195bf46ef9aba8a29331"} Dec 04 23:02:10.288703 master-0 kubenswrapper[33572]: I1204 23:02:10.288330 33572 scope.go:117] "RemoveContainer" containerID="e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60" Dec 04 23:02:10.288703 master-0 kubenswrapper[33572]: I1204 23:02:10.288694 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzdc4" Dec 04 23:02:10.340261 master-0 kubenswrapper[33572]: I1204 23:02:10.340176 33572 scope.go:117] "RemoveContainer" containerID="b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93" Dec 04 23:02:10.365645 master-0 kubenswrapper[33572]: I1204 23:02:10.365481 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzdc4"] Dec 04 23:02:10.385916 master-0 kubenswrapper[33572]: I1204 23:02:10.385836 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzdc4"] Dec 04 23:02:10.390883 master-0 kubenswrapper[33572]: I1204 23:02:10.390837 33572 scope.go:117] "RemoveContainer" containerID="f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae" Dec 04 23:02:10.424132 master-0 kubenswrapper[33572]: I1204 23:02:10.424078 33572 scope.go:117] "RemoveContainer" containerID="e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60" Dec 04 23:02:10.424691 master-0 kubenswrapper[33572]: E1204 23:02:10.424644 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60\": container with ID starting with e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60 not found: ID does not exist" containerID="e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60" Dec 04 23:02:10.424803 master-0 kubenswrapper[33572]: I1204 23:02:10.424690 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60"} err="failed to get container status \"e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60\": rpc error: code = NotFound desc = could not find container \"e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60\": container with ID starting with e8081c5074b76045eff2b5e886af3864a3f4a7ee72cb0c531f2d543d55b32d60 not found: ID does not exist" Dec 04 23:02:10.424803 master-0 kubenswrapper[33572]: I1204 23:02:10.424719 33572 scope.go:117] "RemoveContainer" containerID="b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93" Dec 04 23:02:10.425286 master-0 kubenswrapper[33572]: E1204 23:02:10.425246 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93\": container with ID starting with b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93 not found: ID does not exist" containerID="b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93" Dec 04 23:02:10.425286 master-0 kubenswrapper[33572]: I1204 23:02:10.425278 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93"} err="failed to get container status \"b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93\": rpc error: code = NotFound desc = could not find container \"b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93\": container with ID starting with b55d98fb68cff719a233e0bc58eca5d8b4f3d0b2df1c6ce5f22e8625ac5e8f93 not found: ID does not exist" Dec 04 23:02:10.425490 master-0 kubenswrapper[33572]: I1204 23:02:10.425298 33572 scope.go:117] "RemoveContainer" 
containerID="f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae" Dec 04 23:02:10.425648 master-0 kubenswrapper[33572]: E1204 23:02:10.425600 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae\": container with ID starting with f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae not found: ID does not exist" containerID="f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae" Dec 04 23:02:10.425648 master-0 kubenswrapper[33572]: I1204 23:02:10.425638 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae"} err="failed to get container status \"f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae\": rpc error: code = NotFound desc = could not find container \"f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae\": container with ID starting with f1a8d8f41c64dca44a8322a60075f91bc564fa5a2264f3b8b01bac00f3a29cae not found: ID does not exist" Dec 04 23:02:10.550184 master-0 kubenswrapper[33572]: I1204 23:02:10.548565 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" path="/var/lib/kubelet/pods/2e19aff9-6c1d-431f-8303-2bbcbc44d685/volumes" Dec 04 23:02:33.295470 master-0 kubenswrapper[33572]: I1204 23:02:33.295388 33572 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-d974c476-m8fbn" podUID="2280c7c8-99f9-44e4-be67-358609c9c7d8" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 04 23:07:13.169718 master-0 kubenswrapper[33572]: I1204 23:07:13.169605 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hwtbv"] Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: E1204 23:07:13.170189 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerName="registry-server" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: I1204 23:07:13.170207 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerName="registry-server" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: E1204 23:07:13.170233 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="extract-utilities" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: I1204 23:07:13.170243 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="extract-utilities" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: E1204 23:07:13.170268 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerName="extract-utilities" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: I1204 23:07:13.170277 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerName="extract-utilities" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: E1204 23:07:13.170292 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="registry-server" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: I1204 23:07:13.170301 33572 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="registry-server" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: E1204 23:07:13.170316 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="extract-content" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: I1204 23:07:13.170325 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="extract-content" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: E1204 23:07:13.170364 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerName="extract-content" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: I1204 23:07:13.170373 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerName="extract-content" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: I1204 23:07:13.170662 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e19aff9-6c1d-431f-8303-2bbcbc44d685" containerName="registry-server" Dec 04 23:07:13.170747 master-0 kubenswrapper[33572]: I1204 23:07:13.170741 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e996f94-c149-4043-b4c0-d317e0a94198" containerName="registry-server" Dec 04 23:07:13.172928 master-0 kubenswrapper[33572]: I1204 23:07:13.172878 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwtbv"] Dec 04 23:07:13.173040 master-0 kubenswrapper[33572]: I1204 23:07:13.172992 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.300180 master-0 kubenswrapper[33572]: I1204 23:07:13.300121 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-catalog-content\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.300526 master-0 kubenswrapper[33572]: I1204 23:07:13.300318 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-utilities\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.300526 master-0 kubenswrapper[33572]: I1204 23:07:13.300397 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2nph\" (UniqueName: \"kubernetes.io/projected/de98ac70-996d-42ab-a1ef-3888260fb293-kube-api-access-w2nph\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.402860 master-0 kubenswrapper[33572]: I1204 23:07:13.402793 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2nph\" (UniqueName: \"kubernetes.io/projected/de98ac70-996d-42ab-a1ef-3888260fb293-kube-api-access-w2nph\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.403102 master-0 kubenswrapper[33572]: I1204 23:07:13.402929 33572 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-catalog-content\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.403102 master-0 kubenswrapper[33572]: I1204 23:07:13.403028 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-utilities\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.403559 master-0 kubenswrapper[33572]: I1204 23:07:13.403533 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-utilities\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.403979 master-0 kubenswrapper[33572]: I1204 23:07:13.403911 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-catalog-content\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.420207 master-0 kubenswrapper[33572]: I1204 23:07:13.419801 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2nph\" (UniqueName: \"kubernetes.io/projected/de98ac70-996d-42ab-a1ef-3888260fb293-kube-api-access-w2nph\") pod \"redhat-operators-hwtbv\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:13.559839 master-0 kubenswrapper[33572]: I1204 23:07:13.559757 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:14.023849 master-0 kubenswrapper[33572]: W1204 23:07:14.023785 33572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde98ac70_996d_42ab_a1ef_3888260fb293.slice/crio-bbc08cc0da941f68bbc05220c63b7ac1b2eedbdb6f752630df039b35ffc7bc95 WatchSource:0}: Error finding container bbc08cc0da941f68bbc05220c63b7ac1b2eedbdb6f752630df039b35ffc7bc95: Status 404 returned error can't find the container with id bbc08cc0da941f68bbc05220c63b7ac1b2eedbdb6f752630df039b35ffc7bc95 Dec 04 23:07:14.025566 master-0 kubenswrapper[33572]: I1204 23:07:14.025467 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwtbv"] Dec 04 23:07:14.895589 master-0 kubenswrapper[33572]: I1204 23:07:14.895469 33572 generic.go:334] "Generic (PLEG): container finished" podID="de98ac70-996d-42ab-a1ef-3888260fb293" containerID="1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6" exitCode=0 Dec 04 23:07:14.895589 master-0 kubenswrapper[33572]: I1204 23:07:14.895557 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwtbv" event={"ID":"de98ac70-996d-42ab-a1ef-3888260fb293","Type":"ContainerDied","Data":"1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6"} Dec 04 23:07:14.896454 master-0 kubenswrapper[33572]: I1204 23:07:14.895628 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwtbv" event={"ID":"de98ac70-996d-42ab-a1ef-3888260fb293","Type":"ContainerStarted","Data":"bbc08cc0da941f68bbc05220c63b7ac1b2eedbdb6f752630df039b35ffc7bc95"} Dec 04 23:07:14.899074 master-0 kubenswrapper[33572]: I1204 23:07:14.899028 33572 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 23:07:16.927306 master-0 kubenswrapper[33572]: I1204 23:07:16.927211 33572 generic.go:334] "Generic (PLEG): container finished" podID="de98ac70-996d-42ab-a1ef-3888260fb293" containerID="e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab" exitCode=0 Dec 04 23:07:16.928073 master-0 kubenswrapper[33572]: I1204 23:07:16.927308 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwtbv" event={"ID":"de98ac70-996d-42ab-a1ef-3888260fb293","Type":"ContainerDied","Data":"e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab"} Dec 04 23:07:18.959403 master-0 kubenswrapper[33572]: I1204 23:07:18.959239 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwtbv" event={"ID":"de98ac70-996d-42ab-a1ef-3888260fb293","Type":"ContainerStarted","Data":"c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b"} Dec 04 23:07:18.990288 master-0 kubenswrapper[33572]: I1204 23:07:18.989994 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hwtbv" podStartSLOduration=3.459321981 podStartE2EDuration="5.989973301s" podCreationTimestamp="2025-12-04 23:07:13 +0000 UTC" firstStartedPulling="2025-12-04 23:07:14.898904365 +0000 UTC m=+2898.626430024" lastFinishedPulling="2025-12-04 23:07:17.429555655 +0000 UTC m=+2901.157081344" observedRunningTime="2025-12-04 23:07:18.986638991 +0000 UTC m=+2902.714164760" watchObservedRunningTime="2025-12-04 23:07:18.989973301 +0000 UTC m=+2902.717498960" Dec 04 23:07:23.564355 master-0 
kubenswrapper[33572]: I1204 23:07:23.560713 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:23.564355 master-0 kubenswrapper[33572]: I1204 23:07:23.560793 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:24.648078 master-0 kubenswrapper[33572]: I1204 23:07:24.647976 33572 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hwtbv" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="registry-server" probeResult="failure" output=< Dec 04 23:07:24.648078 master-0 kubenswrapper[33572]: timeout: failed to connect service ":50051" within 1s Dec 04 23:07:24.648078 master-0 kubenswrapper[33572]: > Dec 04 23:07:33.617289 master-0 kubenswrapper[33572]: I1204 23:07:33.617217 33572 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:33.716586 master-0 kubenswrapper[33572]: I1204 23:07:33.716489 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:33.872838 master-0 kubenswrapper[33572]: I1204 23:07:33.872682 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwtbv"] Dec 04 23:07:35.235303 master-0 kubenswrapper[33572]: I1204 23:07:35.235224 33572 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hwtbv" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="registry-server" containerID="cri-o://c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b" gracePeriod=2 Dec 04 23:07:35.876584 master-0 kubenswrapper[33572]: I1204 23:07:35.876469 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:35.995983 master-0 kubenswrapper[33572]: I1204 23:07:35.995920 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-catalog-content\") pod \"de98ac70-996d-42ab-a1ef-3888260fb293\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " Dec 04 23:07:35.996204 master-0 kubenswrapper[33572]: I1204 23:07:35.995999 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2nph\" (UniqueName: \"kubernetes.io/projected/de98ac70-996d-42ab-a1ef-3888260fb293-kube-api-access-w2nph\") pod \"de98ac70-996d-42ab-a1ef-3888260fb293\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " Dec 04 23:07:35.996204 master-0 kubenswrapper[33572]: I1204 23:07:35.996080 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-utilities\") pod \"de98ac70-996d-42ab-a1ef-3888260fb293\" (UID: \"de98ac70-996d-42ab-a1ef-3888260fb293\") " Dec 04 23:07:35.997620 master-0 kubenswrapper[33572]: I1204 23:07:35.997592 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-utilities" (OuterVolumeSpecName: "utilities") pod "de98ac70-996d-42ab-a1ef-3888260fb293" (UID: "de98ac70-996d-42ab-a1ef-3888260fb293"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 23:07:35.997885 master-0 kubenswrapper[33572]: I1204 23:07:35.997862 33572 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-utilities\") on node \"master-0\" DevicePath \"\"" Dec 04 23:07:36.005992 master-0 kubenswrapper[33572]: I1204 23:07:36.005928 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de98ac70-996d-42ab-a1ef-3888260fb293-kube-api-access-w2nph" (OuterVolumeSpecName: "kube-api-access-w2nph") pod "de98ac70-996d-42ab-a1ef-3888260fb293" (UID: "de98ac70-996d-42ab-a1ef-3888260fb293"). InnerVolumeSpecName "kube-api-access-w2nph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 23:07:36.100058 master-0 kubenswrapper[33572]: I1204 23:07:36.100010 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2nph\" (UniqueName: \"kubernetes.io/projected/de98ac70-996d-42ab-a1ef-3888260fb293-kube-api-access-w2nph\") on node \"master-0\" DevicePath \"\"" Dec 04 23:07:36.130980 master-0 kubenswrapper[33572]: I1204 23:07:36.130915 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de98ac70-996d-42ab-a1ef-3888260fb293" (UID: "de98ac70-996d-42ab-a1ef-3888260fb293"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 23:07:36.202884 master-0 kubenswrapper[33572]: I1204 23:07:36.202822 33572 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de98ac70-996d-42ab-a1ef-3888260fb293-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 04 23:07:36.249186 master-0 kubenswrapper[33572]: I1204 23:07:36.249102 33572 generic.go:334] "Generic (PLEG): container finished" podID="de98ac70-996d-42ab-a1ef-3888260fb293" containerID="c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b" exitCode=0 Dec 04 23:07:36.249186 master-0 kubenswrapper[33572]: I1204 23:07:36.249184 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwtbv" event={"ID":"de98ac70-996d-42ab-a1ef-3888260fb293","Type":"ContainerDied","Data":"c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b"} Dec 04 23:07:36.249925 master-0 kubenswrapper[33572]: I1204 23:07:36.249233 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwtbv" event={"ID":"de98ac70-996d-42ab-a1ef-3888260fb293","Type":"ContainerDied","Data":"bbc08cc0da941f68bbc05220c63b7ac1b2eedbdb6f752630df039b35ffc7bc95"} Dec 04 23:07:36.249925 master-0 kubenswrapper[33572]: I1204 23:07:36.249264 33572 scope.go:117] "RemoveContainer" containerID="c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b" Dec 04 23:07:36.249925 master-0 kubenswrapper[33572]: I1204 23:07:36.249535 33572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hwtbv" Dec 04 23:07:36.292064 master-0 kubenswrapper[33572]: I1204 23:07:36.291568 33572 scope.go:117] "RemoveContainer" containerID="e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab" Dec 04 23:07:36.318632 master-0 kubenswrapper[33572]: I1204 23:07:36.318572 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwtbv"] Dec 04 23:07:36.327963 master-0 kubenswrapper[33572]: I1204 23:07:36.327798 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hwtbv"] Dec 04 23:07:36.340634 master-0 kubenswrapper[33572]: I1204 23:07:36.340573 33572 scope.go:117] "RemoveContainer" containerID="1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6" Dec 04 23:07:36.401290 master-0 kubenswrapper[33572]: I1204 23:07:36.401228 33572 scope.go:117] "RemoveContainer" containerID="c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b" Dec 04 23:07:36.402347 master-0 kubenswrapper[33572]: E1204 23:07:36.402214 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b\": container with ID starting with c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b not found: ID does not exist" containerID="c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b" Dec 04 23:07:36.402420 master-0 kubenswrapper[33572]: I1204 23:07:36.402363 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b"} err="failed to get container status \"c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b\": rpc error: code = NotFound desc = could not find container \"c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b\": container with ID starting with c346b3c6ba7710439929b407d1c7341a81aa131249c55e59684ab057dcbb084b not found: ID does not exist" Dec 04 23:07:36.402420 master-0 kubenswrapper[33572]: I1204 23:07:36.402409 33572 scope.go:117] "RemoveContainer" containerID="e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab" Dec 04 23:07:36.403206 master-0 kubenswrapper[33572]: E1204 23:07:36.403131 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab\": container with ID starting with e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab not found: ID does not exist" containerID="e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab" Dec 04 23:07:36.403270 master-0 kubenswrapper[33572]: I1204 23:07:36.403190 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab"} err="failed to get container status \"e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab\": rpc error: code = NotFound desc = could not find container \"e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab\": container with ID starting with e325f6752317697ceed9c25e92cba7534c79729ab5469069080f405953d894ab not found: ID does not exist" Dec 04 23:07:36.403270 master-0 kubenswrapper[33572]: I1204 23:07:36.403223 33572 scope.go:117] "RemoveContainer" 
containerID="1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6" Dec 04 23:07:36.404522 master-0 kubenswrapper[33572]: E1204 23:07:36.404448 33572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6\": container with ID starting with 1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6 not found: ID does not exist" containerID="1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6" Dec 04 23:07:36.404607 master-0 kubenswrapper[33572]: I1204 23:07:36.404520 33572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6"} err="failed to get container status \"1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6\": rpc error: code = NotFound desc = could not find container \"1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6\": container with ID starting with 1ecf5b3dbea5f7a7e202bf837844ac2ee1c8e01e2de028d8278823a516372cd6 not found: ID does not exist" Dec 04 23:07:36.545774 master-0 kubenswrapper[33572]: I1204 23:07:36.545669 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" path="/var/lib/kubelet/pods/de98ac70-996d-42ab-a1ef-3888260fb293/volumes" Dec 04 23:10:06.233116 master-0 kubenswrapper[33572]: I1204 23:10:06.233024 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwzxr/must-gather-s7k85"] Dec 04 23:10:06.233986 master-0 kubenswrapper[33572]: E1204 23:10:06.233941 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="extract-content" Dec 04 23:10:06.233986 master-0 kubenswrapper[33572]: I1204 23:10:06.233977 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="extract-content" Dec 04 23:10:06.234105 master-0 kubenswrapper[33572]: E1204 23:10:06.234032 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="registry-server" Dec 04 23:10:06.234105 master-0 kubenswrapper[33572]: I1204 23:10:06.234047 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="registry-server" Dec 04 23:10:06.234105 master-0 kubenswrapper[33572]: E1204 23:10:06.234083 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="extract-utilities" Dec 04 23:10:06.234105 master-0 kubenswrapper[33572]: I1204 23:10:06.234101 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="extract-utilities" Dec 04 23:10:06.234645 master-0 kubenswrapper[33572]: I1204 23:10:06.234599 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="de98ac70-996d-42ab-a1ef-3888260fb293" containerName="registry-server" Dec 04 23:10:06.237321 master-0 kubenswrapper[33572]: I1204 23:10:06.237236 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwzxr/must-gather-s7k85" Dec 04 23:10:06.239480 master-0 kubenswrapper[33572]: I1204 23:10:06.239401 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pwzxr"/"openshift-service-ca.crt" Dec 04 23:10:06.239680 master-0 kubenswrapper[33572]: I1204 23:10:06.239626 33572 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pwzxr"/"kube-root-ca.crt" Dec 04 23:10:06.245253 master-0 kubenswrapper[33572]: I1204 23:10:06.245170 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwzxr/must-gather-mr4ch"] Dec 04 23:10:06.248711 master-0 kubenswrapper[33572]: I1204 23:10:06.248651 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwzxr/must-gather-mr4ch" Dec 04 23:10:06.258271 master-0 kubenswrapper[33572]: I1204 23:10:06.258184 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pwzxr/must-gather-s7k85"] Dec 04 23:10:06.312153 master-0 kubenswrapper[33572]: I1204 23:10:06.304742 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pwzxr/must-gather-mr4ch"] Dec 04 23:10:06.312153 master-0 kubenswrapper[33572]: I1204 23:10:06.311621 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/87336090-08d8-4bbc-abe9-ae767dda0838-must-gather-output\") pod \"must-gather-mr4ch\" (UID: \"87336090-08d8-4bbc-abe9-ae767dda0838\") " pod="openshift-must-gather-pwzxr/must-gather-mr4ch" Dec 04 23:10:06.312153 master-0 kubenswrapper[33572]: I1204 23:10:06.311721 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6fjc\" (UniqueName: \"kubernetes.io/projected/87336090-08d8-4bbc-abe9-ae767dda0838-kube-api-access-l6fjc\") pod \"must-gather-mr4ch\" (UID: \"87336090-08d8-4bbc-abe9-ae767dda0838\") " pod="openshift-must-gather-pwzxr/must-gather-mr4ch" Dec 04 23:10:06.312153 master-0 kubenswrapper[33572]: I1204 23:10:06.311960 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ad607ea-3069-4a8b-816a-3256f11227d8-must-gather-output\") pod \"must-gather-s7k85\" (UID: \"2ad607ea-3069-4a8b-816a-3256f11227d8\") " pod="openshift-must-gather-pwzxr/must-gather-s7k85" Dec 04 23:10:06.312153 master-0 kubenswrapper[33572]: I1204 23:10:06.312038 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbzsn\" (UniqueName: \"kubernetes.io/projected/2ad607ea-3069-4a8b-816a-3256f11227d8-kube-api-access-qbzsn\") pod \"must-gather-s7k85\" (UID: \"2ad607ea-3069-4a8b-816a-3256f11227d8\") " pod="openshift-must-gather-pwzxr/must-gather-s7k85" Dec 04 23:10:06.415974 master-0 kubenswrapper[33572]: I1204 23:10:06.414032 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6fjc\" (UniqueName: \"kubernetes.io/projected/87336090-08d8-4bbc-abe9-ae767dda0838-kube-api-access-l6fjc\") pod \"must-gather-mr4ch\" (UID: \"87336090-08d8-4bbc-abe9-ae767dda0838\") " pod="openshift-must-gather-pwzxr/must-gather-mr4ch" Dec 04 23:10:06.415974 master-0 kubenswrapper[33572]: I1204 23:10:06.414180 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/2ad607ea-3069-4a8b-816a-3256f11227d8-must-gather-output\") pod \"must-gather-s7k85\" (UID: \"2ad607ea-3069-4a8b-816a-3256f11227d8\") " pod="openshift-must-gather-pwzxr/must-gather-s7k85" Dec 04 23:10:06.415974 master-0 kubenswrapper[33572]: I1204 23:10:06.414462 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbzsn\" (UniqueName: \"kubernetes.io/projected/2ad607ea-3069-4a8b-816a-3256f11227d8-kube-api-access-qbzsn\") pod \"must-gather-s7k85\" (UID: \"2ad607ea-3069-4a8b-816a-3256f11227d8\") " pod="openshift-must-gather-pwzxr/must-gather-s7k85" Dec 04 23:10:06.415974 master-0 kubenswrapper[33572]: I1204 23:10:06.414667 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2ad607ea-3069-4a8b-816a-3256f11227d8-must-gather-output\") pod \"must-gather-s7k85\" (UID: \"2ad607ea-3069-4a8b-816a-3256f11227d8\") " pod="openshift-must-gather-pwzxr/must-gather-s7k85" Dec 04 23:10:06.415974 master-0 kubenswrapper[33572]: I1204 23:10:06.414712 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/87336090-08d8-4bbc-abe9-ae767dda0838-must-gather-output\") pod \"must-gather-mr4ch\" (UID: \"87336090-08d8-4bbc-abe9-ae767dda0838\") " pod="openshift-must-gather-pwzxr/must-gather-mr4ch" Dec 04 23:10:06.415974 master-0 kubenswrapper[33572]: I1204 23:10:06.415015 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/87336090-08d8-4bbc-abe9-ae767dda0838-must-gather-output\") pod \"must-gather-mr4ch\" (UID: \"87336090-08d8-4bbc-abe9-ae767dda0838\") " pod="openshift-must-gather-pwzxr/must-gather-mr4ch" Dec 04 23:10:06.433682 master-0 kubenswrapper[33572]: I1204 23:10:06.433615 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6fjc\" (UniqueName: \"kubernetes.io/projected/87336090-08d8-4bbc-abe9-ae767dda0838-kube-api-access-l6fjc\") pod \"must-gather-mr4ch\" (UID: \"87336090-08d8-4bbc-abe9-ae767dda0838\") " pod="openshift-must-gather-pwzxr/must-gather-mr4ch" Dec 04 23:10:06.435970 master-0 kubenswrapper[33572]: I1204 23:10:06.435913 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbzsn\" (UniqueName: \"kubernetes.io/projected/2ad607ea-3069-4a8b-816a-3256f11227d8-kube-api-access-qbzsn\") pod \"must-gather-s7k85\" (UID: \"2ad607ea-3069-4a8b-816a-3256f11227d8\") " pod="openshift-must-gather-pwzxr/must-gather-s7k85" Dec 04 23:10:06.569798 master-0 kubenswrapper[33572]: I1204 23:10:06.569710 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwzxr/must-gather-s7k85" Dec 04 23:10:06.587021 master-0 kubenswrapper[33572]: I1204 23:10:06.586947 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwzxr/must-gather-mr4ch" Dec 04 23:10:07.145097 master-0 kubenswrapper[33572]: I1204 23:10:07.145040 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pwzxr/must-gather-mr4ch"] Dec 04 23:10:07.278323 master-0 kubenswrapper[33572]: I1204 23:10:07.278240 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pwzxr/must-gather-s7k85"] Dec 04 23:10:08.597269 master-0 kubenswrapper[33572]: I1204 23:10:08.597190 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/must-gather-s7k85" event={"ID":"2ad607ea-3069-4a8b-816a-3256f11227d8","Type":"ContainerStarted","Data":"ee0879576fdad3fb849cbacb335a70735afa90a2928db6db798f1c91f88f5e3b"} Dec 04 23:10:08.598716 master-0 kubenswrapper[33572]: I1204 23:10:08.598668 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/must-gather-mr4ch" event={"ID":"87336090-08d8-4bbc-abe9-ae767dda0838","Type":"ContainerStarted","Data":"3cb5b29b5e6bafdcce6d7fa07dd3f9752c21618e40ab4f8c0a0cc47846c3775a"} Dec 04 23:10:09.613138 master-0 kubenswrapper[33572]: I1204 23:10:09.612048 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/must-gather-mr4ch" event={"ID":"87336090-08d8-4bbc-abe9-ae767dda0838","Type":"ContainerStarted","Data":"1bd303d1043f136f9a251c7ef0424c07ad4df193c63e9f6db49d291ae4128700"} Dec 04 23:10:10.626743 master-0 kubenswrapper[33572]: I1204 23:10:10.626682 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/must-gather-mr4ch" event={"ID":"87336090-08d8-4bbc-abe9-ae767dda0838","Type":"ContainerStarted","Data":"5d5bef1154b32b6c121cdfe1644bca4621ace3f109adce0964cb9a05184e10bf"} Dec 04 23:10:10.655216 master-0 kubenswrapper[33572]: I1204 23:10:10.655125 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pwzxr/must-gather-mr4ch" podStartSLOduration=3.571063563 podStartE2EDuration="4.655101801s" podCreationTimestamp="2025-12-04 23:10:06 +0000 UTC" firstStartedPulling="2025-12-04 23:10:07.9038618 +0000 UTC m=+3071.631387489" lastFinishedPulling="2025-12-04 23:10:08.987900068 +0000 UTC m=+3072.715425727" observedRunningTime="2025-12-04 23:10:10.643030669 +0000 UTC m=+3074.370556338" watchObservedRunningTime="2025-12-04 23:10:10.655101801 +0000 UTC m=+3074.382627460" Dec 04 23:10:11.058432 master-0 kubenswrapper[33572]: I1204 23:10:11.058371 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-6d5d5dcc89-t7cc5_a8636bd7-fa9e-44b9-82df-9d37b398736d/cluster-version-operator/0.log" Dec 04 23:10:11.177520 master-0 kubenswrapper[33572]: I1204 23:10:11.176920 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-6d5d5dcc89-t7cc5_a8636bd7-fa9e-44b9-82df-9d37b398736d/cluster-version-operator/1.log" Dec 04 23:10:14.712734 master-0 kubenswrapper[33572]: I1204 23:10:14.712050 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/must-gather-s7k85" event={"ID":"2ad607ea-3069-4a8b-816a-3256f11227d8","Type":"ContainerStarted","Data":"84ba49ff0857ff102aad1544d61f7518124a9e272f6cbc093b5c9f58714fecd8"} Dec 04 23:10:14.874089 master-0 kubenswrapper[33572]: I1204 23:10:14.873957 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-twslb_22dd80ed-ccaf-40b1-8837-a8aeb3c28140/nmstate-console-plugin/0.log" Dec 04 23:10:14.920523 master-0 kubenswrapper[33572]: I1204 23:10:14.912112 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v5nvt_421b3d90-c030-4f07-bbfa-bcb40fcfef1f/controller/0.log" Dec 04 23:10:15.099535 master-0 kubenswrapper[33572]: I1204 23:10:15.093972 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v5nvt_421b3d90-c030-4f07-bbfa-bcb40fcfef1f/kube-rbac-proxy/0.log" Dec 04 23:10:15.135522 master-0 kubenswrapper[33572]: I1204 23:10:15.129903 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mcmbn_0a44f088-be39-47f3-8e34-9cade0076325/nmstate-handler/0.log" Dec 04 23:10:15.147524 master-0 kubenswrapper[33572]: I1204 23:10:15.140397 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/controller/0.log" Dec 04 23:10:15.147524 master-0 kubenswrapper[33572]: I1204 23:10:15.146400 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8rwmp_ed12588b-8304-4d07-9055-6944c540d15f/nmstate-metrics/0.log" Dec 04 23:10:15.165529 master-0 kubenswrapper[33572]: I1204 23:10:15.163406 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8rwmp_ed12588b-8304-4d07-9055-6944c540d15f/kube-rbac-proxy/0.log" Dec 04 23:10:15.207529 master-0 kubenswrapper[33572]: I1204 23:10:15.206490 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-n77lr_103e82e6-9c8a-497b-9c46-bec48796d1a0/nmstate-operator/0.log" Dec 04 23:10:15.226596 master-0 kubenswrapper[33572]: I1204 23:10:15.224385 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-265zs_4e7a211d-0841-473d-87c2-953375146be8/nmstate-webhook/0.log" Dec 04 23:10:15.732649 master-0 kubenswrapper[33572]: I1204 23:10:15.731728 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/must-gather-s7k85" event={"ID":"2ad607ea-3069-4a8b-816a-3256f11227d8","Type":"ContainerStarted","Data":"cc867d2fa3b3383b03038e91c1f9916ea2152ec629df170e8405b16cac56ea7d"} Dec 04 23:10:15.760534 master-0 kubenswrapper[33572]: I1204 23:10:15.757959 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pwzxr/must-gather-s7k85" podStartSLOduration=4.2008247 podStartE2EDuration="9.757943543s" podCreationTimestamp="2025-12-04 23:10:06 +0000 UTC" firstStartedPulling="2025-12-04 23:10:07.923592263 +0000 UTC m=+3071.651117932" lastFinishedPulling="2025-12-04 23:10:13.480711126 +0000 UTC m=+3077.208236775" observedRunningTime="2025-12-04 23:10:15.755425213 +0000 UTC m=+3079.482950862" watchObservedRunningTime="2025-12-04 23:10:15.757943543 +0000 UTC m=+3079.485469192" Dec 04 23:10:15.790538 master-0 kubenswrapper[33572]: I1204 23:10:15.790237 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcdctl/0.log" Dec 04 23:10:16.666131 master-0 kubenswrapper[33572]: I1204 23:10:16.666061 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd/0.log" Dec 04 23:10:16.691417 master-0 
kubenswrapper[33572]: I1204 23:10:16.691375 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-metrics/0.log" Dec 04 23:10:16.705915 master-0 kubenswrapper[33572]: I1204 23:10:16.705870 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-readyz/0.log" Dec 04 23:10:16.722524 master-0 kubenswrapper[33572]: I1204 23:10:16.722462 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/frr/0.log" Dec 04 23:10:16.725366 master-0 kubenswrapper[33572]: I1204 23:10:16.724858 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-rev/0.log" Dec 04 23:10:16.733518 master-0 kubenswrapper[33572]: I1204 23:10:16.731355 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/reloader/0.log" Dec 04 23:10:16.740541 master-0 kubenswrapper[33572]: I1204 23:10:16.738681 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/frr-metrics/0.log" Dec 04 23:10:16.740541 master-0 kubenswrapper[33572]: I1204 23:10:16.739222 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/setup/0.log" Dec 04 23:10:16.749976 master-0 kubenswrapper[33572]: I1204 23:10:16.749011 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/kube-rbac-proxy/0.log" Dec 04 23:10:16.753460 master-0 kubenswrapper[33572]: I1204 23:10:16.753421 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-ensure-env-vars/0.log" Dec 04 23:10:16.767020 master-0 kubenswrapper[33572]: I1204 23:10:16.766461 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/kube-rbac-proxy-frr/0.log" Dec 04 23:10:16.776162 master-0 kubenswrapper[33572]: I1204 23:10:16.776031 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-resources-copy/0.log" Dec 04 23:10:16.776162 master-0 kubenswrapper[33572]: I1204 23:10:16.776031 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/cp-frr-files/0.log" Dec 04 23:10:16.790203 master-0 kubenswrapper[33572]: I1204 23:10:16.790122 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/cp-reloader/0.log" Dec 04 23:10:16.800271 master-0 kubenswrapper[33572]: I1204 23:10:16.800189 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/cp-metrics/0.log" Dec 04 23:10:16.812799 master-0 kubenswrapper[33572]: I1204 23:10:16.812744 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-27xx2_f4b97d80-d72e-4c4c-92f0-4c9a983c5fca/frr-k8s-webhook-server/0.log" Dec 04 23:10:16.839383 master-0 kubenswrapper[33572]: I1204 23:10:16.839331 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85bc976bd6-scgdf_0f4308db-85a3-40de-881b-b54463263359/manager/0.log" Dec 04 23:10:16.854352 master-0 kubenswrapper[33572]: I1204 23:10:16.850715 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab/installer/0.log" Dec 04 23:10:16.866937 master-0 kubenswrapper[33572]: I1204 23:10:16.864461 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5844777bf9-wp7bl_3c189000-5407-41c1-825a-3ad7708d6b67/webhook-server/0.log" Dec 04 23:10:16.892544 master-0 kubenswrapper[33572]: I1204 23:10:16.892158 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_6e011b0a-89e2-47e3-9112-d46a828416b1/installer/0.log" Dec 04 23:10:17.229319 master-0 kubenswrapper[33572]: I1204 23:10:17.229182 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-clzpp_cc0da214-23b6-4e1e-aba6-bb88fe145246/speaker/0.log" Dec 04 23:10:17.236919 master-0 kubenswrapper[33572]: I1204 23:10:17.236875 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-clzpp_cc0da214-23b6-4e1e-aba6-bb88fe145246/kube-rbac-proxy/0.log" Dec 04 23:10:17.926716 master-0 kubenswrapper[33572]: I1204 23:10:17.926670 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-mxfnl_9160fec1-743a-470e-b48f-95a7ddf1c0b2/assisted-installer-controller/0.log" Dec 04 23:10:18.027378 master-0 kubenswrapper[33572]: I1204 23:10:18.027326 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-6cfff4b945-wlg4k_1f9552e4-fda9-4207-a4ed-a0486fd1018e/oauth-openshift/0.log" Dec 04 23:10:19.127172 master-0 kubenswrapper[33572]: I1204 23:10:19.127028 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-bm2pk_f893663c-7c1e-4eda-9839-99c1c0440304/authentication-operator/2.log" Dec 04 23:10:19.166596 master-0 kubenswrapper[33572]: I1204 23:10:19.166246 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-bm2pk_f893663c-7c1e-4eda-9839-99c1c0440304/authentication-operator/3.log" Dec 04 23:10:20.099462 master-0 kubenswrapper[33572]: I1204 23:10:20.099428 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5465c8b4db-8vm66_c178afcf-b713-4c74-b22b-6169ba3123f5/router/5.log" Dec 04 23:10:20.106426 master-0 kubenswrapper[33572]: I1204 23:10:20.106374 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5465c8b4db-8vm66_c178afcf-b713-4c74-b22b-6169ba3123f5/router/4.log" Dec 04 23:10:20.225767 master-0 kubenswrapper[33572]: I1204 23:10:20.225711 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwzxr/master-0-debug-rwf5q"] Dec 04 23:10:20.228816 master-0 kubenswrapper[33572]: I1204 23:10:20.227996 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" Dec 04 23:10:20.290807 master-0 kubenswrapper[33572]: I1204 23:10:20.290755 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpbz\" (UniqueName: \"kubernetes.io/projected/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-kube-api-access-5cpbz\") pod \"master-0-debug-rwf5q\" (UID: \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\") " pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" Dec 04 23:10:20.290994 master-0 kubenswrapper[33572]: I1204 23:10:20.290921 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-host\") pod \"master-0-debug-rwf5q\" (UID: \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\") " pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" Dec 04 23:10:20.395718 master-0 kubenswrapper[33572]: I1204 23:10:20.394177 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpbz\" (UniqueName: \"kubernetes.io/projected/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-kube-api-access-5cpbz\") pod \"master-0-debug-rwf5q\" (UID: \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\") " pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" Dec 04 23:10:20.395718 master-0 kubenswrapper[33572]: I1204 23:10:20.394271 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-host\") pod \"master-0-debug-rwf5q\" (UID: \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\") " pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" Dec 04 23:10:20.395718 master-0 kubenswrapper[33572]: I1204 23:10:20.394400 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-host\") pod \"master-0-debug-rwf5q\" (UID: \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\") " pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" Dec 04 23:10:20.413648 master-0 kubenswrapper[33572]: I1204 23:10:20.412013 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpbz\" (UniqueName: \"kubernetes.io/projected/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-kube-api-access-5cpbz\") pod \"master-0-debug-rwf5q\" (UID: \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\") " pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" Dec 04 23:10:20.544908 master-0 kubenswrapper[33572]: I1204 23:10:20.544842 33572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" Dec 04 23:10:20.804199 master-0 kubenswrapper[33572]: I1204 23:10:20.804058 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" event={"ID":"a6c6e72a-2ff1-4c80-8670-c677aa07aa13","Type":"ContainerStarted","Data":"c9f7b0a6a31c622a656cc9f40382b72d52ad648ab1f62e910855083f56571b96"} Dec 04 23:10:21.147375 master-0 kubenswrapper[33572]: I1204 23:10:21.144669 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-58574fc8d8-gg42x_989a73ce-3898-4f65-a437-2c7061f9375f/oauth-apiserver/0.log" Dec 04 23:10:23.023154 master-0 kubenswrapper[33572]: I1204 23:10:23.023061 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-58574fc8d8-gg42x_989a73ce-3898-4f65-a437-2c7061f9375f/fix-audit-permissions/0.log" Dec 04 23:10:23.025434 master-0 kubenswrapper[33572]: I1204 23:10:23.025289 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7"] Dec 04 23:10:23.027714 master-0 kubenswrapper[33572]: I1204 23:10:23.027656 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.042307 master-0 kubenswrapper[33572]: I1204 23:10:23.042201 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7"] Dec 04 23:10:23.199653 master-0 kubenswrapper[33572]: I1204 23:10:23.199377 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-sys\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.200231 master-0 kubenswrapper[33572]: I1204 23:10:23.200178 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-proc\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.201922 master-0 kubenswrapper[33572]: I1204 23:10:23.201893 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-podres\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.202204 master-0 kubenswrapper[33572]: I1204 23:10:23.202181 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zpsp\" (UniqueName: \"kubernetes.io/projected/fbb58fda-04a5-4f57-93c5-2506e7217905-kube-api-access-7zpsp\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.202307 master-0 kubenswrapper[33572]: I1204 23:10:23.202292 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-lib-modules\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.304723 master-0 kubenswrapper[33572]: I1204 23:10:23.304662 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zpsp\" (UniqueName: \"kubernetes.io/projected/fbb58fda-04a5-4f57-93c5-2506e7217905-kube-api-access-7zpsp\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.304723 master-0 kubenswrapper[33572]: I1204 23:10:23.304723 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-lib-modules\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.304983 master-0 kubenswrapper[33572]: I1204 23:10:23.304801 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-sys\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.304983 master-0 kubenswrapper[33572]: I1204 23:10:23.304878 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-proc\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.304983 master-0 kubenswrapper[33572]: I1204 23:10:23.304901 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-podres\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.305084 master-0 kubenswrapper[33572]: I1204 23:10:23.305031 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-podres\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.305602 master-0 kubenswrapper[33572]: I1204 23:10:23.305360 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-lib-modules\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.305602 master-0 kubenswrapper[33572]: I1204 23:10:23.305398 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-sys\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 
04 23:10:23.305602 master-0 kubenswrapper[33572]: I1204 23:10:23.305427 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fbb58fda-04a5-4f57-93c5-2506e7217905-proc\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.323246 master-0 kubenswrapper[33572]: I1204 23:10:23.323182 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zpsp\" (UniqueName: \"kubernetes.io/projected/fbb58fda-04a5-4f57-93c5-2506e7217905-kube-api-access-7zpsp\") pod \"perf-node-gather-daemonset-thwh7\" (UID: \"fbb58fda-04a5-4f57-93c5-2506e7217905\") " pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.391624 master-0 kubenswrapper[33572]: I1204 23:10:23.391565 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:23.848414 master-0 kubenswrapper[33572]: I1204 23:10:23.848377 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-5f49d774cd-5m4l9_5598683a-cd32-486d-8839-205829d55cc2/kube-rbac-proxy/0.log" Dec 04 23:10:23.876389 master-0 kubenswrapper[33572]: I1204 23:10:23.875909 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-5f49d774cd-5m4l9_5598683a-cd32-486d-8839-205829d55cc2/cluster-autoscaler-operator/0.log" Dec 04 23:10:23.894285 master-0 kubenswrapper[33572]: I1204 23:10:23.894243 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-5f49d774cd-5m4l9_5598683a-cd32-486d-8839-205829d55cc2/cluster-autoscaler-operator/1.log" Dec 04 23:10:23.902002 master-0 kubenswrapper[33572]: I1204 23:10:23.901943 33572 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7"] Dec 04 23:10:23.922042 master-0 kubenswrapper[33572]: I1204 23:10:23.922002 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/2.log" Dec 04 23:10:23.924958 master-0 kubenswrapper[33572]: I1204 23:10:23.924915 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/cluster-baremetal-operator/3.log" Dec 04 23:10:23.933673 master-0 kubenswrapper[33572]: I1204 23:10:23.933624 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" event={"ID":"fbb58fda-04a5-4f57-93c5-2506e7217905","Type":"ContainerStarted","Data":"3fc69dfe4c0fbd32d5bf9decc2175930af905879f09a397ea0c0cdbc540604fa"} Dec 04 23:10:23.951100 master-0 kubenswrapper[33572]: I1204 23:10:23.951047 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-44srj_a3899a38-39b8-4b48-81e5-4d8854ecc8ab/baremetal-kube-rbac-proxy/0.log" Dec 04 23:10:23.978956 master-0 kubenswrapper[33572]: I1204 23:10:23.978909 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/2.log" Dec 04 23:10:23.981620 master-0 kubenswrapper[33572]: I1204 23:10:23.980349 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-nznvn_f1534e25-7add-46a1-8f4e-0065c232aa4e/control-plane-machine-set-operator/1.log" Dec 04 23:10:23.998861 master-0 kubenswrapper[33572]: I1204 23:10:23.998812 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-88d48b57d-pp4fd_74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/kube-rbac-proxy/0.log" Dec 04 23:10:24.019169 master-0 kubenswrapper[33572]: I1204 23:10:24.017999 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-88d48b57d-pp4fd_74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/machine-api-operator/0.log" Dec 04 23:10:24.019169 master-0 kubenswrapper[33572]: I1204 23:10:24.018565 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-88d48b57d-pp4fd_74b6acf7-78a4-449e-82e2-f1ed8d7ed1b9/machine-api-operator/1.log" Dec 04 23:10:24.953615 master-0 kubenswrapper[33572]: I1204 23:10:24.952042 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" event={"ID":"fbb58fda-04a5-4f57-93c5-2506e7217905","Type":"ContainerStarted","Data":"02b062e1670f2902214dae2febc96a58216df743c7f5ad2a6bd4af80a95b66fb"} Dec 04 23:10:24.953615 master-0 kubenswrapper[33572]: I1204 23:10:24.952313 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:24.981408 master-0 kubenswrapper[33572]: I1204 23:10:24.981335 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" podStartSLOduration=2.981303924 podStartE2EDuration="2.981303924s" podCreationTimestamp="2025-12-04 23:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 23:10:24.970726884 +0000 UTC m=+3088.698252543" watchObservedRunningTime="2025-12-04 23:10:24.981303924 +0000 UTC m=+3088.708829573" Dec 04 23:10:25.275203 master-0 kubenswrapper[33572]: I1204 23:10:25.275148 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/cluster-cloud-controller-manager/1.log" Dec 04 23:10:25.275907 master-0 kubenswrapper[33572]: I1204 23:10:25.275865 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/cluster-cloud-controller-manager/0.log" Dec 04 23:10:25.304189 master-0 kubenswrapper[33572]: I1204 23:10:25.304124 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/config-sync-controllers/0.log" Dec 04 23:10:25.306169 master-0 kubenswrapper[33572]: I1204 23:10:25.306147 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/config-sync-controllers/1.log" Dec 04 23:10:25.322417 master-0 kubenswrapper[33572]: I1204 23:10:25.322351 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-mwxf4_74197c50-9a41-40e8-9289-c7e6afbd3737/kube-rbac-proxy/0.log" Dec 04 23:10:26.878090 master-0 kubenswrapper[33572]: I1204 23:10:26.878047 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-698c598cfc-lgmqn_800f436c-145d-4281-8d4d-644ba2cb0ebb/kube-rbac-proxy/0.log" Dec 04 23:10:26.909366 master-0 kubenswrapper[33572]: I1204 23:10:26.909310 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-698c598cfc-lgmqn_800f436c-145d-4281-8d4d-644ba2cb0ebb/cloud-credential-operator/0.log" Dec 04 23:10:26.920214 master-0 kubenswrapper[33572]: I1204 23:10:26.920184 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-698c598cfc-lgmqn_800f436c-145d-4281-8d4d-644ba2cb0ebb/cloud-credential-operator/1.log" Dec 04 23:10:28.657950 master-0 kubenswrapper[33572]: I1204 23:10:28.657884 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/5.log" Dec 04 23:10:28.658814 master-0 kubenswrapper[33572]: I1204 23:10:28.658778 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-config-operator/4.log" Dec 04 23:10:28.671293 master-0 kubenswrapper[33572]: I1204 23:10:28.671232 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68758cbcdb-fg6vx_810c363b-a4c7-428d-a2fb-285adc29f477/openshift-api/0.log" Dec 04 23:10:29.547270 master-0 kubenswrapper[33572]: I1204 23:10:29.547194 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-54dbc87ccb-bgbjl_7500e099-8073-4485-9cc4-f4ad90556339/console-operator/0.log" Dec 04 23:10:30.257285 master-0 kubenswrapper[33572]: I1204 23:10:30.257241 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-795b68ff6d-p7dxw_800cb931-1357-490b-855e-d7b30b4a5593/console/0.log" Dec 04 23:10:30.283398 master-0 kubenswrapper[33572]: I1204 23:10:30.283349 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-69cd4c69bf-b4qng_4c1b60d6-ca5c-40b5-a567-a48ef956163b/download-server/0.log" Dec 04 23:10:31.161084 master-0 kubenswrapper[33572]: I1204 23:10:31.160933 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-dcf7fc84b-qmhlw_a043ea49-97f9-4ae6-83b9-733f12754d94/cluster-storage-operator/1.log" Dec 04 23:10:31.164033 master-0 kubenswrapper[33572]: I1204 23:10:31.163990 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-dcf7fc84b-qmhlw_a043ea49-97f9-4ae6-83b9-733f12754d94/cluster-storage-operator/2.log" Dec 04 23:10:31.176629 master-0 kubenswrapper[33572]: 
I1204 23:10:31.176565 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/4.log" Dec 04 23:10:31.178020 master-0 kubenswrapper[33572]: I1204 23:10:31.177837 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-w7hnc_4f22eee4-a42d-4d2b-bffa-6c3f29f1f026/snapshot-controller/5.log" Dec 04 23:10:31.208535 master-0 kubenswrapper[33572]: I1204 23:10:31.208158 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-6bc8656fdc-xhndk_e37d318a-5bf8-46ed-b6de-494102738da7/csi-snapshot-controller-operator/0.log" Dec 04 23:10:31.209790 master-0 kubenswrapper[33572]: I1204 23:10:31.209173 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-6bc8656fdc-xhndk_e37d318a-5bf8-46ed-b6de-494102738da7/csi-snapshot-controller-operator/1.log" Dec 04 23:10:31.896542 master-0 kubenswrapper[33572]: I1204 23:10:31.896416 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-7c56cf9b74-sshsd_ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/dns-operator/0.log" Dec 04 23:10:31.906415 master-0 kubenswrapper[33572]: I1204 23:10:31.906377 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-7c56cf9b74-sshsd_ddc4ea72-b84f-4ec1-a2ba-e0c8a75ba42e/kube-rbac-proxy/0.log" Dec 04 23:10:32.449995 master-0 kubenswrapper[33572]: I1204 23:10:32.449944 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vvs9c_a5c2d3b8-41c0-4531-b770-57b7c567fe30/dns/0.log" Dec 04 23:10:32.464560 master-0 kubenswrapper[33572]: I1204 23:10:32.463873 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vvs9c_a5c2d3b8-41c0-4531-b770-57b7c567fe30/kube-rbac-proxy/0.log" Dec 04 23:10:32.482322 master-0 kubenswrapper[33572]: I1204 23:10:32.482190 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6mgn6_c2279404-fa75-4de2-a302-d7b15ead5232/dns-node-resolver/0.log" Dec 04 23:10:33.130456 master-0 kubenswrapper[33572]: I1204 23:10:33.130379 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-flrrb_ceb419e4-d804-4111-b8d8-8436cc2ee617/etcd-operator/1.log" Dec 04 23:10:33.133574 master-0 kubenswrapper[33572]: I1204 23:10:33.132130 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-flrrb_ceb419e4-d804-4111-b8d8-8436cc2ee617/etcd-operator/2.log" Dec 04 23:10:33.424152 master-0 kubenswrapper[33572]: I1204 23:10:33.424103 33572 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pwzxr/perf-node-gather-daemonset-thwh7" Dec 04 23:10:33.769120 master-0 kubenswrapper[33572]: I1204 23:10:33.769075 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcdctl/0.log" Dec 04 23:10:34.385373 master-0 kubenswrapper[33572]: I1204 23:10:34.385323 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd/0.log" Dec 04 23:10:34.436520 master-0 kubenswrapper[33572]: I1204 23:10:34.434036 33572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-metrics/0.log" Dec 04 23:10:34.447248 master-0 kubenswrapper[33572]: I1204 23:10:34.446921 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-readyz/0.log" Dec 04 23:10:34.465365 master-0 kubenswrapper[33572]: I1204 23:10:34.465218 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-rev/0.log" Dec 04 23:10:34.477618 master-0 kubenswrapper[33572]: I1204 23:10:34.477568 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/setup/0.log" Dec 04 23:10:34.494052 master-0 kubenswrapper[33572]: I1204 23:10:34.494024 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-ensure-env-vars/0.log" Dec 04 23:10:34.508671 master-0 kubenswrapper[33572]: I1204 23:10:34.508639 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_58d12e893528ad53a994f10901a644ea/etcd-resources-copy/0.log" Dec 04 23:10:34.571560 master-0 kubenswrapper[33572]: I1204 23:10:34.571521 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_0b9d1530-9fd8-4c69-8ed7-62b7af1f4eab/installer/0.log" Dec 04 23:10:34.612875 master-0 kubenswrapper[33572]: I1204 23:10:34.612837 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_6e011b0a-89e2-47e3-9112-d46a828416b1/installer/0.log" Dec 04 23:10:35.475932 master-0 kubenswrapper[33572]: I1204 23:10:35.475547 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-6fb9f88b7-r7wcq_35821f48-b000-4915-847f-a739b6efc5ee/cluster-image-registry-operator/0.log" Dec 04 23:10:35.480226 master-0 kubenswrapper[33572]: I1204 23:10:35.479769 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-6fb9f88b7-r7wcq_35821f48-b000-4915-847f-a739b6efc5ee/cluster-image-registry-operator/1.log" Dec 04 23:10:35.491361 master-0 kubenswrapper[33572]: I1204 23:10:35.490543 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5c4bw_924b2123-1f68-49a0-9c4f-ae978be03e40/node-ca/0.log" Dec 04 23:10:36.063155 master-0 kubenswrapper[33572]: I1204 23:10:36.063078 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/5.log" Dec 04 23:10:36.064962 master-0 kubenswrapper[33572]: I1204 23:10:36.064905 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/ingress-operator/6.log" Dec 04 23:10:36.075436 master-0 kubenswrapper[33572]: I1204 23:10:36.075392 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-qlkgh_addddaac-a31a-4dbf-b78f-87225b11b463/kube-rbac-proxy/0.log" Dec 04 23:10:36.770020 master-0 kubenswrapper[33572]: I1204 23:10:36.769964 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-7cr8g_651c0fad-1577-4a7f-8718-ec2fd2f06c3e/serve-healthcheck-canary/0.log" Dec 04 23:10:37.114162 master-0 kubenswrapper[33572]: I1204 23:10:37.114103 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" event={"ID":"a6c6e72a-2ff1-4c80-8670-c677aa07aa13","Type":"ContainerStarted","Data":"f653f374b7d8de99ef36134641588c8330c435b6f7471d8be37434f3d145e6b2"} Dec 04 23:10:37.137910 master-0 kubenswrapper[33572]: I1204 23:10:37.137759 33572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" podStartSLOduration=1.190993774 podStartE2EDuration="17.137743345s" podCreationTimestamp="2025-12-04 23:10:20 +0000 UTC" firstStartedPulling="2025-12-04 23:10:20.591977483 +0000 UTC m=+3084.319503132" lastFinishedPulling="2025-12-04 23:10:36.538727054 +0000 UTC m=+3100.266252703" observedRunningTime="2025-12-04 23:10:37.136893462 +0000 UTC m=+3100.864419151" watchObservedRunningTime="2025-12-04 23:10:37.137743345 +0000 UTC m=+3100.865268994" Dec 04 23:10:37.411075 master-0 kubenswrapper[33572]: I1204 23:10:37.410974 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-55965856b6-7vlpp_2d142201-6e77-4828-b86b-05d4144a2f08/insights-operator/0.log" Dec 04 23:10:37.439278 master-0 kubenswrapper[33572]: I1204 23:10:37.439194 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-55965856b6-7vlpp_2d142201-6e77-4828-b86b-05d4144a2f08/insights-operator/1.log" Dec 04 23:10:39.322882 master-0 kubenswrapper[33572]: I1204 23:10:39.322822 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d2204da-2102-48fb-9865-ff8f367a02f3/alertmanager/0.log" Dec 04 23:10:39.337066 master-0 kubenswrapper[33572]: I1204 23:10:39.336983 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d2204da-2102-48fb-9865-ff8f367a02f3/config-reloader/0.log" Dec 04 23:10:39.355879 master-0 kubenswrapper[33572]: I1204 23:10:39.355834 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d2204da-2102-48fb-9865-ff8f367a02f3/kube-rbac-proxy-web/0.log" Dec 04 23:10:39.374956 master-0 kubenswrapper[33572]: I1204 23:10:39.374904 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d2204da-2102-48fb-9865-ff8f367a02f3/kube-rbac-proxy/0.log" Dec 04 23:10:39.390075 master-0 kubenswrapper[33572]: I1204 23:10:39.389930 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d2204da-2102-48fb-9865-ff8f367a02f3/kube-rbac-proxy-metric/0.log" Dec 04 23:10:39.412856 master-0 kubenswrapper[33572]: I1204 23:10:39.412808 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d2204da-2102-48fb-9865-ff8f367a02f3/prom-label-proxy/0.log" Dec 04 23:10:39.435937 master-0 kubenswrapper[33572]: I1204 23:10:39.435902 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6d2204da-2102-48fb-9865-ff8f367a02f3/init-config-reloader/0.log" Dec 04 23:10:39.533614 master-0 kubenswrapper[33572]: I1204 23:10:39.533554 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-7ff994598c-rn6cz_512ba6af-11ad-4217-a1ce-a2ab3ef67ec5/cluster-monitoring-operator/0.log" Dec 04 23:10:39.550786 master-0 kubenswrapper[33572]: I1204 23:10:39.550725 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-5857974f64-qqxk9_8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/kube-state-metrics/0.log" Dec 04 23:10:39.571352 master-0 kubenswrapper[33572]: I1204 23:10:39.571308 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-5857974f64-qqxk9_8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/kube-rbac-proxy-main/0.log" Dec 04 23:10:39.587095 master-0 kubenswrapper[33572]: I1204 23:10:39.586982 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-5857974f64-qqxk9_8d84a7d3-46d1-48e3-83f3-f6b32f16cc76/kube-rbac-proxy-self/0.log" Dec 04 23:10:39.604020 master-0 kubenswrapper[33572]: I1204 23:10:39.603928 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-65f77db9b4-9s9lq_49753afa-912e-44bc-ad0b-8d5f61ab7300/metrics-server/0.log" Dec 04 23:10:39.622829 master-0 kubenswrapper[33572]: I1204 23:10:39.622772 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-6559dcc668-87vwg_9414862e-1a68-4ed3-8229-f25e6a2ba0fa/monitoring-plugin/0.log" Dec 04 23:10:39.645143 master-0 kubenswrapper[33572]: I1204 23:10:39.645079 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p5qlk_0a726f44-a509-46b3-a6d5-70afe3b55e9f/node-exporter/0.log" Dec 04 23:10:39.661201 master-0 kubenswrapper[33572]: I1204 23:10:39.661130 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p5qlk_0a726f44-a509-46b3-a6d5-70afe3b55e9f/kube-rbac-proxy/0.log" Dec 04 23:10:39.673102 master-0 kubenswrapper[33572]: I1204 23:10:39.673069 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p5qlk_0a726f44-a509-46b3-a6d5-70afe3b55e9f/init-textfile/0.log" Dec 04 23:10:39.739543 master-0 kubenswrapper[33572]: I1204 23:10:39.739481 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5974b6b869-jm2hq_17912746-74eb-4c78-8c1b-2f66e7ce4299/kube-rbac-proxy-main/0.log" Dec 04 23:10:39.828335 master-0 kubenswrapper[33572]: I1204 23:10:39.828212 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5974b6b869-jm2hq_17912746-74eb-4c78-8c1b-2f66e7ce4299/kube-rbac-proxy-self/0.log" Dec 04 23:10:39.873360 master-0 kubenswrapper[33572]: I1204 23:10:39.873256 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5974b6b869-jm2hq_17912746-74eb-4c78-8c1b-2f66e7ce4299/openshift-state-metrics/0.log" Dec 04 23:10:39.909523 master-0 kubenswrapper[33572]: I1204 23:10:39.908113 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7c874e55-3cdf-44cf-85a1-36bfadc33a31/prometheus/0.log" Dec 04 23:10:39.928613 master-0 kubenswrapper[33572]: I1204 23:10:39.927670 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7c874e55-3cdf-44cf-85a1-36bfadc33a31/config-reloader/0.log" Dec 04 23:10:39.952532 master-0 kubenswrapper[33572]: I1204 23:10:39.949783 33572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7c874e55-3cdf-44cf-85a1-36bfadc33a31/thanos-sidecar/0.log" Dec 04 23:10:39.966530 master-0 kubenswrapper[33572]: I1204 23:10:39.966223 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7c874e55-3cdf-44cf-85a1-36bfadc33a31/kube-rbac-proxy-web/0.log" Dec 04 23:10:39.978539 master-0 kubenswrapper[33572]: I1204 23:10:39.978071 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7c874e55-3cdf-44cf-85a1-36bfadc33a31/kube-rbac-proxy/0.log" Dec 04 23:10:39.997538 master-0 kubenswrapper[33572]: I1204 23:10:39.994284 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7c874e55-3cdf-44cf-85a1-36bfadc33a31/kube-rbac-proxy-thanos/0.log" Dec 04 23:10:40.032530 master-0 kubenswrapper[33572]: I1204 23:10:40.030277 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_7c874e55-3cdf-44cf-85a1-36bfadc33a31/init-config-reloader/0.log" Dec 04 23:10:40.076852 master-0 kubenswrapper[33572]: I1204 23:10:40.076810 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6c74d9cb9f-9cnnh_6684358b-d7a6-4396-9b4f-ea67d85e4517/prometheus-operator/0.log" Dec 04 23:10:40.087911 master-0 kubenswrapper[33572]: I1204 23:10:40.087857 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6c74d9cb9f-9cnnh_6684358b-d7a6-4396-9b4f-ea67d85e4517/kube-rbac-proxy/0.log" Dec 04 23:10:40.104381 master-0 kubenswrapper[33572]: I1204 23:10:40.104344 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-7c85c4dffd-mp4qx_b966c210-5415-4fa5-88ab-c85aba979b28/prometheus-operator-admission-webhook/0.log" Dec 04 23:10:40.237785 master-0 kubenswrapper[33572]: I1204 23:10:40.237230 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79f5646748-zd47k_0153e881-4d2d-4ff6-9e70-d6163a62970c/telemeter-client/0.log" Dec 04 23:10:40.237785 master-0 kubenswrapper[33572]: I1204 23:10:40.237670 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79f5646748-zd47k_0153e881-4d2d-4ff6-9e70-d6163a62970c/telemeter-client/1.log" Dec 04 23:10:40.249022 master-0 kubenswrapper[33572]: I1204 23:10:40.248983 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79f5646748-zd47k_0153e881-4d2d-4ff6-9e70-d6163a62970c/reload/0.log" Dec 04 23:10:40.262741 master-0 kubenswrapper[33572]: I1204 23:10:40.262705 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79f5646748-zd47k_0153e881-4d2d-4ff6-9e70-d6163a62970c/kube-rbac-proxy/0.log" Dec 04 23:10:40.297080 master-0 kubenswrapper[33572]: I1204 23:10:40.297020 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c8647588d-8b8m8_30c16431-1b9d-4c3c-a570-b208d3ec0e95/thanos-query/0.log" Dec 04 23:10:40.315311 master-0 kubenswrapper[33572]: I1204 23:10:40.315262 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c8647588d-8b8m8_30c16431-1b9d-4c3c-a570-b208d3ec0e95/kube-rbac-proxy-web/0.log" Dec 04 23:10:40.328804 master-0 kubenswrapper[33572]: I1204 23:10:40.328718 33572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c8647588d-8b8m8_30c16431-1b9d-4c3c-a570-b208d3ec0e95/kube-rbac-proxy/0.log" Dec 04 23:10:40.345234 master-0 kubenswrapper[33572]: I1204 23:10:40.345174 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c8647588d-8b8m8_30c16431-1b9d-4c3c-a570-b208d3ec0e95/prom-label-proxy/0.log" Dec 04 23:10:40.360786 master-0 kubenswrapper[33572]: I1204 23:10:40.360744 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c8647588d-8b8m8_30c16431-1b9d-4c3c-a570-b208d3ec0e95/kube-rbac-proxy-rules/0.log" Dec 04 23:10:40.378169 master-0 kubenswrapper[33572]: I1204 23:10:40.378122 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c8647588d-8b8m8_30c16431-1b9d-4c3c-a570-b208d3ec0e95/kube-rbac-proxy-metrics/0.log" Dec 04 23:10:43.165683 master-0 kubenswrapper[33572]: I1204 23:10:43.164910 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v5nvt_421b3d90-c030-4f07-bbfa-bcb40fcfef1f/controller/0.log" Dec 04 23:10:43.260212 master-0 kubenswrapper[33572]: I1204 23:10:43.260140 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-v5nvt_421b3d90-c030-4f07-bbfa-bcb40fcfef1f/kube-rbac-proxy/0.log" Dec 04 23:10:43.297888 master-0 kubenswrapper[33572]: I1204 23:10:43.297818 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/controller/0.log" Dec 04 23:10:44.663622 master-0 kubenswrapper[33572]: I1204 23:10:44.663569 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/frr/0.log" Dec 04 23:10:46.119034 master-0 kubenswrapper[33572]: I1204 23:10:46.118980 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/reloader/0.log" Dec 04 23:10:46.137846 master-0 kubenswrapper[33572]: I1204 23:10:46.137788 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/frr-metrics/0.log" Dec 04 23:10:46.158269 master-0 kubenswrapper[33572]: I1204 23:10:46.158208 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/kube-rbac-proxy/0.log" Dec 04 23:10:46.177328 master-0 kubenswrapper[33572]: I1204 23:10:46.177276 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/kube-rbac-proxy-frr/0.log" Dec 04 23:10:46.226544 master-0 kubenswrapper[33572]: I1204 23:10:46.226478 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/cp-frr-files/0.log" Dec 04 23:10:46.245024 master-0 kubenswrapper[33572]: I1204 23:10:46.244975 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/cp-reloader/0.log" Dec 04 23:10:46.258169 master-0 kubenswrapper[33572]: I1204 23:10:46.258135 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mbggv_92452d91-986b-42fc-9778-2f78ad4482a9/cp-metrics/0.log" Dec 04 23:10:46.281345 master-0 kubenswrapper[33572]: I1204 23:10:46.281287 33572 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-27xx2_f4b97d80-d72e-4c4c-92f0-4c9a983c5fca/frr-k8s-webhook-server/0.log" Dec 04 23:10:46.314962 master-0 kubenswrapper[33572]: I1204 23:10:46.314915 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85bc976bd6-scgdf_0f4308db-85a3-40de-881b-b54463263359/manager/0.log" Dec 04 23:10:46.456828 master-0 kubenswrapper[33572]: I1204 23:10:46.456735 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5844777bf9-wp7bl_3c189000-5407-41c1-825a-3ad7708d6b67/webhook-server/0.log" Dec 04 23:10:46.780754 master-0 kubenswrapper[33572]: I1204 23:10:46.780703 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-clzpp_cc0da214-23b6-4e1e-aba6-bb88fe145246/speaker/0.log" Dec 04 23:10:46.830168 master-0 kubenswrapper[33572]: I1204 23:10:46.830113 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-clzpp_cc0da214-23b6-4e1e-aba6-bb88fe145246/kube-rbac-proxy/0.log" Dec 04 23:10:49.640436 master-0 kubenswrapper[33572]: I1204 23:10:49.640321 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-85cff47f46-4dv2b_0beb871c-3bf1-471c-a028-746a650267bf/cluster-node-tuning-operator/0.log" Dec 04 23:10:49.640436 master-0 kubenswrapper[33572]: I1204 23:10:49.640347 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-85cff47f46-4dv2b_0beb871c-3bf1-471c-a028-746a650267bf/cluster-node-tuning-operator/1.log" Dec 04 23:10:49.662392 master-0 kubenswrapper[33572]: I1204 23:10:49.662329 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jn88h_fbb8e73f-7e50-451b-b400-e88a86b51e09/tuned/0.log" Dec 04 23:10:51.430208 master-0 kubenswrapper[33572]: I1204 23:10:51.430159 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-vwpdg_56f25fad-089d-4df6-abb1-10d4c76750f1/kube-apiserver-operator/1.log" Dec 04 23:10:51.432610 master-0 kubenswrapper[33572]: I1204 23:10:51.432564 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-vwpdg_56f25fad-089d-4df6-abb1-10d4c76750f1/kube-apiserver-operator/0.log" Dec 04 23:10:52.235432 master-0 kubenswrapper[33572]: I1204 23:10:52.235339 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4b9fbd90-66d5-4637-9821-22242aa6f6d7/installer/0.log" Dec 04 23:10:52.260457 master-0 kubenswrapper[33572]: I1204 23:10:52.260410 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_dbe54b09-0399-4fbe-9f84-dd9dede0ab96/installer/0.log" Dec 04 23:10:52.303244 master-0 kubenswrapper[33572]: I1204 23:10:52.303146 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_d8307846-d1cf-4357-bcc0-b3531d34dc8b/installer/0.log" Dec 04 23:10:52.795963 master-0 kubenswrapper[33572]: I1204 23:10:52.795921 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_33faf0280764958ffeecc2dce44a9bfc/kube-apiserver/0.log" Dec 04 
23:10:52.808090 master-0 kubenswrapper[33572]: I1204 23:10:52.808030 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_33faf0280764958ffeecc2dce44a9bfc/kube-apiserver-cert-syncer/0.log" Dec 04 23:10:52.820710 master-0 kubenswrapper[33572]: I1204 23:10:52.820664 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_33faf0280764958ffeecc2dce44a9bfc/kube-apiserver-cert-regeneration-controller/0.log" Dec 04 23:10:52.828319 master-0 kubenswrapper[33572]: I1204 23:10:52.828285 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_33faf0280764958ffeecc2dce44a9bfc/kube-apiserver-insecure-readyz/0.log" Dec 04 23:10:52.842255 master-0 kubenswrapper[33572]: I1204 23:10:52.842201 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_33faf0280764958ffeecc2dce44a9bfc/kube-apiserver-check-endpoints/0.log" Dec 04 23:10:52.851376 master-0 kubenswrapper[33572]: I1204 23:10:52.851337 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_33faf0280764958ffeecc2dce44a9bfc/setup/0.log" Dec 04 23:10:53.672782 master-0 kubenswrapper[33572]: I1204 23:10:53.672721 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/kube-rbac-proxy/0.log" Dec 04 23:10:53.690578 master-0 kubenswrapper[33572]: I1204 23:10:53.689208 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/1.log" Dec 04 23:10:53.705098 master-0 kubenswrapper[33572]: I1204 23:10:53.705049 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-v7zfw_fb0274dc-fac1-41f9-b3e5-77253d851fdf/manager/2.log" Dec 04 23:10:54.412850 master-0 kubenswrapper[33572]: I1204 23:10:54.412798 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-gh5j2_dc5cee88-523c-4fae-af63-d18313c388cd/cert-manager-controller/0.log" Dec 04 23:10:54.437696 master-0 kubenswrapper[33572]: I1204 23:10:54.437243 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-vx58f_fa4c1453-1a2a-4f5c-a4f9-71189203fe94/cert-manager-cainjector/0.log" Dec 04 23:10:54.455688 master-0 kubenswrapper[33572]: I1204 23:10:54.455637 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-tgx98_780dc1ba-097b-4417-a960-9c6e5e7a3d40/cert-manager-webhook/0.log" Dec 04 23:10:55.231620 master-0 kubenswrapper[33572]: I1204 23:10:55.231524 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25_9157b62a-a6a9-4203-b23b-1e4657e06d49/extract/0.log" Dec 04 23:10:55.243489 master-0 kubenswrapper[33572]: I1204 23:10:55.243432 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25_9157b62a-a6a9-4203-b23b-1e4657e06d49/util/0.log" Dec 04 23:10:55.258436 master-0 kubenswrapper[33572]: I1204 23:10:55.258356 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafhvn25_9157b62a-a6a9-4203-b23b-1e4657e06d49/pull/0.log" Dec 04 23:10:55.275131 master-0 kubenswrapper[33572]: I1204 23:10:55.275048 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cd89994b5-74h4k_c9cefa2f-e4d2-4c8a-b27c-c71af710a6df/manager/0.log" Dec 04 23:10:55.289713 master-0 kubenswrapper[33572]: I1204 23:10:55.289646 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cd89994b5-74h4k_c9cefa2f-e4d2-4c8a-b27c-c71af710a6df/kube-rbac-proxy/0.log" Dec 04 23:10:55.362950 master-0 kubenswrapper[33572]: I1204 23:10:55.362884 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-f8856dd79-ds48v_68c529a2-072e-4777-9fb9-ec34aa5396ae/manager/0.log" Dec 04 23:10:55.373921 master-0 kubenswrapper[33572]: I1204 23:10:55.373860 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-f8856dd79-ds48v_68c529a2-072e-4777-9fb9-ec34aa5396ae/kube-rbac-proxy/0.log" Dec 04 23:10:55.393967 master-0 kubenswrapper[33572]: I1204 23:10:55.393913 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84bc9f68f5-7rc6r_e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca/manager/0.log" Dec 04 23:10:55.410170 master-0 kubenswrapper[33572]: I1204 23:10:55.410110 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84bc9f68f5-7rc6r_e045cf7d-f5d2-4a2c-bd6f-13f68dc903ca/kube-rbac-proxy/0.log" Dec 04 23:10:55.513203 master-0 kubenswrapper[33572]: I1204 23:10:55.513144 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78cd4f7769-wcm5p_f9ec5f3f-d171-4fc3-abb1-489d49fe30d9/manager/0.log" Dec 04 23:10:55.525163 master-0 kubenswrapper[33572]: I1204 23:10:55.525120 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78cd4f7769-wcm5p_f9ec5f3f-d171-4fc3-abb1-489d49fe30d9/kube-rbac-proxy/0.log" Dec 04 23:10:55.543391 master-0 kubenswrapper[33572]: I1204 23:10:55.543336 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7fd96594c7-5sgkl_1d25b1d0-f7a6-451b-897f-474f32b23ef1/manager/0.log" Dec 04 23:10:55.559704 master-0 kubenswrapper[33572]: I1204 23:10:55.559645 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-7fd96594c7-5sgkl_1d25b1d0-f7a6-451b-897f-474f32b23ef1/kube-rbac-proxy/0.log" Dec 04 23:10:55.576026 master-0 kubenswrapper[33572]: I1204 23:10:55.575972 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-f6cc97788-khfnz_075b6463-8b4e-47e9-9662-c6fa561e9079/manager/0.log" Dec 04 23:10:55.591822 master-0 kubenswrapper[33572]: I1204 23:10:55.591764 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-f6cc97788-khfnz_075b6463-8b4e-47e9-9662-c6fa561e9079/kube-rbac-proxy/0.log" Dec 04 23:10:55.746376 master-0 kubenswrapper[33572]: I1204 23:10:55.746268 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c9d7fd8-qr956_f9e3f583-1af1-44f2-a6e3-271336e2cf1e/manager/0.log" Dec 04 23:10:55.757880 master-0 kubenswrapper[33572]: I1204 23:10:55.757828 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d9c9d7fd8-qr956_f9e3f583-1af1-44f2-a6e3-271336e2cf1e/kube-rbac-proxy/0.log" Dec 04 23:10:55.861049 master-0 kubenswrapper[33572]: I1204 23:10:55.860948 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7c9bfd6967-5pn2v_5c9990c8-62a6-42fd-9965-50331f773940/manager/0.log" Dec 04 23:10:55.872749 master-0 kubenswrapper[33572]: I1204 23:10:55.872710 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7c9bfd6967-5pn2v_5c9990c8-62a6-42fd-9965-50331f773940/kube-rbac-proxy/0.log" Dec 04 23:10:55.963415 master-0 kubenswrapper[33572]: I1204 23:10:55.963367 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-58b8dcc5fb-pnhmq_23f59072-43f4-4dc3-a484-48ae31ed7eee/manager/0.log" Dec 04 23:10:55.975372 master-0 kubenswrapper[33572]: I1204 23:10:55.975310 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-58b8dcc5fb-pnhmq_23f59072-43f4-4dc3-a484-48ae31ed7eee/kube-rbac-proxy/0.log" Dec 04 23:10:55.996558 master-0 kubenswrapper[33572]: I1204 23:10:55.996434 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56f9fbf74b-xsxzr_c8e6de34-2747-42ce-b9c1-dfdcd71d3707/manager/0.log" Dec 04 23:10:56.007478 master-0 kubenswrapper[33572]: I1204 23:10:56.007432 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-56f9fbf74b-xsxzr_c8e6de34-2747-42ce-b9c1-dfdcd71d3707/kube-rbac-proxy/0.log" Dec 04 23:10:56.047026 master-0 kubenswrapper[33572]: I1204 23:10:56.046971 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-647d75769b-v8srz_fb197cd4-44a7-414d-aef1-a6faf66f22d6/manager/0.log" Dec 04 23:10:56.058956 master-0 kubenswrapper[33572]: I1204 23:10:56.058897 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-647d75769b-v8srz_fb197cd4-44a7-414d-aef1-a6faf66f22d6/kube-rbac-proxy/0.log" Dec 04 23:10:56.124362 master-0 kubenswrapper[33572]: I1204 23:10:56.124305 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cdd6b54fb-jjxh8_fd79fe85-44e9-47e6-857a-672f42581cb8/manager/0.log" Dec 04 23:10:56.137161 master-0 kubenswrapper[33572]: I1204 23:10:56.137094 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cdd6b54fb-jjxh8_fd79fe85-44e9-47e6-857a-672f42581cb8/kube-rbac-proxy/0.log" Dec 04 23:10:56.262596 master-0 kubenswrapper[33572]: I1204 23:10:56.262448 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-865fc86d5b-pzbmd_2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3/manager/0.log" Dec 04 23:10:56.604884 master-0 kubenswrapper[33572]: I1204 23:10:56.604806 33572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-865fc86d5b-pzbmd_2c70bb93-775c-4c39-a3f5-6bc1ad2e0dd3/kube-rbac-proxy/0.log" Dec 04 23:10:56.636961 master-0 kubenswrapper[33572]: I1204 23:10:56.636905 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-845b79dc4f-7v5g8_8d0cfe95-c307-45c5-aba1-45c7d5217b2b/manager/0.log" Dec 04 23:10:56.657878 master-0 kubenswrapper[33572]: I1204 23:10:56.657835 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-845b79dc4f-7v5g8_8d0cfe95-c307-45c5-aba1-45c7d5217b2b/kube-rbac-proxy/0.log" Dec 04 23:10:56.693294 master-0 kubenswrapper[33572]: I1204 23:10:56.693168 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f998f5746vn4vf_590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf/manager/0.log" Dec 04 23:10:56.706414 master-0 kubenswrapper[33572]: I1204 23:10:56.705903 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f998f5746vn4vf_590f3f7c-1d4d-45d7-a42b-2a78fe5d3dbf/kube-rbac-proxy/0.log" Dec 04 23:10:57.373278 master-0 kubenswrapper[33572]: I1204 23:10:57.373024 33572 generic.go:334] "Generic (PLEG): container finished" podID="a6c6e72a-2ff1-4c80-8670-c677aa07aa13" containerID="f653f374b7d8de99ef36134641588c8330c435b6f7471d8be37434f3d145e6b2" exitCode=0 Dec 04 23:10:57.373278 master-0 kubenswrapper[33572]: I1204 23:10:57.373075 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q" event={"ID":"a6c6e72a-2ff1-4c80-8670-c677aa07aa13","Type":"ContainerDied","Data":"f653f374b7d8de99ef36134641588c8330c435b6f7471d8be37434f3d145e6b2"} Dec 04 23:10:57.544674 master-0 kubenswrapper[33572]: I1204 23:10:57.544615 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-599cfccd85-gvd74_e76f37b2-5ae2-4931-ac95-8d6161415d17/manager/0.log" Dec 04 23:10:57.716044 master-0 kubenswrapper[33572]: I1204 23:10:57.715898 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-589d7b4556-6vpst_339808d7-8bed-4362-a4d0-c367d186332d/operator/0.log" Dec 04 23:10:57.766312 master-0 kubenswrapper[33572]: I1204 23:10:57.766257 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zbrtw_a08580e2-9965-4142-91cf-0c09d82a50b7/registry-server/0.log" Dec 04 23:10:57.973783 master-0 kubenswrapper[33572]: I1204 23:10:57.973585 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-647f96877-748fk_a08eac0d-e5f3-41bd-b255-bb4e6546b7f9/manager/0.log" Dec 04 23:10:57.990220 master-0 kubenswrapper[33572]: I1204 23:10:57.990146 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-647f96877-748fk_a08eac0d-e5f3-41bd-b255-bb4e6546b7f9/kube-rbac-proxy/0.log" Dec 04 23:10:58.037921 master-0 kubenswrapper[33572]: I1204 23:10:58.037850 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-6b64f6f645-llths_61f59de0-4c3d-40f3-88bc-05e8174f41de/manager/0.log" Dec 04 23:10:58.054144 master-0 kubenswrapper[33572]: I1204 23:10:58.053668 33572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-6b64f6f645-llths_61f59de0-4c3d-40f3-88bc-05e8174f41de/kube-rbac-proxy/0.log" Dec 04 23:10:58.079181 master-0 kubenswrapper[33572]: I1204 23:10:58.079118 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-78955d896f-qffjg_7eebd686-6e62-4fe5-9d52-f09c57e50b2e/operator/0.log" Dec 04 23:10:58.138018 master-0 kubenswrapper[33572]: I1204 23:10:58.137971 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-696b999796-jbqjt_8ea53b4d-8452-4f0d-9508-28101ae503a5/manager/0.log" Dec 04 23:10:58.149348 master-0 kubenswrapper[33572]: I1204 23:10:58.149310 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-696b999796-jbqjt_8ea53b4d-8452-4f0d-9508-28101ae503a5/kube-rbac-proxy/0.log" Dec 04 23:10:58.183604 master-0 kubenswrapper[33572]: I1204 23:10:58.181891 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7b5867bfc7-4nnvm_b64f7476-1cd1-4d66-ad63-0a78ff022873/manager/0.log" Dec 04 23:10:58.206429 master-0 kubenswrapper[33572]: I1204 23:10:58.206201 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7b5867bfc7-4nnvm_b64f7476-1cd1-4d66-ad63-0a78ff022873/kube-rbac-proxy/0.log" Dec 04 23:10:58.230989 master-0 kubenswrapper[33572]: I1204 23:10:58.230840 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-57dfcdd5b8-qqh65_502a1fc5-ff64-42b5-be7c-9cba5c1044c5/manager/0.log" Dec 04 23:10:58.253761 master-0 kubenswrapper[33572]: I1204 23:10:58.253719 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-57dfcdd5b8-qqh65_502a1fc5-ff64-42b5-be7c-9cba5c1044c5/kube-rbac-proxy/0.log" Dec 04 23:10:58.270018 master-0 kubenswrapper[33572]: I1204 23:10:58.269958 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9b669fdb-r87g9_de6b792f-5002-4bc4-8586-b8f76e57bdf1/manager/0.log" Dec 04 23:10:58.281724 master-0 kubenswrapper[33572]: I1204 23:10:58.281686 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9b669fdb-r87g9_de6b792f-5002-4bc4-8586-b8f76e57bdf1/kube-rbac-proxy/0.log" Dec 04 23:10:58.510553 master-0 kubenswrapper[33572]: I1204 23:10:58.510436 33572 util.go:48] "No ready sandbox for pod can be found. 
Dec 04 23:10:58.563748 master-0 kubenswrapper[33572]: I1204 23:10:58.563680 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pwzxr/master-0-debug-rwf5q"]
Dec 04 23:10:58.573897 master-0 kubenswrapper[33572]: I1204 23:10:58.573819 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pwzxr/master-0-debug-rwf5q"]
Dec 04 23:10:58.627160 master-0 kubenswrapper[33572]: I1204 23:10:58.627093 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cpbz\" (UniqueName: \"kubernetes.io/projected/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-kube-api-access-5cpbz\") pod \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\" (UID: \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\") "
Dec 04 23:10:58.627373 master-0 kubenswrapper[33572]: I1204 23:10:58.627253 33572 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-host\") pod \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\" (UID: \"a6c6e72a-2ff1-4c80-8670-c677aa07aa13\") "
Dec 04 23:10:58.627486 master-0 kubenswrapper[33572]: I1204 23:10:58.627423 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-host" (OuterVolumeSpecName: "host") pod "a6c6e72a-2ff1-4c80-8670-c677aa07aa13" (UID: "a6c6e72a-2ff1-4c80-8670-c677aa07aa13"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 04 23:10:58.629008 master-0 kubenswrapper[33572]: I1204 23:10:58.628971 33572 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-host\") on node \"master-0\" DevicePath \"\""
Dec 04 23:10:58.638524 master-0 kubenswrapper[33572]: I1204 23:10:58.629811 33572 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-kube-api-access-5cpbz" (OuterVolumeSpecName: "kube-api-access-5cpbz") pod "a6c6e72a-2ff1-4c80-8670-c677aa07aa13" (UID: "a6c6e72a-2ff1-4c80-8670-c677aa07aa13"). InnerVolumeSpecName "kube-api-access-5cpbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 23:10:58.731206 master-0 kubenswrapper[33572]: I1204 23:10:58.731136 33572 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cpbz\" (UniqueName: \"kubernetes.io/projected/a6c6e72a-2ff1-4c80-8670-c677aa07aa13-kube-api-access-5cpbz\") on node \"master-0\" DevicePath \"\""
Dec 04 23:10:59.400241 master-0 kubenswrapper[33572]: I1204 23:10:59.400160 33572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9f7b0a6a31c622a656cc9f40382b72d52ad648ab1f62e910855083f56571b96"
Dec 04 23:10:59.400544 master-0 kubenswrapper[33572]: I1204 23:10:59.400262 33572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwzxr/master-0-debug-rwf5q"
Dec 04 23:10:59.771554 master-0 kubenswrapper[33572]: I1204 23:10:59.771473 33572 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pwzxr/master-0-debug-nndf9"]
Dec 04 23:10:59.772162 master-0 kubenswrapper[33572]: E1204 23:10:59.772107 33572 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c6e72a-2ff1-4c80-8670-c677aa07aa13" containerName="container-00"
Dec 04 23:10:59.772162 master-0 kubenswrapper[33572]: I1204 23:10:59.772130 33572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c6e72a-2ff1-4c80-8670-c677aa07aa13" containerName="container-00"
Dec 04 23:10:59.772616 master-0 kubenswrapper[33572]: I1204 23:10:59.772516 33572 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c6e72a-2ff1-4c80-8670-c677aa07aa13" containerName="container-00"
Dec 04 23:10:59.773836 master-0 kubenswrapper[33572]: I1204 23:10:59.773809 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwzxr/master-0-debug-nndf9"
Dec 04 23:10:59.851256 master-0 kubenswrapper[33572]: I1204 23:10:59.851194 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-twslb_22dd80ed-ccaf-40b1-8837-a8aeb3c28140/nmstate-console-plugin/0.log"
Dec 04 23:10:59.860308 master-0 kubenswrapper[33572]: I1204 23:10:59.860251 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmnck\" (UniqueName: \"kubernetes.io/projected/34a3eb72-829a-4267-a4e6-ce3ba9c693c5-kube-api-access-tmnck\") pod \"master-0-debug-nndf9\" (UID: \"34a3eb72-829a-4267-a4e6-ce3ba9c693c5\") " pod="openshift-must-gather-pwzxr/master-0-debug-nndf9"
Dec 04 23:10:59.860511 master-0 kubenswrapper[33572]: I1204 23:10:59.860354 33572 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34a3eb72-829a-4267-a4e6-ce3ba9c693c5-host\") pod \"master-0-debug-nndf9\" (UID: \"34a3eb72-829a-4267-a4e6-ce3ba9c693c5\") " pod="openshift-must-gather-pwzxr/master-0-debug-nndf9"
Dec 04 23:10:59.870980 master-0 kubenswrapper[33572]: I1204 23:10:59.870921 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mcmbn_0a44f088-be39-47f3-8e34-9cade0076325/nmstate-handler/0.log"
Dec 04 23:10:59.884068 master-0 kubenswrapper[33572]: I1204 23:10:59.884024 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8rwmp_ed12588b-8304-4d07-9055-6944c540d15f/nmstate-metrics/0.log"
Dec 04 23:10:59.907439 master-0 kubenswrapper[33572]: I1204 23:10:59.907384 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8rwmp_ed12588b-8304-4d07-9055-6944c540d15f/kube-rbac-proxy/0.log"
Dec 04 23:10:59.927807 master-0 kubenswrapper[33572]: I1204 23:10:59.926480 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-n77lr_103e82e6-9c8a-497b-9c46-bec48796d1a0/nmstate-operator/0.log"
Dec 04 23:10:59.946521 master-0 kubenswrapper[33572]: I1204 23:10:59.941648 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-265zs_4e7a211d-0841-473d-87c2-953375146be8/nmstate-webhook/0.log"
Dec 04 23:10:59.964774 master-0 kubenswrapper[33572]: I1204 23:10:59.964040 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmnck\" (UniqueName: \"kubernetes.io/projected/34a3eb72-829a-4267-a4e6-ce3ba9c693c5-kube-api-access-tmnck\") pod \"master-0-debug-nndf9\" (UID: \"34a3eb72-829a-4267-a4e6-ce3ba9c693c5\") " pod="openshift-must-gather-pwzxr/master-0-debug-nndf9"
Dec 04 23:10:59.964774 master-0 kubenswrapper[33572]: I1204 23:10:59.964141 33572 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34a3eb72-829a-4267-a4e6-ce3ba9c693c5-host\") pod \"master-0-debug-nndf9\" (UID: \"34a3eb72-829a-4267-a4e6-ce3ba9c693c5\") " pod="openshift-must-gather-pwzxr/master-0-debug-nndf9"
Dec 04 23:10:59.964774 master-0 kubenswrapper[33572]: I1204 23:10:59.964368 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34a3eb72-829a-4267-a4e6-ce3ba9c693c5-host\") pod \"master-0-debug-nndf9\" (UID: \"34a3eb72-829a-4267-a4e6-ce3ba9c693c5\") " pod="openshift-must-gather-pwzxr/master-0-debug-nndf9"
Dec 04 23:10:59.985541 master-0 kubenswrapper[33572]: I1204 23:10:59.979436 33572 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmnck\" (UniqueName: \"kubernetes.io/projected/34a3eb72-829a-4267-a4e6-ce3ba9c693c5-kube-api-access-tmnck\") pod \"master-0-debug-nndf9\" (UID: \"34a3eb72-829a-4267-a4e6-ce3ba9c693c5\") " pod="openshift-must-gather-pwzxr/master-0-debug-nndf9"
Dec 04 23:11:00.103586 master-0 kubenswrapper[33572]: I1204 23:11:00.102913 33572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pwzxr/master-0-debug-nndf9"
Dec 04 23:11:00.415867 master-0 kubenswrapper[33572]: I1204 23:11:00.415495 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/master-0-debug-nndf9" event={"ID":"34a3eb72-829a-4267-a4e6-ce3ba9c693c5","Type":"ContainerStarted","Data":"ebfbb9f33b819f4010e7ac64c76d082bbec6a82fba1c97d37bea510e4ffb71b0"}
Dec 04 23:11:00.415867 master-0 kubenswrapper[33572]: I1204 23:11:00.415561 33572 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pwzxr/master-0-debug-nndf9" event={"ID":"34a3eb72-829a-4267-a4e6-ce3ba9c693c5","Type":"ContainerStarted","Data":"044167181ab44c66bf5183b1ff79472847f9d0a68ea9f0570152a09f4e0d3de4"}
Dec 04 23:11:00.469952 master-0 kubenswrapper[33572]: I1204 23:11:00.467879 33572 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pwzxr/master-0-debug-nndf9"]
Dec 04 23:11:00.487081 master-0 kubenswrapper[33572]: I1204 23:11:00.486578 33572 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pwzxr/master-0-debug-nndf9"]
Dec 04 23:11:00.537454 master-0 kubenswrapper[33572]: I1204 23:11:00.537396 33572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c6e72a-2ff1-4c80-8670-c677aa07aa13" path="/var/lib/kubelet/pods/a6c6e72a-2ff1-4c80-8670-c677aa07aa13/volumes"
Dec 04 23:11:00.774011 master-0 kubenswrapper[33572]: I1204 23:11:00.773969 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-vm5f5_d46a38f0-9645-40af-a5e6-c5104d5189ba/prometheus-operator/0.log"
Dec 04 23:11:00.788518 master-0 kubenswrapper[33572]: I1204 23:11:00.786878 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b974c8fd6-mldr5_5eb5e75b-473b-4288-9ca8-4683ae8063ec/prometheus-operator-admission-webhook/0.log"
Dec 04 23:11:00.800837 master-0 kubenswrapper[33572]: I1204 23:11:00.800790 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b974c8fd6-wdfr2_c17c3a99-1b56-4b32-8e51-cd160fb4062d/prometheus-operator-admission-webhook/0.log"
Dec 04 23:11:00.838943 master-0 kubenswrapper[33572]: I1204 23:11:00.838691 33572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-qsbhs_5b2e5d49-a0dc-454c-876f-2d46c7f68061/operator/0.log"